Oct 07 08:16:32 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 07 08:16:32 crc restorecon[4722]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 07 08:16:32 crc restorecon[4722]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc 
restorecon[4722]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 08:16:32 crc 
restorecon[4722]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 
08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 08:16:32 crc 
restorecon[4722]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 08:16:32 crc 
restorecon[4722]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32
crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 
08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 08:16:32 crc 
restorecon[4722]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc 
restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc 
restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 08:16:32 crc restorecon[4722]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 
07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 
crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc 
restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: 
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc 
restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc 
restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:32 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:33 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:33 crc restorecon[4722]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 08:16:33 crc restorecon[4722]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 07 08:16:33 crc restorecon[4722]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 07 08:16:33 crc restorecon[4722]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 07 08:16:33 crc restorecon[4722]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 07 08:16:33 crc restorecon[4722]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 07 08:16:33 crc restorecon[4722]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 07 08:16:33 crc restorecon[4722]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 07 08:16:33 crc restorecon[4722]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 07 08:16:33 crc restorecon[4722]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 07 08:16:33 crc restorecon[4722]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Oct 07 08:16:33 crc restorecon[4722]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 07 08:16:33 crc restorecon[4722]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 07 08:16:33 crc restorecon[4722]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 07 08:16:33 crc restorecon[4722]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 08:16:33 crc restorecon[4722]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 08:16:33 crc restorecon[4722]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 08:16:33 crc restorecon[4722]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 08:16:33 crc restorecon[4722]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 08:16:33 crc restorecon[4722]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 08:16:33 crc restorecon[4722]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 08:16:33 crc restorecon[4722]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 08:16:33 crc restorecon[4722]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 08:16:33 crc restorecon[4722]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 08:16:33 crc restorecon[4722]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 08:16:33 crc restorecon[4722]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 08:16:33 crc 
restorecon[4722]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 08:16:33 crc restorecon[4722]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 08:16:33 crc restorecon[4722]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 08:16:33 crc restorecon[4722]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 08:16:33 crc restorecon[4722]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 08:16:33 crc restorecon[4722]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 08:16:33 crc restorecon[4722]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 08:16:33 crc restorecon[4722]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 08:16:33 crc restorecon[4722]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 08:16:33 crc restorecon[4722]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 07 08:16:33 crc restorecon[4722]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 08:16:33 crc restorecon[4722]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 08:16:33 crc restorecon[4722]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 08:16:33 crc restorecon[4722]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 08:16:33 crc restorecon[4722]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 08:16:33 crc restorecon[4722]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 08:16:33 crc restorecon[4722]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 08:16:33 crc restorecon[4722]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Oct 07 08:16:33 crc restorecon[4722]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 08:16:33 crc restorecon[4722]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 08:16:33 crc restorecon[4722]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 08:16:33 crc restorecon[4722]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 07 08:16:33 crc kubenswrapper[5025]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 07 08:16:33 crc kubenswrapper[5025]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 07 08:16:33 crc kubenswrapper[5025]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 07 08:16:33 crc kubenswrapper[5025]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Oct 07 08:16:33 crc kubenswrapper[5025]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 07 08:16:33 crc kubenswrapper[5025]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.641199 5025 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.652994 5025 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653050 5025 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653067 5025 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653082 5025 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653099 5025 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653113 5025 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653127 5025 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653139 5025 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653150 5025 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 
08:16:33.653161 5025 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653171 5025 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653180 5025 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653190 5025 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653199 5025 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653208 5025 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653217 5025 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653226 5025 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653235 5025 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653244 5025 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653253 5025 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653263 5025 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653282 5025 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653293 5025 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653302 5025 feature_gate.go:330] unrecognized feature gate: 
IngressControllerLBSubnetsAWS Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653310 5025 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653320 5025 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653331 5025 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653341 5025 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653351 5025 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653361 5025 feature_gate.go:330] unrecognized feature gate: Example Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653371 5025 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653383 5025 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653395 5025 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653405 5025 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653414 5025 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653424 5025 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653434 5025 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653443 5025 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653452 5025 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653463 5025 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653472 5025 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653484 5025 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653497 5025 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653508 5025 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653520 5025 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653529 5025 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653545 5025 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653583 5025 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653593 5025 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653604 5025 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653613 5025 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653622 5025 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653631 5025 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653640 5025 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653650 5025 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653659 5025 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653668 5025 
feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653678 5025 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653690 5025 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653701 5025 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653711 5025 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653721 5025 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653733 5025 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653742 5025 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653751 5025 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653760 5025 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653769 5025 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653778 5025 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653787 5025 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653796 5025 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.653804 5025 
feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.654883 5025 flags.go:64] FLAG: --address="0.0.0.0" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.654924 5025 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.654946 5025 flags.go:64] FLAG: --anonymous-auth="true" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.654963 5025 flags.go:64] FLAG: --application-metrics-count-limit="100" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.654979 5025 flags.go:64] FLAG: --authentication-token-webhook="false" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.654994 5025 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655014 5025 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655047 5025 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655061 5025 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655074 5025 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655088 5025 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655106 5025 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655120 5025 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655132 5025 flags.go:64] FLAG: --cgroup-root="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655145 5025 flags.go:64] FLAG: --cgroups-per-qos="true" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655158 5025 
flags.go:64] FLAG: --client-ca-file="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655170 5025 flags.go:64] FLAG: --cloud-config="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655183 5025 flags.go:64] FLAG: --cloud-provider="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655195 5025 flags.go:64] FLAG: --cluster-dns="[]" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655211 5025 flags.go:64] FLAG: --cluster-domain="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655223 5025 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655237 5025 flags.go:64] FLAG: --config-dir="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655250 5025 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655268 5025 flags.go:64] FLAG: --container-log-max-files="5" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655286 5025 flags.go:64] FLAG: --container-log-max-size="10Mi" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655298 5025 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655311 5025 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655324 5025 flags.go:64] FLAG: --containerd-namespace="k8s.io" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655338 5025 flags.go:64] FLAG: --contention-profiling="false" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655355 5025 flags.go:64] FLAG: --cpu-cfs-quota="true" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655368 5025 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655381 5025 flags.go:64] FLAG: --cpu-manager-policy="none" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655394 5025 
flags.go:64] FLAG: --cpu-manager-policy-options="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655410 5025 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655423 5025 flags.go:64] FLAG: --enable-controller-attach-detach="true" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655436 5025 flags.go:64] FLAG: --enable-debugging-handlers="true" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655448 5025 flags.go:64] FLAG: --enable-load-reader="false" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655462 5025 flags.go:64] FLAG: --enable-server="true" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655474 5025 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655491 5025 flags.go:64] FLAG: --event-burst="100" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655505 5025 flags.go:64] FLAG: --event-qps="50" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655519 5025 flags.go:64] FLAG: --event-storage-age-limit="default=0" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655532 5025 flags.go:64] FLAG: --event-storage-event-limit="default=0" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655591 5025 flags.go:64] FLAG: --eviction-hard="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655608 5025 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655621 5025 flags.go:64] FLAG: --eviction-minimum-reclaim="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655634 5025 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655651 5025 flags.go:64] FLAG: --eviction-soft="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655664 5025 flags.go:64] FLAG: --eviction-soft-grace-period="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 
08:16:33.655677 5025 flags.go:64] FLAG: --exit-on-lock-contention="false" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655690 5025 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655703 5025 flags.go:64] FLAG: --experimental-mounter-path="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655715 5025 flags.go:64] FLAG: --fail-cgroupv1="false" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655728 5025 flags.go:64] FLAG: --fail-swap-on="true" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655741 5025 flags.go:64] FLAG: --feature-gates="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655756 5025 flags.go:64] FLAG: --file-check-frequency="20s" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655769 5025 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655782 5025 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655797 5025 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655810 5025 flags.go:64] FLAG: --healthz-port="10248" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655824 5025 flags.go:64] FLAG: --help="false" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655836 5025 flags.go:64] FLAG: --hostname-override="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655848 5025 flags.go:64] FLAG: --housekeeping-interval="10s" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655862 5025 flags.go:64] FLAG: --http-check-frequency="20s" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655875 5025 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655888 5025 flags.go:64] FLAG: --image-credential-provider-config="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 
08:16:33.655901 5025 flags.go:64] FLAG: --image-gc-high-threshold="85" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655915 5025 flags.go:64] FLAG: --image-gc-low-threshold="80" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655927 5025 flags.go:64] FLAG: --image-service-endpoint="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655939 5025 flags.go:64] FLAG: --kernel-memcg-notification="false" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655952 5025 flags.go:64] FLAG: --kube-api-burst="100" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655965 5025 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655979 5025 flags.go:64] FLAG: --kube-api-qps="50" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.655992 5025 flags.go:64] FLAG: --kube-reserved="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656008 5025 flags.go:64] FLAG: --kube-reserved-cgroup="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656024 5025 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656039 5025 flags.go:64] FLAG: --kubelet-cgroups="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656052 5025 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656066 5025 flags.go:64] FLAG: --lock-file="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656079 5025 flags.go:64] FLAG: --log-cadvisor-usage="false" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656092 5025 flags.go:64] FLAG: --log-flush-frequency="5s" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656105 5025 flags.go:64] FLAG: --log-json-info-buffer-size="0" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656125 5025 flags.go:64] FLAG: --log-json-split-stream="false" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 
08:16:33.656141 5025 flags.go:64] FLAG: --log-text-info-buffer-size="0" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656154 5025 flags.go:64] FLAG: --log-text-split-stream="false" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656168 5025 flags.go:64] FLAG: --logging-format="text" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656180 5025 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656194 5025 flags.go:64] FLAG: --make-iptables-util-chains="true" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656206 5025 flags.go:64] FLAG: --manifest-url="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656220 5025 flags.go:64] FLAG: --manifest-url-header="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656237 5025 flags.go:64] FLAG: --max-housekeeping-interval="15s" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656251 5025 flags.go:64] FLAG: --max-open-files="1000000" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656267 5025 flags.go:64] FLAG: --max-pods="110" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656279 5025 flags.go:64] FLAG: --maximum-dead-containers="-1" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656292 5025 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656305 5025 flags.go:64] FLAG: --memory-manager-policy="None" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656318 5025 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656332 5025 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656345 5025 flags.go:64] FLAG: --node-ip="192.168.126.11" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656358 5025 flags.go:64] FLAG: 
--node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656390 5025 flags.go:64] FLAG: --node-status-max-images="50" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656403 5025 flags.go:64] FLAG: --node-status-update-frequency="10s" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656416 5025 flags.go:64] FLAG: --oom-score-adj="-999" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656430 5025 flags.go:64] FLAG: --pod-cidr="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656444 5025 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656465 5025 flags.go:64] FLAG: --pod-manifest-path="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656480 5025 flags.go:64] FLAG: --pod-max-pids="-1" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656495 5025 flags.go:64] FLAG: --pods-per-core="0" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656509 5025 flags.go:64] FLAG: --port="10250" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656523 5025 flags.go:64] FLAG: --protect-kernel-defaults="false" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656536 5025 flags.go:64] FLAG: --provider-id="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656595 5025 flags.go:64] FLAG: --qos-reserved="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656609 5025 flags.go:64] FLAG: --read-only-port="10255" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656622 5025 flags.go:64] FLAG: --register-node="true" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656635 5025 flags.go:64] FLAG: --register-schedulable="true" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656648 5025 flags.go:64] FLAG: 
--register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656671 5025 flags.go:64] FLAG: --registry-burst="10" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656684 5025 flags.go:64] FLAG: --registry-qps="5" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656697 5025 flags.go:64] FLAG: --reserved-cpus="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656713 5025 flags.go:64] FLAG: --reserved-memory="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656732 5025 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656745 5025 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656758 5025 flags.go:64] FLAG: --rotate-certificates="false" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656770 5025 flags.go:64] FLAG: --rotate-server-certificates="false" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656780 5025 flags.go:64] FLAG: --runonce="false" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656791 5025 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656801 5025 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656812 5025 flags.go:64] FLAG: --seccomp-default="false" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656822 5025 flags.go:64] FLAG: --serialize-image-pulls="true" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656832 5025 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656843 5025 flags.go:64] FLAG: --storage-driver-db="cadvisor" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656853 5025 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656864 
5025 flags.go:64] FLAG: --storage-driver-password="root" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656877 5025 flags.go:64] FLAG: --storage-driver-secure="false" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656889 5025 flags.go:64] FLAG: --storage-driver-table="stats" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656900 5025 flags.go:64] FLAG: --storage-driver-user="root" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656911 5025 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656923 5025 flags.go:64] FLAG: --sync-frequency="1m0s" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656934 5025 flags.go:64] FLAG: --system-cgroups="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656944 5025 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656960 5025 flags.go:64] FLAG: --system-reserved-cgroup="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656971 5025 flags.go:64] FLAG: --tls-cert-file="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656980 5025 flags.go:64] FLAG: --tls-cipher-suites="[]" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.656994 5025 flags.go:64] FLAG: --tls-min-version="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.657004 5025 flags.go:64] FLAG: --tls-private-key-file="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.657014 5025 flags.go:64] FLAG: --topology-manager-policy="none" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.657024 5025 flags.go:64] FLAG: --topology-manager-policy-options="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.657034 5025 flags.go:64] FLAG: --topology-manager-scope="container" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.657044 5025 flags.go:64] FLAG: --v="2" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.657057 5025 
flags.go:64] FLAG: --version="false" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.657079 5025 flags.go:64] FLAG: --vmodule="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.657091 5025 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.657102 5025 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657330 5025 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657344 5025 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657357 5025 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657369 5025 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657382 5025 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657393 5025 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657402 5025 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657411 5025 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657423 5025 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657435 5025 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657446 5025 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657456 5025 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657466 5025 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657475 5025 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657486 5025 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657495 5025 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657504 5025 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657513 5025 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657522 5025 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657532 5025 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657573 5025 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657583 5025 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657592 5025 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 07 08:16:33 crc 
kubenswrapper[5025]: W1007 08:16:33.657600 5025 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657609 5025 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657617 5025 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657626 5025 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657636 5025 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657647 5025 feature_gate.go:330] unrecognized feature gate: Example Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657657 5025 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657668 5025 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657679 5025 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657690 5025 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657702 5025 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657713 5025 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657723 5025 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657732 5025 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657740 5025 feature_gate.go:330] unrecognized feature gate: 
NetworkLiveMigration Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657751 5025 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657761 5025 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657771 5025 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657779 5025 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657788 5025 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657797 5025 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657807 5025 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657816 5025 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657825 5025 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657834 5025 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657842 5025 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657852 5025 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657861 5025 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657870 5025 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 
08:16:33.657878 5025 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657887 5025 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657896 5025 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657909 5025 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657920 5025 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657930 5025 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657940 5025 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657949 5025 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657958 5025 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657967 5025 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657976 5025 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657985 5025 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.657994 5025 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.658003 5025 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.658012 5025 feature_gate.go:330] 
unrecognized feature gate: NetworkSegmentation Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.658020 5025 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.658029 5025 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.658037 5025 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.658046 5025 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.658983 5025 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.672595 5025 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.672648 5025 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.672777 5025 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.672792 5025 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.672803 5025 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.672812 5025 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 07 08:16:33 crc 
kubenswrapper[5025]: W1007 08:16:33.672823 5025 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.672833 5025 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.672841 5025 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.672850 5025 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.672859 5025 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.672868 5025 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.672876 5025 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.672888 5025 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.672901 5025 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.672913 5025 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.672924 5025 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.672936 5025 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.672948 5025 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.672960 5025 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.672970 5025 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.672979 5025 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.672988 5025 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.672997 5025 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673006 5025 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673015 5025 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673024 5025 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673032 5025 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673041 5025 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673050 5025 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673058 5025 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673066 5025 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673074 5025 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673083 5025 
feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673092 5025 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673101 5025 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673110 5025 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673119 5025 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673128 5025 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673140 5025 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673151 5025 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673160 5025 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673170 5025 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673179 5025 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673187 5025 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673196 5025 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673205 5025 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673213 5025 feature_gate.go:330] 
unrecognized feature gate: InsightsConfig Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673222 5025 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673230 5025 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673240 5025 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673249 5025 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673257 5025 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673265 5025 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673273 5025 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673282 5025 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673291 5025 feature_gate.go:330] unrecognized feature gate: Example Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673299 5025 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673308 5025 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673316 5025 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673324 5025 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673333 5025 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 07 08:16:33 
crc kubenswrapper[5025]: W1007 08:16:33.673342 5025 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673350 5025 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673358 5025 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673367 5025 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673376 5025 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673384 5025 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673396 5025 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673405 5025 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673414 5025 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673424 5025 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673434 5025 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.673448 5025 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false 
UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673710 5025 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673726 5025 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673738 5025 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673749 5025 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673758 5025 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673766 5025 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673775 5025 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673784 5025 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673793 5025 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673803 5025 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673811 5025 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673820 5025 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673831 5025 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. 
It will be removed in a future release. Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673842 5025 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673851 5025 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673860 5025 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673870 5025 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673879 5025 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673887 5025 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673896 5025 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673904 5025 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673913 5025 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673921 5025 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673932 5025 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673941 5025 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673949 5025 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673958 5025 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673966 
5025 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673975 5025 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673983 5025 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.673991 5025 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.674000 5025 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.674009 5025 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.674018 5025 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.674027 5025 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.674036 5025 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.674044 5025 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.674052 5025 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.674061 5025 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.674069 5025 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.674078 5025 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.674087 5025 feature_gate.go:330] unrecognized feature gate: 
ChunkSizeMiB Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.674095 5025 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.674104 5025 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.674113 5025 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.674121 5025 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.674132 5025 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.674141 5025 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.674152 5025 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.674164 5025 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.674174 5025 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.674184 5025 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.674193 5025 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.674201 5025 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.674210 5025 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.674219 5025 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.674227 5025 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.674236 5025 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.674244 5025 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.674253 5025 feature_gate.go:330] unrecognized feature gate: Example Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.674261 5025 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.674269 5025 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.674278 5025 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.674286 5025 
feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.674295 5025 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.674303 5025 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.674312 5025 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.674320 5025 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.674328 5025 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.674337 5025 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.674347 5025 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.674360 5025 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.674682 5025 server.go:940] "Client rotation is on, will bootstrap in background" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.681179 5025 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.681354 5025 
certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.683696 5025 server.go:997] "Starting client certificate rotation" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.683749 5025 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.683963 5025 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-04 14:17:03.256832703 +0000 UTC Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.684086 5025 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1398h0m29.572749186s for next certificate rotation Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.706437 5025 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.708398 5025 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.728751 5025 log.go:25] "Validated CRI v1 runtime API" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.758599 5025 log.go:25] "Validated CRI v1 image API" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.760723 5025 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.769726 5025 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-07-08-12-01-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.770052 5025 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm 
major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.806241 5025 manager.go:217] Machine: {Timestamp:2025-10-07 08:16:33.79999721 +0000 UTC m=+0.609311434 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799886 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:18315730-39a6-4b53-82b9-587e1e3a7adc BootID:15feb688-c1d9-4e7b-b633-9a128b7afc98 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:db:7f:35 Speed:0 Mtu:1500} {Name:br-int 
MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:db:7f:35 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:a8:e8:50 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:26:c4:02 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:47:7d:ec Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:16:2a:61 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:be:28:b8 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:b2:02:a7:2c:7a:16 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:3a:62:e5:d1:af:b3 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} 
{Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.807230 5025 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.807721 5025 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.809710 5025 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.810067 5025 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.810132 5025 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.810511 5025 topology_manager.go:138] "Creating topology manager with none policy"
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.810530 5025 container_manager_linux.go:303] "Creating device plugin manager"
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.811216 5025 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.811270 5025 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.812275 5025 state_mem.go:36] "Initialized new in-memory state store"
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.812438 5025 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.817393 5025 kubelet.go:418] "Attempting to sync node with API server"
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.817437 5025 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.817490 5025 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.817514 5025 kubelet.go:324] "Adding apiserver pod source"
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.817545 5025 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.826204 5025 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused
Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.826208 5025 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused
Oct 07 08:16:33 crc kubenswrapper[5025]: E1007 08:16:33.826458 5025 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.173:6443: connect: connection refused" logger="UnhandledError"
Oct 07 08:16:33 crc kubenswrapper[5025]: E1007 08:16:33.826499 5025 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.173:6443: connect: connection refused" logger="UnhandledError"
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.828777 5025 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.830493 5025 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.832371 5025 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.834726 5025 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.834889 5025 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.834999 5025 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.835098 5025 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.835228 5025 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.835333 5025 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.835442 5025 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.835588 5025 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.835701 5025 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.835802 5025 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.835907 5025 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.836019 5025 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.837097 5025 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.838007 5025 server.go:1280] "Started kubelet"
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.842486 5025 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.843060 5025 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.843208 5025 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Oct 07 08:16:33 crc systemd[1]: Started Kubernetes Kubelet.
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.845214 5025 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.846555 5025 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.846646 5025 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.854439 5025 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 11:18:46.449085124 +0000 UTC
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.854525 5025 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1251h2m12.594564832s for next certificate rotation
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.855518 5025 volume_manager.go:287] "The desired_state_of_world populator starts"
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.855571 5025 volume_manager.go:289] "Starting Kubelet Volume Manager"
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.855711 5025 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.856790 5025 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused
Oct 07 08:16:33 crc kubenswrapper[5025]: E1007 08:16:33.856899 5025 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.173:6443: connect: connection refused" logger="UnhandledError"
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.857708 5025 server.go:460] "Adding debug handlers to kubelet server"
Oct 07 08:16:33 crc kubenswrapper[5025]: E1007 08:16:33.858042 5025 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.173:6443: connect: connection refused" interval="200ms"
Oct 07 08:16:33 crc kubenswrapper[5025]: E1007 08:16:33.858220 5025 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.861084 5025 factory.go:153] Registering CRI-O factory
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.861137 5025 factory.go:221] Registration of the crio container factory successfully
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.861283 5025 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.861317 5025 factory.go:55] Registering systemd factory
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.861336 5025 factory.go:221] Registration of the systemd container factory successfully
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.861394 5025 factory.go:103] Registering Raw factory
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.861442 5025 manager.go:1196] Started watching for new ooms in manager
Oct 07 08:16:33 crc kubenswrapper[5025]: E1007 08:16:33.861071 5025 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.173:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186c27820890ccef default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-07 08:16:33.837960431 +0000 UTC m=+0.647274615,LastTimestamp:2025-10-07 08:16:33.837960431 +0000 UTC m=+0.647274615,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.862746 5025 manager.go:319] Starting recovery of all containers
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.866279 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.866343 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.866362 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.866375 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.866388 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.866401 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.866415 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.866435 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.866456 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.866473 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.866488 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.866506 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.866521 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.866544 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.866601 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.866619 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.866633 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.866648 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.866661 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.866674 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.866707 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.866719 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.866733 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.866749 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.866761 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.866774 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.866985 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.866997 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867010 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867023 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867036 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867047 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867059 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867072 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867086 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867099 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867112 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867124 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867136 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867150 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867165 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867178 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867192 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867217 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867231 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867246 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867260 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867273 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867287 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867300 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867315 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867328 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867346 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867359 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867373 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867408 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867422 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867435 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867450 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867465 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867477 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867491 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867506 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867519 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867533 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867564 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867579 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867592 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867606 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867620 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867637 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867650 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867700 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867717 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867729 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867745 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867759 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867811 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867829 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867843 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867905 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd"
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867920 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867956 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867970 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.867987 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.868019 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.868036 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" 
seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.868049 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.868062 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.868096 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.868110 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.868145 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.868159 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: 
I1007 08:16:33.868193 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.868208 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.868242 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.868257 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.868274 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.868288 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.868300 5025 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.868314 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.868349 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.868362 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.868397 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.868421 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.868484 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.868521 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.868582 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.868599 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.868643 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.868657 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.868671 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.868684 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.868698 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.868713 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.870544 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.870637 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.878313 5025 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" 
deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.878403 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.878432 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.878465 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.878484 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.878521 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.878601 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" 
volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.878627 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.878654 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.878671 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.878698 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.878718 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.878737 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.878760 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.878778 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.878798 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.878838 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.878858 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.878883 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.878909 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.878931 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.878961 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.878980 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.879004 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.879022 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.879041 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.879065 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.879083 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.879109 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.879131 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.879148 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" 
Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.879171 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.879198 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.879226 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.879243 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.879262 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.879282 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.879301 5025 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.879319 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.879345 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.879364 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.879392 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.879412 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.879431 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.879458 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.879478 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.879505 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.879525 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.879565 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.879596 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.879618 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.879649 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.879672 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.879698 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.879727 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.879746 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.879773 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.879794 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.879816 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.879842 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.879862 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.879892 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" 
seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.879912 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.879933 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.879961 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.879979 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.879999 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.880027 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 
08:16:33.880047 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.880077 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.880102 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.880121 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.880150 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.880185 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.880218 5025 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.880236 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.880256 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.880289 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.880309 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.880336 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.880357 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.880377 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.880410 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.880429 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.880452 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.880471 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.880488 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" 
volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.880511 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.880528 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.880572 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.880595 5025 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.880611 5025 reconstruct.go:97] "Volume reconstruction finished" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.880623 5025 reconciler.go:26] "Reconciler: start to sync state" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.893313 5025 manager.go:324] Recovery completed Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.909936 5025 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.911026 5025 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.913204 5025 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.913271 5025 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.913308 5025 kubelet.go:2335] "Starting kubelet main sync loop" Oct 07 08:16:33 crc kubenswrapper[5025]: E1007 08:16:33.913457 5025 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.913304 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.913540 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.913579 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:16:33 crc kubenswrapper[5025]: W1007 08:16:33.914303 5025 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused Oct 07 08:16:33 crc kubenswrapper[5025]: E1007 08:16:33.914378 5025 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 
38.102.83.173:6443: connect: connection refused" logger="UnhandledError" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.914679 5025 cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.914717 5025 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.914746 5025 state_mem.go:36] "Initialized new in-memory state store" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.936002 5025 policy_none.go:49] "None policy: Start" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.937228 5025 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.937267 5025 state_mem.go:35] "Initializing new in-memory state store" Oct 07 08:16:33 crc kubenswrapper[5025]: E1007 08:16:33.959075 5025 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.991786 5025 manager.go:334] "Starting Device Plugin manager" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.991860 5025 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.991882 5025 server.go:79] "Starting device plugin registration server" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.992497 5025 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.992529 5025 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.992861 5025 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.992974 5025 plugin_manager.go:116] "The desired_state_of_world 
populator (plugin watcher) starts" Oct 07 08:16:33 crc kubenswrapper[5025]: I1007 08:16:33.992982 5025 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 07 08:16:34 crc kubenswrapper[5025]: E1007 08:16:34.002981 5025 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.014262 5025 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.014447 5025 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.019402 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.019490 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.019510 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.020176 5025 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.020580 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.020644 5025 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.021763 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.021793 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.021803 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.021934 5025 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.022221 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.022311 5025 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.023205 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.023234 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.023247 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.023285 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.023316 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.023327 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.023294 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.023382 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.023403 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.023591 5025 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 08:16:34 crc kubenswrapper[5025]: 
I1007 08:16:34.023670 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.023720 5025 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.024567 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.024600 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.024611 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.024761 5025 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.025052 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.025214 5025 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.025423 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.025444 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.025456 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.025848 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.026181 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.026200 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.026363 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.026385 5025 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.027048 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.027069 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.027081 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.027427 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.027448 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.027456 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:16:34 crc kubenswrapper[5025]: E1007 08:16:34.059302 5025 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.173:6443: connect: connection refused" interval="400ms" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.086566 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.086643 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.086672 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.086722 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.086749 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.086837 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 
08:16:34.086917 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.086947 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.086980 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.087002 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.087020 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.087037 5025 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.087055 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.087076 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.087167 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.093552 5025 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.094823 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.094863 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 
08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.094878 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.094910 5025 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 07 08:16:34 crc kubenswrapper[5025]: E1007 08:16:34.095367 5025 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.173:6443: connect: connection refused" node="crc" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.188962 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.189372 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.189413 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.189447 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.189199 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.189526 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.189451 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.189617 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.189660 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.189683 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod 
\"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.189709 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.189730 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.189746 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.189731 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.189764 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.189805 5025 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.189807 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.189836 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.189845 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.189879 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.189936 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: 
\"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.189903 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.190029 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.190084 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.190154 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.190182 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 08:16:34 crc 
kubenswrapper[5025]: I1007 08:16:34.190274 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.190282 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.190174 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.190321 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.296215 5025 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.299274 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.299328 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 
07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.299339 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.299376 5025 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 07 08:16:34 crc kubenswrapper[5025]: E1007 08:16:34.300119 5025 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.173:6443: connect: connection refused" node="crc" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.366442 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.385112 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.399657 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 08:16:34 crc kubenswrapper[5025]: W1007 08:16:34.404804 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-3f09d770ed315446f21d042bd1dd87064915e9bc48c5fe5a165a92602ea7fde0 WatchSource:0}: Error finding container 3f09d770ed315446f21d042bd1dd87064915e9bc48c5fe5a165a92602ea7fde0: Status 404 returned error can't find the container with id 3f09d770ed315446f21d042bd1dd87064915e9bc48c5fe5a165a92602ea7fde0 Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.408268 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.411256 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 08:16:34 crc kubenswrapper[5025]: W1007 08:16:34.433653 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-9de176c745408415d7d0997455ced936cd16ee1e5488ecbe2e9f99a698f05fad WatchSource:0}: Error finding container 9de176c745408415d7d0997455ced936cd16ee1e5488ecbe2e9f99a698f05fad: Status 404 returned error can't find the container with id 9de176c745408415d7d0997455ced936cd16ee1e5488ecbe2e9f99a698f05fad Oct 07 08:16:34 crc kubenswrapper[5025]: W1007 08:16:34.441219 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-8ab28dd11c9712e8ed69410f8b7f2f9a9219e0847248d41533a0ff394affc965 WatchSource:0}: Error finding container 8ab28dd11c9712e8ed69410f8b7f2f9a9219e0847248d41533a0ff394affc965: Status 404 returned error can't find the container with id 8ab28dd11c9712e8ed69410f8b7f2f9a9219e0847248d41533a0ff394affc965 Oct 07 08:16:34 crc kubenswrapper[5025]: E1007 08:16:34.461102 5025 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.173:6443: connect: connection refused" interval="800ms" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.700995 5025 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.703089 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.703145 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 
07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.703158 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.703191 5025 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 07 08:16:34 crc kubenswrapper[5025]: E1007 08:16:34.703827 5025 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.173:6443: connect: connection refused" node="crc" Oct 07 08:16:34 crc kubenswrapper[5025]: W1007 08:16:34.812939 5025 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused Oct 07 08:16:34 crc kubenswrapper[5025]: E1007 08:16:34.813092 5025 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.173:6443: connect: connection refused" logger="UnhandledError" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.844916 5025 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused Oct 07 08:16:34 crc kubenswrapper[5025]: W1007 08:16:34.845521 5025 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused Oct 07 08:16:34 crc kubenswrapper[5025]: E1007 08:16:34.845705 5025 
reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.173:6443: connect: connection refused" logger="UnhandledError" Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.919014 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a036658af22ba501845d084d78751054c54110e0be59bba896e4ed5c95709c35"} Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.922926 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d6d6ab3594e74f0539b74ddce76f378a1735c0265d81a2b6530163f6d4524313"} Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.926052 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3f09d770ed315446f21d042bd1dd87064915e9bc48c5fe5a165a92602ea7fde0"} Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.927651 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"8ab28dd11c9712e8ed69410f8b7f2f9a9219e0847248d41533a0ff394affc965"} Oct 07 08:16:34 crc kubenswrapper[5025]: I1007 08:16:34.928842 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"9de176c745408415d7d0997455ced936cd16ee1e5488ecbe2e9f99a698f05fad"} Oct 07 08:16:35 crc kubenswrapper[5025]: W1007 
08:16:35.142659 5025 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused Oct 07 08:16:35 crc kubenswrapper[5025]: E1007 08:16:35.143170 5025 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.173:6443: connect: connection refused" logger="UnhandledError" Oct 07 08:16:35 crc kubenswrapper[5025]: W1007 08:16:35.217003 5025 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused Oct 07 08:16:35 crc kubenswrapper[5025]: E1007 08:16:35.217154 5025 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.173:6443: connect: connection refused" logger="UnhandledError" Oct 07 08:16:35 crc kubenswrapper[5025]: E1007 08:16:35.262806 5025 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.173:6443: connect: connection refused" interval="1.6s" Oct 07 08:16:35 crc kubenswrapper[5025]: I1007 08:16:35.504774 5025 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 08:16:35 crc kubenswrapper[5025]: I1007 08:16:35.507177 5025 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:16:35 crc kubenswrapper[5025]: I1007 08:16:35.507237 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:16:35 crc kubenswrapper[5025]: I1007 08:16:35.507257 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:16:35 crc kubenswrapper[5025]: I1007 08:16:35.507300 5025 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 07 08:16:35 crc kubenswrapper[5025]: E1007 08:16:35.508137 5025 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.173:6443: connect: connection refused" node="crc" Oct 07 08:16:35 crc kubenswrapper[5025]: I1007 08:16:35.843312 5025 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused Oct 07 08:16:35 crc kubenswrapper[5025]: I1007 08:16:35.935811 5025 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="9d45b57ffbb81dd048d051a94a5c7371321a120b93c8b2b47643ff33acb73a79" exitCode=0 Oct 07 08:16:35 crc kubenswrapper[5025]: I1007 08:16:35.935983 5025 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 08:16:35 crc kubenswrapper[5025]: I1007 08:16:35.935981 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"9d45b57ffbb81dd048d051a94a5c7371321a120b93c8b2b47643ff33acb73a79"} Oct 07 08:16:35 crc kubenswrapper[5025]: I1007 08:16:35.937017 5025 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Oct 07 08:16:35 crc kubenswrapper[5025]: I1007 08:16:35.937064 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:16:35 crc kubenswrapper[5025]: I1007 08:16:35.937079 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:16:35 crc kubenswrapper[5025]: I1007 08:16:35.939071 5025 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="e520023c7c07f8525b1ac012f6ac7e385413ecd13b4db61a52862dfe77349140" exitCode=0 Oct 07 08:16:35 crc kubenswrapper[5025]: I1007 08:16:35.939140 5025 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 08:16:35 crc kubenswrapper[5025]: I1007 08:16:35.939153 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"e520023c7c07f8525b1ac012f6ac7e385413ecd13b4db61a52862dfe77349140"} Oct 07 08:16:35 crc kubenswrapper[5025]: I1007 08:16:35.940158 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:16:35 crc kubenswrapper[5025]: I1007 08:16:35.940200 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:16:35 crc kubenswrapper[5025]: I1007 08:16:35.940216 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:16:35 crc kubenswrapper[5025]: I1007 08:16:35.943483 5025 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 08:16:35 crc kubenswrapper[5025]: I1007 08:16:35.943503 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"848397bbf66b4edb13083c81defed30b9ddaff1a1636e2f1fe09b25c8ff99d96"} Oct 07 08:16:35 crc kubenswrapper[5025]: I1007 08:16:35.943587 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ceb048af5928a4b50cec5e93738471dd356f20cf57363fe085df0bb4216100f3"} Oct 07 08:16:35 crc kubenswrapper[5025]: I1007 08:16:35.943603 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5f31fa57b4eaaccec309a01396ec164dc6bda3e12a1086eda7a2d0eb3233eb7d"} Oct 07 08:16:35 crc kubenswrapper[5025]: I1007 08:16:35.943620 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5c6e7353bb02996d8525befd61dad46a459fca7713e536cec9b1fb9f47d99d08"} Oct 07 08:16:35 crc kubenswrapper[5025]: I1007 08:16:35.944587 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:16:35 crc kubenswrapper[5025]: I1007 08:16:35.944618 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:16:35 crc kubenswrapper[5025]: I1007 08:16:35.944629 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:16:35 crc kubenswrapper[5025]: I1007 08:16:35.947432 5025 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40" exitCode=0 Oct 07 08:16:35 crc kubenswrapper[5025]: I1007 08:16:35.947602 5025 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 08:16:35 crc kubenswrapper[5025]: I1007 08:16:35.948099 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40"} Oct 07 08:16:35 crc kubenswrapper[5025]: I1007 08:16:35.949238 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:16:35 crc kubenswrapper[5025]: I1007 08:16:35.949270 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:16:35 crc kubenswrapper[5025]: I1007 08:16:35.949281 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:16:35 crc kubenswrapper[5025]: I1007 08:16:35.957794 5025 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="5097cc8364707dbab8b1e64c98e38f6ca4d2ca6b09bc40b340c95a78d575d107" exitCode=0 Oct 07 08:16:35 crc kubenswrapper[5025]: I1007 08:16:35.957922 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"5097cc8364707dbab8b1e64c98e38f6ca4d2ca6b09bc40b340c95a78d575d107"} Oct 07 08:16:35 crc kubenswrapper[5025]: I1007 08:16:35.958098 5025 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 08:16:35 crc kubenswrapper[5025]: I1007 08:16:35.960176 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:16:35 crc kubenswrapper[5025]: I1007 08:16:35.960237 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:16:35 
crc kubenswrapper[5025]: I1007 08:16:35.960256 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:16:35 crc kubenswrapper[5025]: I1007 08:16:35.961633 5025 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 08:16:35 crc kubenswrapper[5025]: I1007 08:16:35.963276 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:16:35 crc kubenswrapper[5025]: I1007 08:16:35.963306 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:16:35 crc kubenswrapper[5025]: I1007 08:16:35.963317 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:16:36 crc kubenswrapper[5025]: I1007 08:16:36.213513 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 08:16:36 crc kubenswrapper[5025]: I1007 08:16:36.845057 5025 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused Oct 07 08:16:36 crc kubenswrapper[5025]: E1007 08:16:36.863798 5025 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.173:6443: connect: connection refused" interval="3.2s" Oct 07 08:16:36 crc kubenswrapper[5025]: I1007 08:16:36.964384 5025 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ee92939de135b9c7694cfc5a131e2be7a644b4388051f786fef00c10a285b317" exitCode=0 Oct 07 08:16:36 crc kubenswrapper[5025]: I1007 08:16:36.964481 5025 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ee92939de135b9c7694cfc5a131e2be7a644b4388051f786fef00c10a285b317"} Oct 07 08:16:36 crc kubenswrapper[5025]: I1007 08:16:36.964592 5025 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 08:16:36 crc kubenswrapper[5025]: I1007 08:16:36.965942 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:16:36 crc kubenswrapper[5025]: I1007 08:16:36.965983 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:16:36 crc kubenswrapper[5025]: I1007 08:16:36.965995 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:16:36 crc kubenswrapper[5025]: I1007 08:16:36.973828 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f4fd4a8d9d14761162acc1940fccd21269820427fce37c520cd9e5d6c0b7279e"} Oct 07 08:16:36 crc kubenswrapper[5025]: I1007 08:16:36.973902 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"919574d16a08760217aa219e673b02a059d96379aed81ad6248fb0623f672d06"} Oct 07 08:16:36 crc kubenswrapper[5025]: I1007 08:16:36.974128 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"cbd9a4d40a19f7571919228601897e666db895791868d65b27e90588add3daee"} Oct 07 08:16:36 crc kubenswrapper[5025]: I1007 08:16:36.974288 5025 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Oct 07 08:16:36 crc kubenswrapper[5025]: I1007 08:16:36.976505 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:16:36 crc kubenswrapper[5025]: I1007 08:16:36.976580 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:16:36 crc kubenswrapper[5025]: I1007 08:16:36.976596 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:16:36 crc kubenswrapper[5025]: I1007 08:16:36.981141 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"1a35a698f6fc982aef52becc7aaf63bb0cca25bf9dab6f23bcbd51712b3dbc89"} Oct 07 08:16:36 crc kubenswrapper[5025]: I1007 08:16:36.981251 5025 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 08:16:36 crc kubenswrapper[5025]: I1007 08:16:36.983890 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:16:36 crc kubenswrapper[5025]: I1007 08:16:36.983923 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:16:36 crc kubenswrapper[5025]: I1007 08:16:36.983934 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:16:36 crc kubenswrapper[5025]: I1007 08:16:36.990131 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"52afc5d765ed4544c55e6e522b190cb1c6788fc270ef55c8242f42b63c4d2702"} Oct 07 08:16:36 crc kubenswrapper[5025]: I1007 08:16:36.990205 5025 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"fdd1c01fb63c054d4e9e2b12ab9d335d5c6a79c3c62c96bcf3bca6a5f9a4b750"} Oct 07 08:16:36 crc kubenswrapper[5025]: I1007 08:16:36.990224 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"71ce5e9618132944289f6a22c7aeaee88704f6352fe3461c55de21681f406662"} Oct 07 08:16:36 crc kubenswrapper[5025]: I1007 08:16:36.990242 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"efea2132bc68971348829ebb4222793ae3ccf57d5eeee316aa81292cbb051594"} Oct 07 08:16:36 crc kubenswrapper[5025]: I1007 08:16:36.990260 5025 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 08:16:36 crc kubenswrapper[5025]: I1007 08:16:36.993980 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:16:36 crc kubenswrapper[5025]: I1007 08:16:36.994038 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:16:36 crc kubenswrapper[5025]: I1007 08:16:36.994059 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:16:37 crc kubenswrapper[5025]: W1007 08:16:37.086928 5025 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused Oct 07 08:16:37 crc kubenswrapper[5025]: E1007 08:16:37.087004 5025 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.173:6443: connect: connection refused" logger="UnhandledError" Oct 07 08:16:37 crc kubenswrapper[5025]: I1007 08:16:37.109152 5025 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 08:16:37 crc kubenswrapper[5025]: I1007 08:16:37.110487 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:16:37 crc kubenswrapper[5025]: I1007 08:16:37.110526 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:16:37 crc kubenswrapper[5025]: I1007 08:16:37.110542 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:16:37 crc kubenswrapper[5025]: I1007 08:16:37.110600 5025 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 07 08:16:37 crc kubenswrapper[5025]: E1007 08:16:37.111221 5025 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.173:6443: connect: connection refused" node="crc" Oct 07 08:16:37 crc kubenswrapper[5025]: W1007 08:16:37.331328 5025 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused Oct 07 08:16:37 crc kubenswrapper[5025]: E1007 08:16:37.331425 5025 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get 
\"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.173:6443: connect: connection refused" logger="UnhandledError" Oct 07 08:16:37 crc kubenswrapper[5025]: W1007 08:16:37.437928 5025 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused Oct 07 08:16:37 crc kubenswrapper[5025]: E1007 08:16:37.438013 5025 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.173:6443: connect: connection refused" logger="UnhandledError" Oct 07 08:16:37 crc kubenswrapper[5025]: I1007 08:16:37.997138 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5b579d694e2e51fb1d49b8d690d572518d7fcd3476d7e6941323b70987b8edfa"} Oct 07 08:16:37 crc kubenswrapper[5025]: I1007 08:16:37.997176 5025 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 08:16:37 crc kubenswrapper[5025]: I1007 08:16:37.998628 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:16:37 crc kubenswrapper[5025]: I1007 08:16:37.998685 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:16:37 crc kubenswrapper[5025]: I1007 08:16:37.998710 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:16:38 crc kubenswrapper[5025]: I1007 08:16:38.000629 5025 generic.go:334] 
"Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="660f67e0730fe6a0c6a2639e066ce18a9e221ba70404893763d0e60faf147b10" exitCode=0 Oct 07 08:16:38 crc kubenswrapper[5025]: I1007 08:16:38.000797 5025 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 08:16:38 crc kubenswrapper[5025]: I1007 08:16:38.000840 5025 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 08:16:38 crc kubenswrapper[5025]: I1007 08:16:38.000872 5025 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 08:16:38 crc kubenswrapper[5025]: I1007 08:16:38.000826 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"660f67e0730fe6a0c6a2639e066ce18a9e221ba70404893763d0e60faf147b10"} Oct 07 08:16:38 crc kubenswrapper[5025]: I1007 08:16:38.000972 5025 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 08:16:38 crc kubenswrapper[5025]: I1007 08:16:38.002158 5025 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 08:16:38 crc kubenswrapper[5025]: I1007 08:16:38.002431 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:16:38 crc kubenswrapper[5025]: I1007 08:16:38.002463 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:16:38 crc kubenswrapper[5025]: I1007 08:16:38.002471 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:16:38 crc kubenswrapper[5025]: I1007 08:16:38.002817 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:16:38 crc kubenswrapper[5025]: I1007 
08:16:38.002856 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:16:38 crc kubenswrapper[5025]: I1007 08:16:38.002903 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:16:38 crc kubenswrapper[5025]: I1007 08:16:38.002910 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:16:38 crc kubenswrapper[5025]: I1007 08:16:38.002950 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:16:38 crc kubenswrapper[5025]: I1007 08:16:38.002962 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:16:38 crc kubenswrapper[5025]: I1007 08:16:38.003414 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:16:38 crc kubenswrapper[5025]: I1007 08:16:38.003461 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:16:38 crc kubenswrapper[5025]: I1007 08:16:38.003481 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:16:39 crc kubenswrapper[5025]: I1007 08:16:39.006501 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2c65382b2ab640a58d09f36e6579a7e94bae0b9f16a675d66740226aaf2cc31b"} Oct 07 08:16:39 crc kubenswrapper[5025]: I1007 08:16:39.006577 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1001623b4801160d726efa289ef774cfd923daa67eb0bb261e40cfde1064cda7"} Oct 07 08:16:39 crc kubenswrapper[5025]: I1007 08:16:39.006592 5025 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2ddc6bc80726e8911355757ec683507663b96ba7a2d4f5c5f57d9f7b3292dd91"} Oct 07 08:16:39 crc kubenswrapper[5025]: I1007 08:16:39.006624 5025 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 08:16:39 crc kubenswrapper[5025]: I1007 08:16:39.006682 5025 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 08:16:39 crc kubenswrapper[5025]: I1007 08:16:39.008220 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:16:39 crc kubenswrapper[5025]: I1007 08:16:39.008253 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:16:39 crc kubenswrapper[5025]: I1007 08:16:39.008268 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:16:39 crc kubenswrapper[5025]: I1007 08:16:39.022202 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 08:16:39 crc kubenswrapper[5025]: I1007 08:16:39.039617 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 08:16:39 crc kubenswrapper[5025]: I1007 08:16:39.039776 5025 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 08:16:39 crc kubenswrapper[5025]: I1007 08:16:39.041022 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:16:39 crc kubenswrapper[5025]: I1007 08:16:39.041084 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:16:39 crc kubenswrapper[5025]: I1007 
08:16:39.041111 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:16:39 crc kubenswrapper[5025]: I1007 08:16:39.326791 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 08:16:39 crc kubenswrapper[5025]: I1007 08:16:39.917976 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 08:16:40 crc kubenswrapper[5025]: I1007 08:16:40.016746 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"850f4333eed8df71ad0a544dddc6f27c300e530b4bce84dbba7287280d4bde4f"} Oct 07 08:16:40 crc kubenswrapper[5025]: I1007 08:16:40.017606 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"af26657a576dc304410891b4adb5fb9b9e1fa162746eb718e94041fd3708e1f7"} Oct 07 08:16:40 crc kubenswrapper[5025]: I1007 08:16:40.016966 5025 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 08:16:40 crc kubenswrapper[5025]: I1007 08:16:40.016807 5025 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 08:16:40 crc kubenswrapper[5025]: I1007 08:16:40.017741 5025 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 08:16:40 crc kubenswrapper[5025]: I1007 08:16:40.017038 5025 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 08:16:40 crc kubenswrapper[5025]: I1007 08:16:40.018612 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:16:40 crc kubenswrapper[5025]: I1007 08:16:40.018640 5025 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:16:40 crc kubenswrapper[5025]: I1007 08:16:40.018650 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:16:40 crc kubenswrapper[5025]: I1007 08:16:40.019713 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:16:40 crc kubenswrapper[5025]: I1007 08:16:40.019760 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:16:40 crc kubenswrapper[5025]: I1007 08:16:40.019773 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:16:40 crc kubenswrapper[5025]: I1007 08:16:40.019872 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:16:40 crc kubenswrapper[5025]: I1007 08:16:40.019890 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:16:40 crc kubenswrapper[5025]: I1007 08:16:40.019900 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:16:40 crc kubenswrapper[5025]: I1007 08:16:40.311935 5025 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 08:16:40 crc kubenswrapper[5025]: I1007 08:16:40.314006 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:16:40 crc kubenswrapper[5025]: I1007 08:16:40.314100 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:16:40 crc kubenswrapper[5025]: I1007 08:16:40.314162 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 
08:16:40 crc kubenswrapper[5025]: I1007 08:16:40.314234 5025 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 07 08:16:41 crc kubenswrapper[5025]: I1007 08:16:41.020418 5025 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 08:16:41 crc kubenswrapper[5025]: I1007 08:16:41.020488 5025 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 08:16:41 crc kubenswrapper[5025]: I1007 08:16:41.020598 5025 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 08:16:41 crc kubenswrapper[5025]: I1007 08:16:41.022129 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:16:41 crc kubenswrapper[5025]: I1007 08:16:41.022163 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:16:41 crc kubenswrapper[5025]: I1007 08:16:41.022178 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:16:41 crc kubenswrapper[5025]: I1007 08:16:41.022375 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:16:41 crc kubenswrapper[5025]: I1007 08:16:41.022450 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:16:41 crc kubenswrapper[5025]: I1007 08:16:41.022471 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:16:41 crc kubenswrapper[5025]: I1007 08:16:41.277046 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 07 08:16:41 crc kubenswrapper[5025]: I1007 08:16:41.842301 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 
08:16:41 crc kubenswrapper[5025]: I1007 08:16:41.842589 5025 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 08:16:41 crc kubenswrapper[5025]: I1007 08:16:41.844243 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:16:41 crc kubenswrapper[5025]: I1007 08:16:41.844318 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:16:41 crc kubenswrapper[5025]: I1007 08:16:41.844336 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:16:41 crc kubenswrapper[5025]: I1007 08:16:41.918968 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 07 08:16:42 crc kubenswrapper[5025]: I1007 08:16:42.027940 5025 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 08:16:42 crc kubenswrapper[5025]: I1007 08:16:42.030021 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:16:42 crc kubenswrapper[5025]: I1007 08:16:42.030074 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:16:42 crc kubenswrapper[5025]: I1007 08:16:42.030093 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:16:42 crc kubenswrapper[5025]: I1007 08:16:42.282352 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 08:16:42 crc kubenswrapper[5025]: I1007 08:16:42.282698 5025 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 08:16:42 crc kubenswrapper[5025]: I1007 08:16:42.284202 5025 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:16:42 crc kubenswrapper[5025]: I1007 08:16:42.284235 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:16:42 crc kubenswrapper[5025]: I1007 08:16:42.284247 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:16:42 crc kubenswrapper[5025]: I1007 08:16:42.918097 5025 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 07 08:16:42 crc kubenswrapper[5025]: I1007 08:16:42.918220 5025 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 08:16:43 crc kubenswrapper[5025]: I1007 08:16:43.030814 5025 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 08:16:43 crc kubenswrapper[5025]: I1007 08:16:43.032661 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:16:43 crc kubenswrapper[5025]: I1007 08:16:43.032764 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:16:43 crc kubenswrapper[5025]: I1007 08:16:43.032793 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:16:44 crc kubenswrapper[5025]: E1007 08:16:44.003236 5025 eviction_manager.go:285] "Eviction manager: 
failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 07 08:16:44 crc kubenswrapper[5025]: I1007 08:16:44.510310 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 08:16:44 crc kubenswrapper[5025]: I1007 08:16:44.510609 5025 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 08:16:44 crc kubenswrapper[5025]: I1007 08:16:44.512330 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:16:44 crc kubenswrapper[5025]: I1007 08:16:44.512405 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:16:44 crc kubenswrapper[5025]: I1007 08:16:44.512419 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:16:44 crc kubenswrapper[5025]: I1007 08:16:44.517752 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 08:16:45 crc kubenswrapper[5025]: I1007 08:16:45.036675 5025 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 08:16:45 crc kubenswrapper[5025]: I1007 08:16:45.038635 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:16:45 crc kubenswrapper[5025]: I1007 08:16:45.038755 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:16:45 crc kubenswrapper[5025]: I1007 08:16:45.038828 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:16:45 crc kubenswrapper[5025]: I1007 08:16:45.044306 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 08:16:46 crc kubenswrapper[5025]: I1007 08:16:46.039752 5025 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 08:16:46 crc kubenswrapper[5025]: I1007 08:16:46.041112 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:16:46 crc kubenswrapper[5025]: I1007 08:16:46.041168 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:16:46 crc kubenswrapper[5025]: I1007 08:16:46.041189 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:16:47 crc kubenswrapper[5025]: W1007 08:16:47.703136 5025 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 07 08:16:47 crc kubenswrapper[5025]: I1007 08:16:47.703289 5025 trace.go:236] Trace[2004792053]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (07-Oct-2025 08:16:37.701) (total time: 10001ms): Oct 07 08:16:47 crc kubenswrapper[5025]: Trace[2004792053]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (08:16:47.703) Oct 07 08:16:47 crc kubenswrapper[5025]: Trace[2004792053]: [10.001465067s] [10.001465067s] END Oct 07 08:16:47 crc kubenswrapper[5025]: E1007 08:16:47.703327 5025 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS 
handshake timeout" logger="UnhandledError" Oct 07 08:16:47 crc kubenswrapper[5025]: I1007 08:16:47.844886 5025 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Oct 07 08:16:48 crc kubenswrapper[5025]: I1007 08:16:48.177301 5025 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Oct 07 08:16:48 crc kubenswrapper[5025]: I1007 08:16:48.177401 5025 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 07 08:16:48 crc kubenswrapper[5025]: I1007 08:16:48.181616 5025 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Oct 07 08:16:48 crc kubenswrapper[5025]: I1007 08:16:48.181697 5025 prober.go:107] 
"Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 07 08:16:49 crc kubenswrapper[5025]: I1007 08:16:49.029337 5025 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 07 08:16:49 crc kubenswrapper[5025]: [+]log ok Oct 07 08:16:49 crc kubenswrapper[5025]: [+]etcd ok Oct 07 08:16:49 crc kubenswrapper[5025]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Oct 07 08:16:49 crc kubenswrapper[5025]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Oct 07 08:16:49 crc kubenswrapper[5025]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 07 08:16:49 crc kubenswrapper[5025]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 07 08:16:49 crc kubenswrapper[5025]: [+]poststarthook/openshift.io-api-request-count-filter ok Oct 07 08:16:49 crc kubenswrapper[5025]: [+]poststarthook/openshift.io-startkubeinformers ok Oct 07 08:16:49 crc kubenswrapper[5025]: [+]poststarthook/generic-apiserver-start-informers ok Oct 07 08:16:49 crc kubenswrapper[5025]: [+]poststarthook/priority-and-fairness-config-consumer ok Oct 07 08:16:49 crc kubenswrapper[5025]: [+]poststarthook/priority-and-fairness-filter ok Oct 07 08:16:49 crc kubenswrapper[5025]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 07 08:16:49 crc kubenswrapper[5025]: [+]poststarthook/start-apiextensions-informers ok Oct 07 08:16:49 crc kubenswrapper[5025]: [+]poststarthook/start-apiextensions-controllers ok Oct 07 08:16:49 crc kubenswrapper[5025]: [+]poststarthook/crd-informer-synced ok Oct 07 08:16:49 crc kubenswrapper[5025]: [+]poststarthook/start-system-namespaces-controller ok Oct 07 08:16:49 crc kubenswrapper[5025]: 
[+]poststarthook/start-cluster-authentication-info-controller ok Oct 07 08:16:49 crc kubenswrapper[5025]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Oct 07 08:16:49 crc kubenswrapper[5025]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Oct 07 08:16:49 crc kubenswrapper[5025]: [+]poststarthook/start-legacy-token-tracking-controller ok Oct 07 08:16:49 crc kubenswrapper[5025]: [+]poststarthook/start-service-ip-repair-controllers ok Oct 07 08:16:49 crc kubenswrapper[5025]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Oct 07 08:16:49 crc kubenswrapper[5025]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Oct 07 08:16:49 crc kubenswrapper[5025]: [+]poststarthook/priority-and-fairness-config-producer ok Oct 07 08:16:49 crc kubenswrapper[5025]: [+]poststarthook/bootstrap-controller ok Oct 07 08:16:49 crc kubenswrapper[5025]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Oct 07 08:16:49 crc kubenswrapper[5025]: [+]poststarthook/start-kube-aggregator-informers ok Oct 07 08:16:49 crc kubenswrapper[5025]: [+]poststarthook/apiservice-status-local-available-controller ok Oct 07 08:16:49 crc kubenswrapper[5025]: [+]poststarthook/apiservice-status-remote-available-controller ok Oct 07 08:16:49 crc kubenswrapper[5025]: [+]poststarthook/apiservice-registration-controller ok Oct 07 08:16:49 crc kubenswrapper[5025]: [+]poststarthook/apiservice-wait-for-first-sync ok Oct 07 08:16:49 crc kubenswrapper[5025]: [+]poststarthook/apiservice-discovery-controller ok Oct 07 08:16:49 crc kubenswrapper[5025]: [+]poststarthook/kube-apiserver-autoregistration ok Oct 07 08:16:49 crc kubenswrapper[5025]: [+]autoregister-completion ok Oct 07 08:16:49 crc kubenswrapper[5025]: [+]poststarthook/apiservice-openapi-controller ok Oct 07 08:16:49 crc kubenswrapper[5025]: [+]poststarthook/apiservice-openapiv3-controller ok Oct 07 08:16:49 crc kubenswrapper[5025]: livez check failed Oct 07 
08:16:49 crc kubenswrapper[5025]: I1007 08:16:49.029410 5025 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 07 08:16:51 crc kubenswrapper[5025]: I1007 08:16:51.852525 5025 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Oct 07 08:16:51 crc kubenswrapper[5025]: I1007 08:16:51.952879 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Oct 07 08:16:51 crc kubenswrapper[5025]: I1007 08:16:51.953113 5025 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 08:16:51 crc kubenswrapper[5025]: I1007 08:16:51.954764 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:16:51 crc kubenswrapper[5025]: I1007 08:16:51.954832 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:16:51 crc kubenswrapper[5025]: I1007 08:16:51.954852 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:16:51 crc kubenswrapper[5025]: I1007 08:16:51.971054 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Oct 07 08:16:52 crc kubenswrapper[5025]: I1007 08:16:52.057173 5025 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 07 08:16:52 crc kubenswrapper[5025]: I1007 08:16:52.058991 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:16:52 crc kubenswrapper[5025]: I1007 08:16:52.059058 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:16:52 crc kubenswrapper[5025]: I1007 08:16:52.059090 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:16:52 crc kubenswrapper[5025]: I1007 08:16:52.918777 5025 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Oct 07 08:16:52 crc kubenswrapper[5025]: I1007 08:16:52.918891 5025 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Oct 07 08:16:53 crc kubenswrapper[5025]: E1007 08:16:53.163358 5025 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s"
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.164956 5025 trace.go:236] Trace[1635445690]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (07-Oct-2025 08:16:41.337) (total time: 11827ms):
Oct 07 08:16:53 crc kubenswrapper[5025]: Trace[1635445690]: ---"Objects listed" error: 11827ms (08:16:53.164)
Oct 07 08:16:53 crc kubenswrapper[5025]: Trace[1635445690]: [11.827398541s] [11.827398541s] END
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.164984 5025 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.166517 5025 trace.go:236] Trace[1188423365]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (07-Oct-2025 08:16:42.065) (total time: 11100ms):
Oct 07 08:16:53 crc kubenswrapper[5025]: Trace[1188423365]: ---"Objects listed" error: 11100ms (08:16:53.166)
Oct 07 08:16:53 crc kubenswrapper[5025]: Trace[1188423365]: [11.100523745s] [11.100523745s] END
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.166902 5025 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.166735 5025 trace.go:236] Trace[1594971017]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (07-Oct-2025 08:16:43.139) (total time: 10027ms):
Oct 07 08:16:53 crc kubenswrapper[5025]: Trace[1594971017]: ---"Objects listed" error: 10027ms (08:16:53.166)
Oct 07 08:16:53 crc kubenswrapper[5025]: Trace[1594971017]: [10.027354338s] [10.027354338s] END
Oct 07 08:16:53 crc kubenswrapper[5025]: E1007 08:16:53.167058 5025 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc"
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.167070 5025 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.168755 5025 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.220399 5025 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:44410->192.168.126.11:17697: read: connection reset by peer" start-of-body=
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.220486 5025 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:44410->192.168.126.11:17697: read: connection reset by peer"
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.220379 5025 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:44400->192.168.126.11:17697: read: connection reset by peer" start-of-body=
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.220609 5025 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:44400->192.168.126.11:17697: read: connection reset by peer"
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.828865 5025 apiserver.go:52] "Watching apiserver"
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.835890 5025 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.836388 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c"]
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.837160 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.837457 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 08:16:53 crc kubenswrapper[5025]: E1007 08:16:53.837578 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.837683 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.837847 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.838299 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 08:16:53 crc kubenswrapper[5025]: E1007 08:16:53.838393 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.839732 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 08:16:53 crc kubenswrapper[5025]: E1007 08:16:53.839806 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.840496 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.840867 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.841019 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.841088 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.841278 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.841616 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.842317 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.842634 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.843365 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.856863 5025 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.865671 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.873668 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.873722 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.873748 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.873777 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.873801 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.873825 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.873847 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.873870 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.873895 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.873920 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.873943 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.873967 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.873988 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.874009 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.874034 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.874056 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.874080 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.874106 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.874129 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.874155 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.874186 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.874209 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.874233 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.874254 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.874276 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.874301 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.874323 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.874343 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.874364 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.874386 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.874422 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.874496 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.874144 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.874533 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.874593 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.874190 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.874188 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.874629 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.874666 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.874697 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.874734 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.874770 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.874799 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.874826 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.874861 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.874891 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.874917 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.874944 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.874211 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.875019 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.874471 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.874528 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.875052 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.874690 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.874736 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.874761 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.874785 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.875003 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.875056 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.875170 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.875196 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: 
\"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.875217 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.875296 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.875315 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.875331 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.875347 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.875363 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.875384 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.875404 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.875425 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.875447 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.875465 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: 
\"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.875483 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.875503 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.875522 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.875556 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.875582 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.875600 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.875618 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.875637 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.875653 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.875668 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.875685 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 
08:16:53.875702 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.875727 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.875745 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.875766 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.875783 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.875800 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: 
\"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.875816 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.875834 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.875850 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.875872 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.875889 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 07 08:16:53 crc 
kubenswrapper[5025]: I1007 08:16:53.875910 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.875928 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.875946 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.875962 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.875979 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.875998 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: 
\"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.876015 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.876032 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.876048 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.876067 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.876084 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.876101 5025 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.876118 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.876136 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.876156 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.876176 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.876195 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" 
(UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.876211 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.876230 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.876254 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.876271 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.876288 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.876307 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.875110 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.875370 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.875384 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.875397 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.875676 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.875686 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.875728 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.875895 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.875943 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.875997 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.876217 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.876485 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.876499 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.876585 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.876803 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.876810 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.876855 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.877039 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.877044 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.876276 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.876351 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.877197 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.877442 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.877455 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.877590 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.877682 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.877739 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.878093 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.876224 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.878382 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.878403 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.878421 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.879431 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.879445 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.879476 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: E1007 08:16:53.880246 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:16:54.380218589 +0000 UTC m=+21.189532753 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.880806 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.880914 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.881004 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.881084 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.881273 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.881291 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.881297 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.881383 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.882481 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.882889 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.883019 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.883214 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.883213 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.876360 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.883957 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.884017 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.884065 5025 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.884101 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.884144 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.884180 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.884219 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.884255 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.884296 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.884366 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.884407 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.884443 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.884479 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.884517 5025 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.884577 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.884790 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.884818 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.884844 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.884873 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: 
\"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.884899 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.884926 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.884951 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.884976 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.885002 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.885026 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.885052 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.885076 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.885104 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.885130 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.885152 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.885176 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.885197 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.885225 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.885252 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.885274 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " 
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.885297 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.885320 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.885346 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.885369 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.885393 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.885419 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.885450 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.885486 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.885518 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.885579 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.885607 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" 
(UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.885632 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.885660 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.885683 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.884022 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.885708 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.884219 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.884262 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.885736 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.885800 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.885835 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.885865 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.885901 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.885930 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.885954 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.885984 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.886012 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.886038 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.886064 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:16:53 crc 
kubenswrapper[5025]: I1007 08:16:53.886088 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.886111 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.886138 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.886162 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.886191 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.886227 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.886258 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.886273 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.886285 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.886318 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.886346 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: 
\"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.886375 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.886399 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.886439 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.886464 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.886493 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.886390 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.886516 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.886565 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.886590 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.886616 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.886643 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod 
\"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.886672 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.886686 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.886734 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.886764 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.886791 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " 
Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.886817 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.886842 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.886866 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.886893 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.886963 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.886994 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.887025 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.887055 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.887083 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.887109 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " 
pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.887194 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.887237 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.887266 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.887292 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.887316 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod 
\"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.887344 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.887370 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.887400 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.887634 5025 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.887661 5025 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.887676 5025 
reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.887692 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.887706 5025 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.887724 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.887737 5025 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.887752 5025 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.887765 5025 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.887782 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: 
\"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.887798 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.887812 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.887825 5025 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.887839 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.887853 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.887866 5025 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.887880 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: 
\"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.887894 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.887907 5025 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.887923 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.887938 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.887951 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.887964 5025 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.887977 5025 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") 
on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.887990 5025 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.888004 5025 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.888018 5025 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.888031 5025 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.888043 5025 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.888057 5025 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.888071 5025 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.888084 5025 reconciler_common.go:293] "Volume detached for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.888099 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.888112 5025 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.888126 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.888141 5025 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.888154 5025 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.888168 5025 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.888182 5025 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.888199 5025 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.888219 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.888238 5025 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.888251 5025 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.888264 5025 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.888277 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.888291 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: 
\"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.888304 5025 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.888318 5025 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.888331 5025 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.888373 5025 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.888387 5025 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.888401 5025 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.888414 5025 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on 
node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.888426 5025 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.888442 5025 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.888471 5025 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.888486 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.888499 5025 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.888513 5025 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.888528 5025 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.888562 5025 reconciler_common.go:293] 
"Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.888575 5025 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.888587 5025 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.888602 5025 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.888615 5025 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: E1007 08:16:53.889360 5025 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 08:16:53 crc kubenswrapper[5025]: E1007 08:16:53.889483 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 08:16:54.389450789 +0000 UTC m=+21.198764943 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 08:16:53 crc kubenswrapper[5025]: E1007 08:16:53.889868 5025 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 08:16:53 crc kubenswrapper[5025]: E1007 08:16:53.889940 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 08:16:54.389923215 +0000 UTC m=+21.199237369 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.890311 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.891816 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod 
\"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.892407 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.892592 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.892758 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.892521 5025 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.893451 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.894217 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.895218 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.896728 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.897024 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.897448 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.897842 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.899192 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.899302 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.899660 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.899677 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.900115 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.900266 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.900445 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.900995 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.901223 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.902654 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.903325 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.905133 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.905187 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.905779 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.906055 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.906298 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.906608 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.906866 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.907081 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.907487 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.907645 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.908299 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.908347 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.908354 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.908422 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.913658 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.913739 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.913873 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.913906 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.916094 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: E1007 08:16:53.916189 5025 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 08:16:53 crc kubenswrapper[5025]: E1007 08:16:53.916499 5025 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 08:16:53 crc kubenswrapper[5025]: E1007 08:16:53.916573 5025 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.916582 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: E1007 08:16:53.916692 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 08:16:54.416652891 +0000 UTC m=+21.225967035 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.916791 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.917020 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.917406 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.917621 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.917603 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.917665 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.917692 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.916025 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.914041 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.918187 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.918184 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.918268 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.918495 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.914156 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.919140 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.920719 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 08:16:53 crc kubenswrapper[5025]: E1007 08:16:53.921262 5025 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 08:16:53 crc kubenswrapper[5025]: E1007 08:16:53.921299 5025 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 08:16:53 crc kubenswrapper[5025]: E1007 08:16:53.921323 5025 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 08:16:53 crc kubenswrapper[5025]: E1007 08:16:53.921414 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 
nodeName:}" failed. No retries permitted until 2025-10-07 08:16:54.42138342 +0000 UTC m=+21.230697584 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.921829 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.925318 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.925886 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.925983 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.926156 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.926321 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.926412 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.926704 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.926729 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.926783 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.926871 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.926445 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.927064 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.926906 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.927591 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.927592 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.927646 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.928178 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.928046 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.929093 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.929926 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.930022 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.930277 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.930078 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.930091 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.930137 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.930519 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.930601 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.931103 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.931500 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.931560 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.931491 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.931713 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.931927 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.932053 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.932070 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.932588 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.934581 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.934691 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.934887 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.934903 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.935302 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.935367 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.935321 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.935398 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.935646 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.935706 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.935750 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.935914 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.936517 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.936558 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.936611 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.937184 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.937222 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.937695 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.937757 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.938218 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.938446 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.938713 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.939172 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.939286 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.939783 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.939916 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.940566 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.940455 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.940867 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.940909 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.941021 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.941056 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.941319 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.941537 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.941820 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.941954 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.942475 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.942822 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.943272 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.943468 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.943610 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.943768 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.944300 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.944368 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.944482 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.945110 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.945177 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.945663 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.952189 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.955087 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.955593 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.956389 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.959891 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.964015 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.965465 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.966980 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.967663 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.969045 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.969603 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.970403 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.971559 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.977885 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.978150 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.978932 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" 
path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.979873 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.980001 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.981073 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.981688 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.982438 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.983866 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.984509 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" 
path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.985927 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.986511 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.986593 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.987483 5025 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.987625 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.989022 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.989192 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.989295 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.989648 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.989715 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.989801 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.989877 5025 reconciler_common.go:293] "Volume detached for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.990196 5025 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.990206 5025 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.990214 5025 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.990223 5025 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.990232 5025 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.990241 5025 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.990251 5025 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") 
on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.990255 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.990260 5025 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.990368 5025 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.990379 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.990388 5025 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.990397 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.990407 5025 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.990414 5025 reconciler_common.go:293] 
"Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.990424 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.990434 5025 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.990444 5025 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.990452 5025 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.990460 5025 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.990469 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.990478 5025 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.990486 5025 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.990495 5025 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.990621 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.990632 5025 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.990641 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.990650 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.990658 5025 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.990666 5025 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.990674 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.990684 5025 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.990693 5025 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.990702 5025 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.990710 5025 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.990718 5025 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc 
kubenswrapper[5025]: I1007 08:16:53.990727 5025 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.990736 5025 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.990745 5025 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.990753 5025 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.990761 5025 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.990769 5025 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.990777 5025 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.990786 5025 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.990795 5025 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.990804 5025 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.990814 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.990824 5025 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.990833 5025 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.991030 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.991042 5025 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.991051 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.991059 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.991068 5025 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.991076 5025 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.991085 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.991093 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.991102 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: 
\"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.991110 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.991118 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.991128 5025 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.991137 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.991675 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.991687 5025 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.991696 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: 
\"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.991705 5025 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.991208 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.992591 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.992668 5025 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.992735 5025 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.992789 5025 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.992841 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc 
kubenswrapper[5025]: I1007 08:16:53.992892 5025 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.992942 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.992992 5025 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.993047 5025 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.993098 5025 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.993148 5025 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.993201 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.993254 5025 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.993304 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.993359 5025 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.993420 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.993485 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.993562 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.993625 5025 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.993686 5025 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.993744 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.993799 5025 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.993856 5025 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.993384 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.994018 5025 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.994146 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.994204 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: 
\"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.994254 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.994306 5025 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.994364 5025 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.994420 5025 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.994476 5025 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.994532 5025 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.994617 5025 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 07 
08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.994672 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.994722 5025 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.994779 5025 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.994845 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.994899 5025 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.994958 5025 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.995020 5025 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.995074 5025 reconciler_common.go:293] 
"Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.995133 5025 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.995187 5025 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.995267 5025 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.995343 5025 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.994799 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.995408 5025 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.995491 5025 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.995524 5025 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.995567 5025 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.995581 5025 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.995591 5025 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.995601 5025 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.995649 5025 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.995660 5025 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node 
\"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.995672 5025 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.995682 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.995691 5025 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.995700 5025 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.995710 5025 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.995719 5025 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.995727 5025 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.995736 5025 reconciler_common.go:293] 
"Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.995745 5025 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.996189 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.997294 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.998132 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.999070 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.999591 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 08:16:53 crc kubenswrapper[5025]: I1007 08:16:53.999743 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.000885 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.001960 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.002487 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" 
path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.003560 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.004159 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.005440 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.006041 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.006646 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.007581 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.008201 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.008604 5025 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.009406 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.010111 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.024925 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.027591 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.028926 5025 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.029187 5025 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.039367 5025 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.042211 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.048624 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.053255 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.073261 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.078211 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.086814 5025 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5b579d694e2e51fb1d49b8d690d572518d7fcd3476d7e6941323b70987b8edfa" exitCode=255 Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.086867 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"5b579d694e2e51fb1d49b8d690d572518d7fcd3476d7e6941323b70987b8edfa"} Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.097966 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 08:16:54 crc kubenswrapper[5025]: E1007 08:16:54.098613 5025 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-crc\" already exists" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.098906 5025 scope.go:117] "RemoveContainer" containerID="5b579d694e2e51fb1d49b8d690d572518d7fcd3476d7e6941323b70987b8edfa" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.118388 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.148960 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.151695 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.159045 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.166735 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.182280 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 08:16:54 crc kubenswrapper[5025]: W1007 08:16:54.195984 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-b5b71f879e165d57f1fe861b706820355c3bdb6053a9eaec96c31b7db793748e WatchSource:0}: Error finding container b5b71f879e165d57f1fe861b706820355c3bdb6053a9eaec96c31b7db793748e: Status 404 returned error can't find the container with id 
b5b71f879e165d57f1fe861b706820355c3bdb6053a9eaec96c31b7db793748e Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.207242 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.233968 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.259251 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.305892 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90a24131-77fe-4045-9cea-eebd7e122243\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efea2132bc68971348829ebb4222793ae3ccf57d5eeee316aa81292cbb051594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1c01fb63c054d4e9e2b12ab9d335d5c6a79c3c62c96bcf3bca6a5f9a4b750\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ce5e9618132944289f6a22c7aeaee88704f6352fe3461c55de21681f406662\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b579d694e2e51fb1d49b8d690d572518d7fcd3476d7e6941323b70987b8edfa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b579d694e2e51fb1d49b8d690d572518d7fcd3476d7e6941323b70987b8edfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 08:16:47.574279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 08:16:47.577103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2852886688/tls.crt::/tmp/serving-cert-2852886688/tls.key\\\\\\\"\\\\nI1007 08:16:53.183291 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 08:16:53.189611 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 08:16:53.189643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 08:16:53.189670 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 08:16:53.189676 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 08:16:53.204072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 08:16:53.204130 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204143 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204157 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 08:16:53.204166 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 08:16:53.204173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 08:16:53.204182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 08:16:53.204715 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 08:16:53.207373 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52afc5d765ed4544c55e6e522b190cb1c6788fc270ef55c8242f42b63c4d2702\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.321015 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.347714 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.361878 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.400881 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.401020 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.401050 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:16:54 crc kubenswrapper[5025]: E1007 08:16:54.401147 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:16:55.401113977 +0000 UTC m=+22.210428121 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:16:54 crc kubenswrapper[5025]: E1007 08:16:54.401161 5025 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 08:16:54 crc kubenswrapper[5025]: E1007 08:16:54.401192 5025 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 08:16:54 crc kubenswrapper[5025]: E1007 08:16:54.401229 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 08:16:55.401216961 +0000 UTC m=+22.210531095 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 08:16:54 crc kubenswrapper[5025]: E1007 08:16:54.401263 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-07 08:16:55.401242881 +0000 UTC m=+22.210557195 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.501559 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.501613 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:16:54 crc kubenswrapper[5025]: E1007 08:16:54.501824 5025 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 08:16:54 crc kubenswrapper[5025]: E1007 08:16:54.501877 5025 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 08:16:54 crc kubenswrapper[5025]: E1007 08:16:54.501892 5025 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 08:16:54 crc kubenswrapper[5025]: E1007 08:16:54.501830 5025 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 08:16:54 crc kubenswrapper[5025]: E1007 08:16:54.501966 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 08:16:55.501943838 +0000 UTC m=+22.311257982 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 08:16:54 crc kubenswrapper[5025]: E1007 08:16:54.501981 5025 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 08:16:54 crc kubenswrapper[5025]: E1007 08:16:54.502001 5025 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 08:16:54 crc kubenswrapper[5025]: E1007 08:16:54.502080 5025 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 08:16:55.502051003 +0000 UTC m=+22.311365157 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.644569 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-hc88w"] Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.644934 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-hc88w" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.648627 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.648696 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.651935 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.670210 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.683351 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.703226 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swvcf\" (UniqueName: \"kubernetes.io/projected/2744919e-82fb-4c3c-8776-c2c9c44af6e1-kube-api-access-swvcf\") pod \"node-resolver-hc88w\" (UID: \"2744919e-82fb-4c3c-8776-c2c9c44af6e1\") " pod="openshift-dns/node-resolver-hc88w" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.703308 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2744919e-82fb-4c3c-8776-c2c9c44af6e1-hosts-file\") pod \"node-resolver-hc88w\" (UID: \"2744919e-82fb-4c3c-8776-c2c9c44af6e1\") " pod="openshift-dns/node-resolver-hc88w" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.704649 5025 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.718386 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hc88w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2744919e-82fb-4c3c-8776-c2c9c44af6e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:54Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swvcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hc88w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.730309 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90a24131-77fe-4045-9cea-eebd7e122243\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efea2132bc68971348829ebb4222793ae3ccf57d5eeee316aa81292cbb051594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1c01fb63c054d4e9e2b12ab9d335d5c6a79c3c62c96bcf3bca6a5f9a4b750\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ce5e9618132944289f6a22c7aeaee88704f6352fe3461c55de21681f406662\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b579d694e2e51fb1d49b8d690d572518d7fcd3476d7e6941323b70987b8edfa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b579d694e2e51fb1d49b8d690d572518d7fcd3476d7e6941323b70987b8edfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 08:16:47.574279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 08:16:47.577103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2852886688/tls.crt::/tmp/serving-cert-2852886688/tls.key\\\\\\\"\\\\nI1007 08:16:53.183291 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 08:16:53.189611 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 08:16:53.189643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 08:16:53.189670 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 08:16:53.189676 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 08:16:53.204072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 08:16:53.204130 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204143 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204157 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 08:16:53.204166 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 08:16:53.204173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 08:16:53.204182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 08:16:53.204715 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 08:16:53.207373 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52afc5d765ed4544c55e6e522b190cb1c6788fc270ef55c8242f42b63c4d2702\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.740828 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.754418 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.764369 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.804750 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2744919e-82fb-4c3c-8776-c2c9c44af6e1-hosts-file\") pod \"node-resolver-hc88w\" (UID: \"2744919e-82fb-4c3c-8776-c2c9c44af6e1\") " pod="openshift-dns/node-resolver-hc88w" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.804795 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swvcf\" (UniqueName: \"kubernetes.io/projected/2744919e-82fb-4c3c-8776-c2c9c44af6e1-kube-api-access-swvcf\") pod \"node-resolver-hc88w\" 
(UID: \"2744919e-82fb-4c3c-8776-c2c9c44af6e1\") " pod="openshift-dns/node-resolver-hc88w" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.805220 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2744919e-82fb-4c3c-8776-c2c9c44af6e1-hosts-file\") pod \"node-resolver-hc88w\" (UID: \"2744919e-82fb-4c3c-8776-c2c9c44af6e1\") " pod="openshift-dns/node-resolver-hc88w" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.824051 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swvcf\" (UniqueName: \"kubernetes.io/projected/2744919e-82fb-4c3c-8776-c2c9c44af6e1-kube-api-access-swvcf\") pod \"node-resolver-hc88w\" (UID: \"2744919e-82fb-4c3c-8776-c2c9c44af6e1\") " pod="openshift-dns/node-resolver-hc88w" Oct 07 08:16:54 crc kubenswrapper[5025]: I1007 08:16:54.955507 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-hc88w" Oct 07 08:16:54 crc kubenswrapper[5025]: W1007 08:16:54.968152 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2744919e_82fb_4c3c_8776_c2c9c44af6e1.slice/crio-e4a19b33418cdde1ccf0900361bc96638be83fd104396d063b67f7bbd0d6d827 WatchSource:0}: Error finding container e4a19b33418cdde1ccf0900361bc96638be83fd104396d063b67f7bbd0d6d827: Status 404 returned error can't find the container with id e4a19b33418cdde1ccf0900361bc96638be83fd104396d063b67f7bbd0d6d827 Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.055850 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-xmhw6"] Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.056231 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-xmhw6" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.059059 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.059223 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.066083 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.066293 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.066403 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.073800 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-l2k8t"] Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.074534 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-l2k8t" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.080801 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.082585 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.096158 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.097156 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-2dj2t"] Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.097652 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.106428 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.106465 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.106699 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.106817 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.107241 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/34b07a69-1bbf-4019-b824-7b5be0f9404d-hostroot\") pod \"multus-xmhw6\" (UID: \"34b07a69-1bbf-4019-b824-7b5be0f9404d\") " pod="openshift-multus/multus-xmhw6" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.107283 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/34b07a69-1bbf-4019-b824-7b5be0f9404d-multus-cni-dir\") pod \"multus-xmhw6\" (UID: \"34b07a69-1bbf-4019-b824-7b5be0f9404d\") " pod="openshift-multus/multus-xmhw6" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.107301 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/34b07a69-1bbf-4019-b824-7b5be0f9404d-host-var-lib-cni-multus\") pod \"multus-xmhw6\" (UID: \"34b07a69-1bbf-4019-b824-7b5be0f9404d\") " 
pod="openshift-multus/multus-xmhw6" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.107322 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/34b07a69-1bbf-4019-b824-7b5be0f9404d-multus-socket-dir-parent\") pod \"multus-xmhw6\" (UID: \"34b07a69-1bbf-4019-b824-7b5be0f9404d\") " pod="openshift-multus/multus-xmhw6" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.107341 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/34b07a69-1bbf-4019-b824-7b5be0f9404d-multus-conf-dir\") pod \"multus-xmhw6\" (UID: \"34b07a69-1bbf-4019-b824-7b5be0f9404d\") " pod="openshift-multus/multus-xmhw6" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.107358 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/34b07a69-1bbf-4019-b824-7b5be0f9404d-multus-daemon-config\") pod \"multus-xmhw6\" (UID: \"34b07a69-1bbf-4019-b824-7b5be0f9404d\") " pod="openshift-multus/multus-xmhw6" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.107379 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx7qx\" (UniqueName: \"kubernetes.io/projected/34b07a69-1bbf-4019-b824-7b5be0f9404d-kube-api-access-gx7qx\") pod \"multus-xmhw6\" (UID: \"34b07a69-1bbf-4019-b824-7b5be0f9404d\") " pod="openshift-multus/multus-xmhw6" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.107506 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/34b07a69-1bbf-4019-b824-7b5be0f9404d-host-var-lib-cni-bin\") pod \"multus-xmhw6\" (UID: \"34b07a69-1bbf-4019-b824-7b5be0f9404d\") " 
pod="openshift-multus/multus-xmhw6" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.107577 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/34b07a69-1bbf-4019-b824-7b5be0f9404d-cnibin\") pod \"multus-xmhw6\" (UID: \"34b07a69-1bbf-4019-b824-7b5be0f9404d\") " pod="openshift-multus/multus-xmhw6" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.107606 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/34b07a69-1bbf-4019-b824-7b5be0f9404d-host-run-netns\") pod \"multus-xmhw6\" (UID: \"34b07a69-1bbf-4019-b824-7b5be0f9404d\") " pod="openshift-multus/multus-xmhw6" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.107623 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/34b07a69-1bbf-4019-b824-7b5be0f9404d-host-run-multus-certs\") pod \"multus-xmhw6\" (UID: \"34b07a69-1bbf-4019-b824-7b5be0f9404d\") " pod="openshift-multus/multus-xmhw6" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.107651 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/34b07a69-1bbf-4019-b824-7b5be0f9404d-system-cni-dir\") pod \"multus-xmhw6\" (UID: \"34b07a69-1bbf-4019-b824-7b5be0f9404d\") " pod="openshift-multus/multus-xmhw6" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.107673 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/34b07a69-1bbf-4019-b824-7b5be0f9404d-cni-binary-copy\") pod \"multus-xmhw6\" (UID: \"34b07a69-1bbf-4019-b824-7b5be0f9404d\") " pod="openshift-multus/multus-xmhw6" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 
08:16:55.107700 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/34b07a69-1bbf-4019-b824-7b5be0f9404d-os-release\") pod \"multus-xmhw6\" (UID: \"34b07a69-1bbf-4019-b824-7b5be0f9404d\") " pod="openshift-multus/multus-xmhw6" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.107721 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/34b07a69-1bbf-4019-b824-7b5be0f9404d-etc-kubernetes\") pod \"multus-xmhw6\" (UID: \"34b07a69-1bbf-4019-b824-7b5be0f9404d\") " pod="openshift-multus/multus-xmhw6" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.107745 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/34b07a69-1bbf-4019-b824-7b5be0f9404d-host-run-k8s-cni-cncf-io\") pod \"multus-xmhw6\" (UID: \"34b07a69-1bbf-4019-b824-7b5be0f9404d\") " pod="openshift-multus/multus-xmhw6" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.107764 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/34b07a69-1bbf-4019-b824-7b5be0f9404d-host-var-lib-kubelet\") pod \"multus-xmhw6\" (UID: \"34b07a69-1bbf-4019-b824-7b5be0f9404d\") " pod="openshift-multus/multus-xmhw6" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.108476 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"473d5cdd405367504396c404f60bbdecda93dbe6edbc8ddbfc18bbcc2fdc4d2b"} Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.108523 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"82f423a0c357e0279ca41d112d6091586ed91d3826f4ae250ec2aa4973554968"} Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.108534 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"3115eb2856f5ca324bff088bf3ef22047af88fd4c1c6a295de72cf4d50545e07"} Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.109751 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.117723 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"d3e04a2eadbde2c6c8df1913f15a1efbebef80b3e34c256f7ff9249479f19eff"} Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.117973 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"474743777dfabf6a941aae4df159b427588678d42a5fc66da1b3c37c97348fa0"} Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.121228 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.127817 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"167a11cb83ad658e06fbfebaa8e1675cec48026b094037f552cb4b86f3fe364e"} Oct 07 08:16:55 crc 
kubenswrapper[5025]: I1007 08:16:55.127986 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.129605 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hc88w" event={"ID":"2744919e-82fb-4c3c-8776-c2c9c44af6e1","Type":"ContainerStarted","Data":"e4a19b33418cdde1ccf0900361bc96638be83fd104396d063b67f7bbd0d6d827"} Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.130572 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"b5b71f879e165d57f1fe861b706820355c3bdb6053a9eaec96c31b7db793748e"} Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.142964 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmhw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34b07a69-1bbf-4019-b824-7b5be0f9404d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx7qx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.162291 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90a24131-77fe-4045-9cea-eebd7e122243\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efea2132bc68971348829ebb4222793ae3ccf57d5eeee316aa81292cbb051594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1c01fb63c054d4e9e2b12ab9d335d5c6a79c3c62c96bcf3bca6a5f9a4b750\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://71ce5e9618132944289f6a22c7aeaee88704f6352fe3461c55de21681f406662\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b579d694e2e51fb1d49b8d690d572518d7fcd3476d7e6941323b70987b8edfa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b579d694e2e51fb1d49b8d690d572518d7fcd3476d7e6941323b70987b8edfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 08:16:47.574279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 08:16:47.577103 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2852886688/tls.crt::/tmp/serving-cert-2852886688/tls.key\\\\\\\"\\\\nI1007 08:16:53.183291 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 08:16:53.189611 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 08:16:53.189643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 08:16:53.189670 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 08:16:53.189676 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 08:16:53.204072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 08:16:53.204130 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204143 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204157 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 08:16:53.204166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 08:16:53.204173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 08:16:53.204182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 08:16:53.204715 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 08:16:53.207373 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52afc5d765ed4544c55e6e522b190cb1c6788fc270ef55c8242f42b63c4d2702\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.175282 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.196325 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.208158 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/34b07a69-1bbf-4019-b824-7b5be0f9404d-etc-kubernetes\") pod \"multus-xmhw6\" (UID: \"34b07a69-1bbf-4019-b824-7b5be0f9404d\") " pod="openshift-multus/multus-xmhw6" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.208210 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/34b07a69-1bbf-4019-b824-7b5be0f9404d-host-var-lib-kubelet\") pod \"multus-xmhw6\" (UID: \"34b07a69-1bbf-4019-b824-7b5be0f9404d\") " pod="openshift-multus/multus-xmhw6" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.208234 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4b0ce981-98f4-4e27-9f4d-f9905b78ec9c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-l2k8t\" (UID: \"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\") " pod="openshift-multus/multus-additional-cni-plugins-l2k8t" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.208257 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/34b07a69-1bbf-4019-b824-7b5be0f9404d-hostroot\") pod \"multus-xmhw6\" (UID: \"34b07a69-1bbf-4019-b824-7b5be0f9404d\") " pod="openshift-multus/multus-xmhw6" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.208286 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4b0ce981-98f4-4e27-9f4d-f9905b78ec9c-system-cni-dir\") pod \"multus-additional-cni-plugins-l2k8t\" (UID: \"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\") " pod="openshift-multus/multus-additional-cni-plugins-l2k8t" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.208213 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.208343 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/34b07a69-1bbf-4019-b824-7b5be0f9404d-etc-kubernetes\") pod \"multus-xmhw6\" (UID: \"34b07a69-1bbf-4019-b824-7b5be0f9404d\") " pod="openshift-multus/multus-xmhw6" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.208311 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/b4849c41-22e1-400e-8e11-096da49ef1b2-rootfs\") pod \"machine-config-daemon-2dj2t\" (UID: \"b4849c41-22e1-400e-8e11-096da49ef1b2\") " pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.208373 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/34b07a69-1bbf-4019-b824-7b5be0f9404d-hostroot\") pod \"multus-xmhw6\" (UID: \"34b07a69-1bbf-4019-b824-7b5be0f9404d\") " pod="openshift-multus/multus-xmhw6" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.208399 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/34b07a69-1bbf-4019-b824-7b5be0f9404d-host-var-lib-kubelet\") pod \"multus-xmhw6\" (UID: \"34b07a69-1bbf-4019-b824-7b5be0f9404d\") " pod="openshift-multus/multus-xmhw6" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.208462 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gx7qx\" (UniqueName: \"kubernetes.io/projected/34b07a69-1bbf-4019-b824-7b5be0f9404d-kube-api-access-gx7qx\") pod \"multus-xmhw6\" (UID: \"34b07a69-1bbf-4019-b824-7b5be0f9404d\") " pod="openshift-multus/multus-xmhw6" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.208495 5025 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pbth\" (UniqueName: \"kubernetes.io/projected/b4849c41-22e1-400e-8e11-096da49ef1b2-kube-api-access-8pbth\") pod \"machine-config-daemon-2dj2t\" (UID: \"b4849c41-22e1-400e-8e11-096da49ef1b2\") " pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.208520 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/34b07a69-1bbf-4019-b824-7b5be0f9404d-multus-conf-dir\") pod \"multus-xmhw6\" (UID: \"34b07a69-1bbf-4019-b824-7b5be0f9404d\") " pod="openshift-multus/multus-xmhw6" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.208542 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/34b07a69-1bbf-4019-b824-7b5be0f9404d-multus-daemon-config\") pod \"multus-xmhw6\" (UID: \"34b07a69-1bbf-4019-b824-7b5be0f9404d\") " pod="openshift-multus/multus-xmhw6" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.208573 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/34b07a69-1bbf-4019-b824-7b5be0f9404d-host-var-lib-cni-bin\") pod \"multus-xmhw6\" (UID: \"34b07a69-1bbf-4019-b824-7b5be0f9404d\") " pod="openshift-multus/multus-xmhw6" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.208593 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnm4d\" (UniqueName: \"kubernetes.io/projected/4b0ce981-98f4-4e27-9f4d-f9905b78ec9c-kube-api-access-gnm4d\") pod \"multus-additional-cni-plugins-l2k8t\" (UID: \"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\") " pod="openshift-multus/multus-additional-cni-plugins-l2k8t" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.208605 5025 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/34b07a69-1bbf-4019-b824-7b5be0f9404d-multus-conf-dir\") pod \"multus-xmhw6\" (UID: \"34b07a69-1bbf-4019-b824-7b5be0f9404d\") " pod="openshift-multus/multus-xmhw6" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.208615 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/34b07a69-1bbf-4019-b824-7b5be0f9404d-host-run-multus-certs\") pod \"multus-xmhw6\" (UID: \"34b07a69-1bbf-4019-b824-7b5be0f9404d\") " pod="openshift-multus/multus-xmhw6" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.208649 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/34b07a69-1bbf-4019-b824-7b5be0f9404d-system-cni-dir\") pod \"multus-xmhw6\" (UID: \"34b07a69-1bbf-4019-b824-7b5be0f9404d\") " pod="openshift-multus/multus-xmhw6" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.208651 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/34b07a69-1bbf-4019-b824-7b5be0f9404d-host-var-lib-cni-bin\") pod \"multus-xmhw6\" (UID: \"34b07a69-1bbf-4019-b824-7b5be0f9404d\") " pod="openshift-multus/multus-xmhw6" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.208666 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/34b07a69-1bbf-4019-b824-7b5be0f9404d-cni-binary-copy\") pod \"multus-xmhw6\" (UID: \"34b07a69-1bbf-4019-b824-7b5be0f9404d\") " pod="openshift-multus/multus-xmhw6" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.208789 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/34b07a69-1bbf-4019-b824-7b5be0f9404d-os-release\") pod \"multus-xmhw6\" (UID: \"34b07a69-1bbf-4019-b824-7b5be0f9404d\") " pod="openshift-multus/multus-xmhw6" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.208805 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/34b07a69-1bbf-4019-b824-7b5be0f9404d-host-run-multus-certs\") pod \"multus-xmhw6\" (UID: \"34b07a69-1bbf-4019-b824-7b5be0f9404d\") " pod="openshift-multus/multus-xmhw6" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.208810 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b4849c41-22e1-400e-8e11-096da49ef1b2-mcd-auth-proxy-config\") pod \"machine-config-daemon-2dj2t\" (UID: \"b4849c41-22e1-400e-8e11-096da49ef1b2\") " pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.208869 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/34b07a69-1bbf-4019-b824-7b5be0f9404d-host-run-k8s-cni-cncf-io\") pod \"multus-xmhw6\" (UID: \"34b07a69-1bbf-4019-b824-7b5be0f9404d\") " pod="openshift-multus/multus-xmhw6" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.208917 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/34b07a69-1bbf-4019-b824-7b5be0f9404d-host-run-k8s-cni-cncf-io\") pod \"multus-xmhw6\" (UID: \"34b07a69-1bbf-4019-b824-7b5be0f9404d\") " pod="openshift-multus/multus-xmhw6" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.208940 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/34b07a69-1bbf-4019-b824-7b5be0f9404d-host-var-lib-cni-multus\") pod \"multus-xmhw6\" (UID: \"34b07a69-1bbf-4019-b824-7b5be0f9404d\") " pod="openshift-multus/multus-xmhw6" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.208961 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4b0ce981-98f4-4e27-9f4d-f9905b78ec9c-os-release\") pod \"multus-additional-cni-plugins-l2k8t\" (UID: \"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\") " pod="openshift-multus/multus-additional-cni-plugins-l2k8t" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.208999 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/34b07a69-1bbf-4019-b824-7b5be0f9404d-host-var-lib-cni-multus\") pod \"multus-xmhw6\" (UID: \"34b07a69-1bbf-4019-b824-7b5be0f9404d\") " pod="openshift-multus/multus-xmhw6" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.209045 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/34b07a69-1bbf-4019-b824-7b5be0f9404d-multus-cni-dir\") pod \"multus-xmhw6\" (UID: \"34b07a69-1bbf-4019-b824-7b5be0f9404d\") " pod="openshift-multus/multus-xmhw6" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.209072 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/34b07a69-1bbf-4019-b824-7b5be0f9404d-system-cni-dir\") pod \"multus-xmhw6\" (UID: \"34b07a69-1bbf-4019-b824-7b5be0f9404d\") " pod="openshift-multus/multus-xmhw6" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.209084 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/34b07a69-1bbf-4019-b824-7b5be0f9404d-os-release\") pod \"multus-xmhw6\" (UID: 
\"34b07a69-1bbf-4019-b824-7b5be0f9404d\") " pod="openshift-multus/multus-xmhw6" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.209075 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/34b07a69-1bbf-4019-b824-7b5be0f9404d-multus-socket-dir-parent\") pod \"multus-xmhw6\" (UID: \"34b07a69-1bbf-4019-b824-7b5be0f9404d\") " pod="openshift-multus/multus-xmhw6" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.209110 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/34b07a69-1bbf-4019-b824-7b5be0f9404d-multus-socket-dir-parent\") pod \"multus-xmhw6\" (UID: \"34b07a69-1bbf-4019-b824-7b5be0f9404d\") " pod="openshift-multus/multus-xmhw6" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.209201 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4b0ce981-98f4-4e27-9f4d-f9905b78ec9c-cni-binary-copy\") pod \"multus-additional-cni-plugins-l2k8t\" (UID: \"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\") " pod="openshift-multus/multus-additional-cni-plugins-l2k8t" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.209227 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4b0ce981-98f4-4e27-9f4d-f9905b78ec9c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-l2k8t\" (UID: \"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\") " pod="openshift-multus/multus-additional-cni-plugins-l2k8t" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.209304 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/34b07a69-1bbf-4019-b824-7b5be0f9404d-multus-daemon-config\") pod \"multus-xmhw6\" (UID: 
\"34b07a69-1bbf-4019-b824-7b5be0f9404d\") " pod="openshift-multus/multus-xmhw6" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.209356 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4849c41-22e1-400e-8e11-096da49ef1b2-proxy-tls\") pod \"machine-config-daemon-2dj2t\" (UID: \"b4849c41-22e1-400e-8e11-096da49ef1b2\") " pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.209359 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/34b07a69-1bbf-4019-b824-7b5be0f9404d-multus-cni-dir\") pod \"multus-xmhw6\" (UID: \"34b07a69-1bbf-4019-b824-7b5be0f9404d\") " pod="openshift-multus/multus-xmhw6" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.209450 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/34b07a69-1bbf-4019-b824-7b5be0f9404d-cnibin\") pod \"multus-xmhw6\" (UID: \"34b07a69-1bbf-4019-b824-7b5be0f9404d\") " pod="openshift-multus/multus-xmhw6" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.209483 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/34b07a69-1bbf-4019-b824-7b5be0f9404d-host-run-netns\") pod \"multus-xmhw6\" (UID: \"34b07a69-1bbf-4019-b824-7b5be0f9404d\") " pod="openshift-multus/multus-xmhw6" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.209524 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/34b07a69-1bbf-4019-b824-7b5be0f9404d-cnibin\") pod \"multus-xmhw6\" (UID: \"34b07a69-1bbf-4019-b824-7b5be0f9404d\") " pod="openshift-multus/multus-xmhw6" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.209579 5025 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4b0ce981-98f4-4e27-9f4d-f9905b78ec9c-cnibin\") pod \"multus-additional-cni-plugins-l2k8t\" (UID: \"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\") " pod="openshift-multus/multus-additional-cni-plugins-l2k8t" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.209626 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/34b07a69-1bbf-4019-b824-7b5be0f9404d-cni-binary-copy\") pod \"multus-xmhw6\" (UID: \"34b07a69-1bbf-4019-b824-7b5be0f9404d\") " pod="openshift-multus/multus-xmhw6" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.209664 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/34b07a69-1bbf-4019-b824-7b5be0f9404d-host-run-netns\") pod \"multus-xmhw6\" (UID: \"34b07a69-1bbf-4019-b824-7b5be0f9404d\") " pod="openshift-multus/multus-xmhw6" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.221240 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.226334 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gx7qx\" (UniqueName: \"kubernetes.io/projected/34b07a69-1bbf-4019-b824-7b5be0f9404d-kube-api-access-gx7qx\") pod \"multus-xmhw6\" (UID: \"34b07a69-1bbf-4019-b824-7b5be0f9404d\") " pod="openshift-multus/multus-xmhw6" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.232378 5025 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-hc88w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2744919e-82fb-4c3c-8776-c2c9c44af6e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:54Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swvcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hc88w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.243030 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.255960 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90a24131-77fe-4045-9cea-eebd7e122243\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efea2132bc68971348829ebb4222793ae3ccf57d5eeee316aa81292cbb051594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1c01fb63c054d4e9e2b12ab9d335d5c6a79c3c62c96bcf3bca6a5f9a4b750\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ce5e9618132944289f6a22c7aeaee88704f6352fe3461c55de21681f406662\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://167a11cb83ad658e06fbfebaa8e1675cec48026b094037f552cb4b86f3fe364e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b579d694e2e51fb1d49b8d690d572518d7fcd3476d7e6941323b70987b8edfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 08:16:47.574279 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 08:16:47.577103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2852886688/tls.crt::/tmp/serving-cert-2852886688/tls.key\\\\\\\"\\\\nI1007 08:16:53.183291 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 08:16:53.189611 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 08:16:53.189643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 08:16:53.189670 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 08:16:53.189676 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 08:16:53.204072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 08:16:53.204130 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204143 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204157 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 08:16:53.204166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 08:16:53.204173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 08:16:53.204182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 08:16:53.204715 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 08:16:53.207373 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52afc5d765ed4544c55e6e522b190cb1c6788fc270ef55c8242f42b63c4d2702\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.266796 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.281355 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l2k8t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l2k8t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:55Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.296445 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e04a2eadbde2c6c8df1913f15a1efbebef80b3e34c256f7ff9249479f19eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:55Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.309094 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:55Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.310657 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4b0ce981-98f4-4e27-9f4d-f9905b78ec9c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-l2k8t\" (UID: \"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\") " pod="openshift-multus/multus-additional-cni-plugins-l2k8t" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.310685 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4b0ce981-98f4-4e27-9f4d-f9905b78ec9c-system-cni-dir\") pod \"multus-additional-cni-plugins-l2k8t\" (UID: \"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\") " pod="openshift-multus/multus-additional-cni-plugins-l2k8t" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.310712 5025 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/b4849c41-22e1-400e-8e11-096da49ef1b2-rootfs\") pod \"machine-config-daemon-2dj2t\" (UID: \"b4849c41-22e1-400e-8e11-096da49ef1b2\") " pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.310729 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pbth\" (UniqueName: \"kubernetes.io/projected/b4849c41-22e1-400e-8e11-096da49ef1b2-kube-api-access-8pbth\") pod \"machine-config-daemon-2dj2t\" (UID: \"b4849c41-22e1-400e-8e11-096da49ef1b2\") " pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.310745 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnm4d\" (UniqueName: \"kubernetes.io/projected/4b0ce981-98f4-4e27-9f4d-f9905b78ec9c-kube-api-access-gnm4d\") pod \"multus-additional-cni-plugins-l2k8t\" (UID: \"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\") " pod="openshift-multus/multus-additional-cni-plugins-l2k8t" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.310763 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b4849c41-22e1-400e-8e11-096da49ef1b2-mcd-auth-proxy-config\") pod \"machine-config-daemon-2dj2t\" (UID: \"b4849c41-22e1-400e-8e11-096da49ef1b2\") " pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.310787 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4b0ce981-98f4-4e27-9f4d-f9905b78ec9c-os-release\") pod \"multus-additional-cni-plugins-l2k8t\" (UID: \"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\") " 
pod="openshift-multus/multus-additional-cni-plugins-l2k8t" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.310804 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4b0ce981-98f4-4e27-9f4d-f9905b78ec9c-cni-binary-copy\") pod \"multus-additional-cni-plugins-l2k8t\" (UID: \"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\") " pod="openshift-multus/multus-additional-cni-plugins-l2k8t" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.310820 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4b0ce981-98f4-4e27-9f4d-f9905b78ec9c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-l2k8t\" (UID: \"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\") " pod="openshift-multus/multus-additional-cni-plugins-l2k8t" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.310849 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4849c41-22e1-400e-8e11-096da49ef1b2-proxy-tls\") pod \"machine-config-daemon-2dj2t\" (UID: \"b4849c41-22e1-400e-8e11-096da49ef1b2\") " pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.310871 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4b0ce981-98f4-4e27-9f4d-f9905b78ec9c-cnibin\") pod \"multus-additional-cni-plugins-l2k8t\" (UID: \"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\") " pod="openshift-multus/multus-additional-cni-plugins-l2k8t" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.310929 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4b0ce981-98f4-4e27-9f4d-f9905b78ec9c-cnibin\") pod \"multus-additional-cni-plugins-l2k8t\" (UID: 
\"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\") " pod="openshift-multus/multus-additional-cni-plugins-l2k8t" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.311241 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4b0ce981-98f4-4e27-9f4d-f9905b78ec9c-system-cni-dir\") pod \"multus-additional-cni-plugins-l2k8t\" (UID: \"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\") " pod="openshift-multus/multus-additional-cni-plugins-l2k8t" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.311242 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4b0ce981-98f4-4e27-9f4d-f9905b78ec9c-os-release\") pod \"multus-additional-cni-plugins-l2k8t\" (UID: \"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\") " pod="openshift-multus/multus-additional-cni-plugins-l2k8t" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.311646 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b4849c41-22e1-400e-8e11-096da49ef1b2-mcd-auth-proxy-config\") pod \"machine-config-daemon-2dj2t\" (UID: \"b4849c41-22e1-400e-8e11-096da49ef1b2\") " pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.312040 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4b0ce981-98f4-4e27-9f4d-f9905b78ec9c-cni-binary-copy\") pod \"multus-additional-cni-plugins-l2k8t\" (UID: \"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\") " pod="openshift-multus/multus-additional-cni-plugins-l2k8t" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.312153 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4b0ce981-98f4-4e27-9f4d-f9905b78ec9c-tuning-conf-dir\") pod 
\"multus-additional-cni-plugins-l2k8t\" (UID: \"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\") " pod="openshift-multus/multus-additional-cni-plugins-l2k8t" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.312358 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/b4849c41-22e1-400e-8e11-096da49ef1b2-rootfs\") pod \"machine-config-daemon-2dj2t\" (UID: \"b4849c41-22e1-400e-8e11-096da49ef1b2\") " pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.313122 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4b0ce981-98f4-4e27-9f4d-f9905b78ec9c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-l2k8t\" (UID: \"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\") " pod="openshift-multus/multus-additional-cni-plugins-l2k8t" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.316039 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4849c41-22e1-400e-8e11-096da49ef1b2-proxy-tls\") pod \"machine-config-daemon-2dj2t\" (UID: \"b4849c41-22e1-400e-8e11-096da49ef1b2\") " pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.322815 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hc88w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2744919e-82fb-4c3c-8776-c2c9c44af6e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:54Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swvcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hc88w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:55Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.331923 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pbth\" (UniqueName: \"kubernetes.io/projected/b4849c41-22e1-400e-8e11-096da49ef1b2-kube-api-access-8pbth\") pod \"machine-config-daemon-2dj2t\" (UID: \"b4849c41-22e1-400e-8e11-096da49ef1b2\") " pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.333965 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnm4d\" (UniqueName: 
\"kubernetes.io/projected/4b0ce981-98f4-4e27-9f4d-f9905b78ec9c-kube-api-access-gnm4d\") pod \"multus-additional-cni-plugins-l2k8t\" (UID: \"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\") " pod="openshift-multus/multus-additional-cni-plugins-l2k8t" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.338006 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4849c41-22e1-400e-8e11-096da49ef1b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2dj2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:55Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.353771 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:55Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.368074 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:55Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.370259 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-xmhw6" Oct 07 08:16:55 crc kubenswrapper[5025]: W1007 08:16:55.381381 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34b07a69_1bbf_4019_b824_7b5be0f9404d.slice/crio-e900459a7a36e1a8253d3b7ed4b953793724605d27ee0bf72c934a6436b3abe1 WatchSource:0}: Error finding container e900459a7a36e1a8253d3b7ed4b953793724605d27ee0bf72c934a6436b3abe1: Status 404 returned error can't find the container with id e900459a7a36e1a8253d3b7ed4b953793724605d27ee0bf72c934a6436b3abe1 Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.384127 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmhw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34b07a69-1bbf-4019-b824-7b5be0f9404d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx7qx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:55Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.387842 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-l2k8t" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.397574 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://473d5cdd405367504396c404f60bbdecda93dbe6edbc8ddbfc18bbcc2fdc4d2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f423a0c357e0279ca41d112d6091586ed91d3826f4ae250ec2aa4973554968\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:55Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:55 crc kubenswrapper[5025]: W1007 08:16:55.406226 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b0ce981_98f4_4e27_9f4d_f9905b78ec9c.slice/crio-f842ec25f3f69857b96e240cf51ec8197ef3443a5b2f3461e5726379460bfd43 WatchSource:0}: Error finding container f842ec25f3f69857b96e240cf51ec8197ef3443a5b2f3461e5726379460bfd43: Status 404 returned error can't find the container with id f842ec25f3f69857b96e240cf51ec8197ef3443a5b2f3461e5726379460bfd43 Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.409785 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.413874 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:16:55 crc kubenswrapper[5025]: E1007 08:16:55.413979 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:16:57.413952972 +0000 UTC m=+24.223267116 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.414031 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.414058 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:16:55 crc kubenswrapper[5025]: E1007 08:16:55.414138 5025 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 08:16:55 crc kubenswrapper[5025]: E1007 08:16:55.414186 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 08:16:57.41417396 +0000 UTC m=+24.223488104 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 08:16:55 crc kubenswrapper[5025]: E1007 08:16:55.414272 5025 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 08:16:55 crc kubenswrapper[5025]: E1007 08:16:55.414362 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 08:16:57.414336626 +0000 UTC m=+24.223650930 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 08:16:55 crc kubenswrapper[5025]: W1007 08:16:55.430560 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4849c41_22e1_400e_8e11_096da49ef1b2.slice/crio-59de2e20d9162973aac52b551f1f0bf1f6b6fd1051677f88d66d1e367f2c6811 WatchSource:0}: Error finding container 59de2e20d9162973aac52b551f1f0bf1f6b6fd1051677f88d66d1e367f2c6811: Status 404 returned error can't find the container with id 59de2e20d9162973aac52b551f1f0bf1f6b6fd1051677f88d66d1e367f2c6811 Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.466110 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-dwm22"] Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.467023 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.469739 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.469903 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.470051 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.470086 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.470986 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.471508 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.476758 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.485788 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hc88w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2744919e-82fb-4c3c-8776-c2c9c44af6e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:54Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swvcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hc88w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:55Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.499778 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4849c41-22e1-400e-8e11-096da49ef1b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2dj2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:55Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.514790 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-systemd-units\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.514849 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-host-slash\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.514889 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.514925 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:16:55 crc kubenswrapper[5025]: E1007 08:16:55.515200 5025 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 08:16:55 crc kubenswrapper[5025]: E1007 08:16:55.515240 5025 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 08:16:55 crc kubenswrapper[5025]: E1007 08:16:55.515254 5025 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.515271 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-var-lib-openvswitch\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: E1007 08:16:55.515273 5025 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 08:16:55 crc kubenswrapper[5025]: E1007 08:16:55.515324 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 08:16:57.515300272 +0000 UTC m=+24.324614416 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 08:16:55 crc kubenswrapper[5025]: E1007 08:16:55.515336 5025 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 08:16:55 crc kubenswrapper[5025]: E1007 08:16:55.515400 5025 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.515411 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-etc-openvswitch\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.515458 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-run-ovn\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: E1007 08:16:55.515502 5025 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 08:16:57.515454447 +0000 UTC m=+24.324768781 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.515579 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-env-overrides\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.515666 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-host-run-netns\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.515704 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-ovn-node-metrics-cert\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.515738 5025 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-host-run-ovn-kubernetes\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.515779 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-host-kubelet\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.515799 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-node-log\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.515818 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.515832 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e04a2eadbde2c6c8df1913f15a1efbebef80b3e34c256f7ff9249479f19eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:55Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.515877 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpm25\" (UniqueName: \"kubernetes.io/projected/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-kube-api-access-zpm25\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.516002 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-host-cni-netd\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.516223 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-run-systemd\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.516343 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-run-openvswitch\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.516426 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-ovnkube-script-lib\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.516483 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-host-cni-bin\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.516508 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-ovnkube-config\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.516571 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-log-socket\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.542231 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:55Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.555444 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:55Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.574945 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:55Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.589933 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmhw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34b07a69-1bbf-4019-b824-7b5be0f9404d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx7qx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:55Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.606361 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://473d5cdd405367504396c404f60bbdecda93dbe6edbc8ddbfc18bbcc2fdc4d2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a
2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f423a0c357e0279ca41d112d6091586ed91d3826f4ae250ec2aa4973554968\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-07T08:16:55Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.617586 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-var-lib-openvswitch\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.617624 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-etc-openvswitch\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.617653 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-host-run-netns\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.617680 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-run-ovn\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.617706 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-env-overrides\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc 
kubenswrapper[5025]: I1007 08:16:55.617775 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-var-lib-openvswitch\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.617816 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-ovn-node-metrics-cert\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.617784 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-run-ovn\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.617825 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-etc-openvswitch\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.617775 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-host-run-netns\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.617890 5025 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-host-kubelet\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.617909 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-host-kubelet\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.617911 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-host-run-ovn-kubernetes\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.617935 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-host-run-ovn-kubernetes\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.617981 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-node-log\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.618001 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.618022 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpm25\" (UniqueName: \"kubernetes.io/projected/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-kube-api-access-zpm25\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.618068 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-node-log\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.618081 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.618106 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-host-cni-netd\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.618134 5025 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-run-systemd\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.618168 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-run-openvswitch\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.618177 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-run-systemd\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.618188 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-host-cni-bin\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.618204 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-run-openvswitch\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.618205 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-ovnkube-config\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.618231 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-ovnkube-script-lib\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.618246 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-host-cni-bin\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.618264 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-log-socket\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.618247 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-host-cni-netd\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.618311 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-systemd-units\") pod \"ovnkube-node-dwm22\" 
(UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.618322 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-log-socket\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.618287 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-systemd-units\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.618383 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-host-slash\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.618476 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-host-slash\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.619126 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-ovnkube-script-lib\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc 
kubenswrapper[5025]: I1007 08:16:55.619189 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-env-overrides\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.619699 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-ovnkube-config\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.626162 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dwm22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:55Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.638827 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90a24131-77fe-4045-9cea-eebd7e122243\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efea2132bc68971348829ebb4222793ae3ccf57d5eeee316aa81292cbb051594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1c01fb63c054d4e9e2b12ab9d335d5c6a79c3c62c96bcf3bca6a5f9a4b750\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://71ce5e9618132944289f6a22c7aeaee88704f6352fe3461c55de21681f406662\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://167a11cb83ad658e06fbfebaa8e1675cec48026b094037f552cb4b86f3fe364e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b579d694e2e51fb1d49b8d690d572518d7fcd3476d7e6941323b70987b8edfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 08:16:47.574279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 08:16:47.577103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2852886688/tls.crt::/tmp/serving-cert-2852886688/tls.key\\\\\\\"\\\\nI1007 08:16:53.183291 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 08:16:53.189611 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 08:16:53.189643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 08:16:53.189670 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 08:16:53.189676 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 08:16:53.204072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 08:16:53.204130 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204143 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204157 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 08:16:53.204166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 08:16:53.204173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 08:16:53.204182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 08:16:53.204715 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 08:16:53.207373 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52afc5d765ed4544c55e6e522b190cb1c6788fc270ef55c8242f42b63c4d2702\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:55Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.653240 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:55Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.664576 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-ovn-node-metrics-cert\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.665333 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpm25\" (UniqueName: 
\"kubernetes.io/projected/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-kube-api-access-zpm25\") pod \"ovnkube-node-dwm22\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.669027 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l2k8t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l2k8t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:55Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.800860 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:16:55 crc kubenswrapper[5025]: W1007 08:16:55.813319 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b6b9c75_ecfe_4815_b279_bb56f57a82a8.slice/crio-15b2374e4b04e9cc206a6399ef07641377372a41d7c5a0d9cec0492d4db986e8 WatchSource:0}: Error finding container 15b2374e4b04e9cc206a6399ef07641377372a41d7c5a0d9cec0492d4db986e8: Status 404 returned error can't find the container with id 15b2374e4b04e9cc206a6399ef07641377372a41d7c5a0d9cec0492d4db986e8 Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.915671 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:16:55 crc kubenswrapper[5025]: E1007 08:16:55.915805 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.915851 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.915876 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:16:55 crc kubenswrapper[5025]: E1007 08:16:55.915966 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 08:16:55 crc kubenswrapper[5025]: E1007 08:16:55.916103 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.917869 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.918372 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.919090 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 07 08:16:55 crc kubenswrapper[5025]: I1007 08:16:55.919754 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 07 08:16:56 crc kubenswrapper[5025]: I1007 08:16:56.134530 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hc88w" event={"ID":"2744919e-82fb-4c3c-8776-c2c9c44af6e1","Type":"ContainerStarted","Data":"afa525d43c3e5aafdbdf20fdac7eb16579649323ad50ae564f0266fb2e69ffd9"} Oct 07 08:16:56 crc kubenswrapper[5025]: I1007 08:16:56.136343 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" event={"ID":"b4849c41-22e1-400e-8e11-096da49ef1b2","Type":"ContainerStarted","Data":"4bc23369548789187b7057cd9780703dbc892dcf8aaaac969f6049a15ab0656e"} Oct 07 08:16:56 crc kubenswrapper[5025]: I1007 08:16:56.136382 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" 
event={"ID":"b4849c41-22e1-400e-8e11-096da49ef1b2","Type":"ContainerStarted","Data":"54b2a654066f14b0433198403ba42b83c61922de1055c94ba1d38aad9e0c2821"} Oct 07 08:16:56 crc kubenswrapper[5025]: I1007 08:16:56.136397 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" event={"ID":"b4849c41-22e1-400e-8e11-096da49ef1b2","Type":"ContainerStarted","Data":"59de2e20d9162973aac52b551f1f0bf1f6b6fd1051677f88d66d1e367f2c6811"} Oct 07 08:16:56 crc kubenswrapper[5025]: I1007 08:16:56.137913 5025 generic.go:334] "Generic (PLEG): container finished" podID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerID="cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58" exitCode=0 Oct 07 08:16:56 crc kubenswrapper[5025]: I1007 08:16:56.137980 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" event={"ID":"8b6b9c75-ecfe-4815-b279-bb56f57a82a8","Type":"ContainerDied","Data":"cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58"} Oct 07 08:16:56 crc kubenswrapper[5025]: I1007 08:16:56.138011 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" event={"ID":"8b6b9c75-ecfe-4815-b279-bb56f57a82a8","Type":"ContainerStarted","Data":"15b2374e4b04e9cc206a6399ef07641377372a41d7c5a0d9cec0492d4db986e8"} Oct 07 08:16:56 crc kubenswrapper[5025]: I1007 08:16:56.139538 5025 generic.go:334] "Generic (PLEG): container finished" podID="4b0ce981-98f4-4e27-9f4d-f9905b78ec9c" containerID="bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af" exitCode=0 Oct 07 08:16:56 crc kubenswrapper[5025]: I1007 08:16:56.139612 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l2k8t" event={"ID":"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c","Type":"ContainerDied","Data":"bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af"} Oct 07 08:16:56 crc 
kubenswrapper[5025]: I1007 08:16:56.139639 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l2k8t" event={"ID":"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c","Type":"ContainerStarted","Data":"f842ec25f3f69857b96e240cf51ec8197ef3443a5b2f3461e5726379460bfd43"} Oct 07 08:16:56 crc kubenswrapper[5025]: I1007 08:16:56.146740 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xmhw6" event={"ID":"34b07a69-1bbf-4019-b824-7b5be0f9404d","Type":"ContainerStarted","Data":"aee164f46abc9772ed12e1ba5e768ee44e25cc7444a58000075819f45cb4ba11"} Oct 07 08:16:56 crc kubenswrapper[5025]: I1007 08:16:56.146789 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xmhw6" event={"ID":"34b07a69-1bbf-4019-b824-7b5be0f9404d","Type":"ContainerStarted","Data":"e900459a7a36e1a8253d3b7ed4b953793724605d27ee0bf72c934a6436b3abe1"} Oct 07 08:16:56 crc kubenswrapper[5025]: I1007 08:16:56.156139 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://473d5cdd405367504396c404f60bbdecda93dbe6edbc8ddbfc18bbcc2fdc4d2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f423a0c357e0279ca41d112d6091586ed91d3826f4ae250ec2aa4973554968\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:56Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:56 crc kubenswrapper[5025]: I1007 08:16:56.190701 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dwm22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:56Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:56 crc kubenswrapper[5025]: I1007 08:16:56.209671 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90a24131-77fe-4045-9cea-eebd7e122243\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efea2132bc68971348829ebb4222793ae3ccf57d5eeee316aa81292cbb051594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1c01fb63c054d4e9e2b12ab9d335d5c6a79c3c62c96bcf3bca6a5f9a4b750\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://71ce5e9618132944289f6a22c7aeaee88704f6352fe3461c55de21681f406662\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://167a11cb83ad658e06fbfebaa8e1675cec48026b094037f552cb4b86f3fe364e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b579d694e2e51fb1d49b8d690d572518d7fcd3476d7e6941323b70987b8edfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 08:16:47.574279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 08:16:47.577103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2852886688/tls.crt::/tmp/serving-cert-2852886688/tls.key\\\\\\\"\\\\nI1007 08:16:53.183291 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 08:16:53.189611 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 08:16:53.189643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 08:16:53.189670 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 08:16:53.189676 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 08:16:53.204072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 08:16:53.204130 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204143 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204157 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 08:16:53.204166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 08:16:53.204173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 08:16:53.204182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 08:16:53.204715 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 08:16:53.207373 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52afc5d765ed4544c55e6e522b190cb1c6788fc270ef55c8242f42b63c4d2702\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:56Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:56 crc kubenswrapper[5025]: I1007 08:16:56.226923 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:56Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:56 crc kubenswrapper[5025]: I1007 08:16:56.246630 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l2k8t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l2k8t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:56Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:56 crc kubenswrapper[5025]: I1007 08:16:56.258675 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hc88w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2744919e-82fb-4c3c-8776-c2c9c44af6e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afa525d43c3e5aafdbdf20fdac7eb16579649323ad50ae564f0266fb2e69ffd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac3
9aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swvcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hc88w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:56Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:56 crc kubenswrapper[5025]: I1007 08:16:56.277379 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4849c41-22e1-400e-8e11-096da49ef1b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2dj2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:56Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:56 crc kubenswrapper[5025]: I1007 08:16:56.292572 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e04a2eadbde2c6c8df1913f15a1efbebef80b3e34c256f7ff9249479f19eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:56Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:56 crc kubenswrapper[5025]: I1007 08:16:56.310082 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:56Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:56 crc kubenswrapper[5025]: I1007 08:16:56.321333 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:56Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:56 crc kubenswrapper[5025]: I1007 08:16:56.335764 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:56Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:56 crc kubenswrapper[5025]: I1007 08:16:56.351905 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmhw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34b07a69-1bbf-4019-b824-7b5be0f9404d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx7qx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:56Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:56 crc kubenswrapper[5025]: I1007 08:16:56.366513 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:56Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:56 crc kubenswrapper[5025]: I1007 08:16:56.383305 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:56Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:56 crc kubenswrapper[5025]: I1007 08:16:56.400806 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmhw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34b07a69-1bbf-4019-b824-7b5be0f9404d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aee164f46abc9772ed12e1ba5e768ee44e25cc7444a58000075819f45cb4ba11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx7qx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:56Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:56 crc kubenswrapper[5025]: I1007 08:16:56.416025 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://473d5cdd405367504396c404f60bbdecda93dbe6edbc8ddbfc18bbcc2fdc4d2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f423a0c357e0279ca41d112d6091586ed91d3826f4ae250ec2aa4973554968\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:56Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:56 crc kubenswrapper[5025]: I1007 08:16:56.434982 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dwm22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:56Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:56 crc kubenswrapper[5025]: I1007 08:16:56.455846 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90a24131-77fe-4045-9cea-eebd7e122243\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efea2132bc68971348829ebb4222793ae3ccf57d5eeee316aa81292cbb051594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1c01fb63c054d4e9e2b12ab9d335d5c6a79c3c62c96bcf3bca6a5f9a4b750\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://71ce5e9618132944289f6a22c7aeaee88704f6352fe3461c55de21681f406662\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://167a11cb83ad658e06fbfebaa8e1675cec48026b094037f552cb4b86f3fe364e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b579d694e2e51fb1d49b8d690d572518d7fcd3476d7e6941323b70987b8edfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 08:16:47.574279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 08:16:47.577103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2852886688/tls.crt::/tmp/serving-cert-2852886688/tls.key\\\\\\\"\\\\nI1007 08:16:53.183291 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 08:16:53.189611 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 08:16:53.189643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 08:16:53.189670 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 08:16:53.189676 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 08:16:53.204072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 08:16:53.204130 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204143 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204157 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 08:16:53.204166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 08:16:53.204173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 08:16:53.204182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 08:16:53.204715 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 08:16:53.207373 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52afc5d765ed4544c55e6e522b190cb1c6788fc270ef55c8242f42b63c4d2702\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:56Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:56 crc kubenswrapper[5025]: I1007 08:16:56.482331 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:56Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:56 crc kubenswrapper[5025]: I1007 08:16:56.498804 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l2k8t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l2k8t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:56Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:56 crc kubenswrapper[5025]: I1007 08:16:56.508702 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hc88w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2744919e-82fb-4c3c-8776-c2c9c44af6e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afa525d43c3e5aafdbdf20fdac7eb16579649323ad50ae564f0266fb2e69ffd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swvcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hc88w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:56Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:56 crc kubenswrapper[5025]: I1007 08:16:56.521327 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4849c41-22e1-400e-8e11-096da49ef1b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc23369548789187b7057cd9780703dbc892dcf8aaaac969f6049a15ab0656e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54b2a654066f14b0433198403ba42b83c61922de1055c94ba1d38aad9e0c2821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2dj2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:56Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:56 crc kubenswrapper[5025]: I1007 08:16:56.536075 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e04a2eadbde2c6c8df1913f15a1efbebef80b3e34c256f7ff9249479f19eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",
\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:56Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:56 crc kubenswrapper[5025]: I1007 08:16:56.549786 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:56Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.154942 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" event={"ID":"8b6b9c75-ecfe-4815-b279-bb56f57a82a8","Type":"ContainerStarted","Data":"aca416ed3fc89df9948a2f7b02101dfb3bae8d8b22b7c4d270559a4cb6895dc4"} Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.155347 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" event={"ID":"8b6b9c75-ecfe-4815-b279-bb56f57a82a8","Type":"ContainerStarted","Data":"4aca8d824b0935ba844137095ac849a38a15dc5c0dc808fae28afbff543f6429"} Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.155361 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" 
event={"ID":"8b6b9c75-ecfe-4815-b279-bb56f57a82a8","Type":"ContainerStarted","Data":"d54834b1ba5650aa397affa83e56e871bbbea4b8c383627ae00bea648595864f"} Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.155371 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" event={"ID":"8b6b9c75-ecfe-4815-b279-bb56f57a82a8","Type":"ContainerStarted","Data":"602fe6537fb9b526977c2d6ae30458382313c2b5f1506e6b7ffe684ffe5be2cf"} Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.155381 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" event={"ID":"8b6b9c75-ecfe-4815-b279-bb56f57a82a8","Type":"ContainerStarted","Data":"e826406584133efe2a695c4c7ff16dc6919a1ba0cda077273f6e1eafbf0f5c3f"} Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.155392 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" event={"ID":"8b6b9c75-ecfe-4815-b279-bb56f57a82a8","Type":"ContainerStarted","Data":"81fc1872da168050a2483e46a2ce55f2f745eca15c901a77783a830609f6c0fe"} Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.157204 5025 generic.go:334] "Generic (PLEG): container finished" podID="4b0ce981-98f4-4e27-9f4d-f9905b78ec9c" containerID="147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c" exitCode=0 Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.157267 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l2k8t" event={"ID":"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c","Type":"ContainerDied","Data":"147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c"} Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.162148 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" 
event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"61571dd78435a0514b8674028c7c467678e6ee4d1c6b334a98c6d1ddbf4871d5"} Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.179236 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:57Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.196157 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hc88w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2744919e-82fb-4c3c-8776-c2c9c44af6e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afa525d43c3e5aafdbdf20fdac7eb16579649323ad50ae564f0266fb2e69ffd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swvcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hc88w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:57Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.210330 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4849c41-22e1-400e-8e11-096da49ef1b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc23369548789187b7057cd9780703dbc892dcf8aaaac969f6049a15ab0656e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54b2a654066f14b0433198403ba42b83c61922de1055c94ba1d38aad9e0c2821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2dj2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:57Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.232637 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e04a2eadbde2c6c8df1913f15a1efbebef80b3e34c256f7ff9249479f19eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",
\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:57Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.250151 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmhw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34b07a69-1bbf-4019-b824-7b5be0f9404d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aee164f46abc9772ed12e1ba5e768ee44e25cc7444a58000075819f45cb4ba11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b
26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx7qx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"pod
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:57Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.267035 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:57Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.283499 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:57Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.298097 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://473d5cdd405367504396c404f60bbdecda93dbe6edbc8ddbfc18bbcc2fdc4d2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f423a0c357e0279ca41d112d6091586ed91d3826f4ae250ec2aa4973554968\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:57Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.316252 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dwm22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:57Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.329242 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90a24131-77fe-4045-9cea-eebd7e122243\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efea2132bc68971348829ebb4222793ae3ccf57d5eeee316aa81292cbb051594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1c01fb63c054d4e9e2b12ab9d335d5c6a79c3c62c96bcf3bca6a5f9a4b750\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://71ce5e9618132944289f6a22c7aeaee88704f6352fe3461c55de21681f406662\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://167a11cb83ad658e06fbfebaa8e1675cec48026b094037f552cb4b86f3fe364e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b579d694e2e51fb1d49b8d690d572518d7fcd3476d7e6941323b70987b8edfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 08:16:47.574279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 08:16:47.577103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2852886688/tls.crt::/tmp/serving-cert-2852886688/tls.key\\\\\\\"\\\\nI1007 08:16:53.183291 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 08:16:53.189611 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 08:16:53.189643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 08:16:53.189670 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 08:16:53.189676 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 08:16:53.204072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 08:16:53.204130 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204143 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204157 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 08:16:53.204166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 08:16:53.204173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 08:16:53.204182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 08:16:53.204715 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 08:16:53.207373 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52afc5d765ed4544c55e6e522b190cb1c6788fc270ef55c8242f42b63c4d2702\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:57Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.342873 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:57Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.346535 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-jf557"] Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.347024 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-jf557" Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.348877 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.349123 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.349476 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.349993 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.361081 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l2k8t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l2k8t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:57Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.375502 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmhw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34b07a69-1bbf-4019-b824-7b5be0f9404d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aee164f46abc9772ed12e1ba5e768ee44e25cc7444a58000075819f45cb4ba11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx7qx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:57Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.387875 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:57Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.399863 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:57Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.411320 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://473d5cdd405367504396c404f60bbdecda93dbe6edbc8ddbfc18bbcc2fdc4d2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f423a0c357e0279ca41d112d6091586ed91d3826f4ae250ec2aa4973554968\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:57Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.430427 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dwm22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:57Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.442754 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.442853 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.442880 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ea42823f-6b19-439f-a280-80e5b1a816c7-host\") pod \"node-ca-jf557\" (UID: \"ea42823f-6b19-439f-a280-80e5b1a816c7\") " pod="openshift-image-registry/node-ca-jf557" Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.442921 5025 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.442945 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqrdl\" (UniqueName: \"kubernetes.io/projected/ea42823f-6b19-439f-a280-80e5b1a816c7-kube-api-access-hqrdl\") pod \"node-ca-jf557\" (UID: \"ea42823f-6b19-439f-a280-80e5b1a816c7\") " pod="openshift-image-registry/node-ca-jf557" Oct 07 08:16:57 crc kubenswrapper[5025]: E1007 08:16:57.442977 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:17:01.442948925 +0000 UTC m=+28.252263069 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.443024 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ea42823f-6b19-439f-a280-80e5b1a816c7-serviceca\") pod \"node-ca-jf557\" (UID: \"ea42823f-6b19-439f-a280-80e5b1a816c7\") " pod="openshift-image-registry/node-ca-jf557" Oct 07 08:16:57 crc kubenswrapper[5025]: E1007 08:16:57.443029 5025 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 08:16:57 crc kubenswrapper[5025]: E1007 08:16:57.443079 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 08:17:01.443064259 +0000 UTC m=+28.252378403 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.443102 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90a24131-77fe-4045-9cea-eebd7e122243\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efea2132bc68971348829ebb4222793ae3ccf57d5eeee316aa81292cbb051594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1c01fb63c054d4e9e2b12ab9d335d5c6a79c3c62c96bcf3bca6a5f9a4b750\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://71ce5e9618132944289f6a22c7aeaee88704f6352fe3461c55de21681f406662\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://167a11cb83ad658e06fbfebaa8e1675cec48026b094037f552cb4b86f3fe364e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b579d694e2e51fb1d49b8d690d572518d7fcd3476d7e6941323b70987b8edfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 08:16:47.574279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 08:16:47.577103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2852886688/tls.crt::/tmp/serving-cert-2852886688/tls.key\\\\\\\"\\\\nI1007 08:16:53.183291 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 08:16:53.189611 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 08:16:53.189643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 08:16:53.189670 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 08:16:53.189676 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 08:16:53.204072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 08:16:53.204130 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204143 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204157 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 08:16:53.204166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 08:16:53.204173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 08:16:53.204182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 08:16:53.204715 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 08:16:53.207373 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52afc5d765ed4544c55e6e522b190cb1c6788fc270ef55c8242f42b63c4d2702\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:57Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:57 crc kubenswrapper[5025]: E1007 08:16:57.443177 5025 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 08:16:57 crc kubenswrapper[5025]: E1007 08:16:57.443298 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 08:17:01.443288006 +0000 UTC m=+28.252602350 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.457467 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:57Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.471497 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l2k8t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l2k8t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-07T08:16:57Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.488617 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61571dd78435a0514b8674028c7c467678e6ee4d1c6b334a98c6d1ddbf4871d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:57Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.500727 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hc88w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2744919e-82fb-4c3c-8776-c2c9c44af6e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afa525d43c3e5aafdbdf20fdac7eb16579649323ad50ae564f0266fb2e69ffd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swvcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hc88w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:57Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.513255 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4849c41-22e1-400e-8e11-096da49ef1b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc23369548789187b7057cd9780703dbc892dcf8aaaac969f6049a15ab0656e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54b2a654066f14b0433198403ba42b83c61922de
1055c94ba1d38aad9e0c2821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2dj2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:57Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.524607 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jf557" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea42823f-6b19-439f-a280-80e5b1a816c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jf557\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:57Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.537484 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e04a2eadbde2c6c8df1913f15a1efbebef80b3e34c256f7ff9249479f19eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:57Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.544704 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ea42823f-6b19-439f-a280-80e5b1a816c7-host\") pod \"node-ca-jf557\" (UID: \"ea42823f-6b19-439f-a280-80e5b1a816c7\") " pod="openshift-image-registry/node-ca-jf557" Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.544755 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.544787 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.544830 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ea42823f-6b19-439f-a280-80e5b1a816c7-host\") pod \"node-ca-jf557\" (UID: \"ea42823f-6b19-439f-a280-80e5b1a816c7\") " pod="openshift-image-registry/node-ca-jf557" Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.544838 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqrdl\" (UniqueName: \"kubernetes.io/projected/ea42823f-6b19-439f-a280-80e5b1a816c7-kube-api-access-hqrdl\") pod \"node-ca-jf557\" (UID: 
\"ea42823f-6b19-439f-a280-80e5b1a816c7\") " pod="openshift-image-registry/node-ca-jf557" Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.544914 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ea42823f-6b19-439f-a280-80e5b1a816c7-serviceca\") pod \"node-ca-jf557\" (UID: \"ea42823f-6b19-439f-a280-80e5b1a816c7\") " pod="openshift-image-registry/node-ca-jf557" Oct 07 08:16:57 crc kubenswrapper[5025]: E1007 08:16:57.545055 5025 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 08:16:57 crc kubenswrapper[5025]: E1007 08:16:57.545080 5025 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 08:16:57 crc kubenswrapper[5025]: E1007 08:16:57.545147 5025 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 08:16:57 crc kubenswrapper[5025]: E1007 08:16:57.545169 5025 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 08:16:57 crc kubenswrapper[5025]: E1007 08:16:57.545109 5025 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 08:16:57 crc kubenswrapper[5025]: E1007 08:16:57.545243 5025 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 08:16:57 crc kubenswrapper[5025]: E1007 08:16:57.545256 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 08:17:01.545228205 +0000 UTC m=+28.354542389 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 08:16:57 crc kubenswrapper[5025]: E1007 08:16:57.545327 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 08:17:01.545296867 +0000 UTC m=+28.354611181 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.546038 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ea42823f-6b19-439f-a280-80e5b1a816c7-serviceca\") pod \"node-ca-jf557\" (UID: \"ea42823f-6b19-439f-a280-80e5b1a816c7\") " pod="openshift-image-registry/node-ca-jf557" Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.566221 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqrdl\" (UniqueName: \"kubernetes.io/projected/ea42823f-6b19-439f-a280-80e5b1a816c7-kube-api-access-hqrdl\") pod \"node-ca-jf557\" (UID: \"ea42823f-6b19-439f-a280-80e5b1a816c7\") " pod="openshift-image-registry/node-ca-jf557" Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.764508 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jf557" Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.914764 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 08:16:57 crc kubenswrapper[5025]: E1007 08:16:57.915194 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.914847 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:16:57 crc kubenswrapper[5025]: E1007 08:16:57.915349 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 08:16:57 crc kubenswrapper[5025]: I1007 08:16:57.914793 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:16:57 crc kubenswrapper[5025]: E1007 08:16:57.915452 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 08:16:58 crc kubenswrapper[5025]: I1007 08:16:58.169346 5025 generic.go:334] "Generic (PLEG): container finished" podID="4b0ce981-98f4-4e27-9f4d-f9905b78ec9c" containerID="49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495" exitCode=0 Oct 07 08:16:58 crc kubenswrapper[5025]: I1007 08:16:58.169499 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l2k8t" event={"ID":"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c","Type":"ContainerDied","Data":"49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495"} Oct 07 08:16:58 crc kubenswrapper[5025]: I1007 08:16:58.172770 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jf557" event={"ID":"ea42823f-6b19-439f-a280-80e5b1a816c7","Type":"ContainerStarted","Data":"7e6ec573b8a9fb1fd6ab7a370488b1b68024a033b0cfc728cf7e8288aacb4c89"} Oct 07 08:16:58 crc kubenswrapper[5025]: I1007 08:16:58.172839 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jf557" event={"ID":"ea42823f-6b19-439f-a280-80e5b1a816c7","Type":"ContainerStarted","Data":"6f9a9b5e1a279b69f038142983cf983e63cb442b99978a2b55ac97b8ced4c6b1"} Oct 07 08:16:58 crc kubenswrapper[5025]: I1007 08:16:58.191230 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:58Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:58 crc kubenswrapper[5025]: I1007 08:16:58.207474 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:58Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:58 crc kubenswrapper[5025]: I1007 08:16:58.226882 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmhw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34b07a69-1bbf-4019-b824-7b5be0f9404d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aee164f46abc9772ed12e1ba5e768ee44e25cc7444a58000075819f45cb4ba11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx7qx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:58Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:58 crc kubenswrapper[5025]: I1007 08:16:58.241406 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://473d5cdd405367504396c404f60bbdecda93dbe6edbc8ddbfc18bbcc2fdc4d2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f423a0c357e0279ca41d112d6091586ed91d3826f4ae250ec2aa4973554968\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:58Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:58 crc kubenswrapper[5025]: I1007 08:16:58.260884 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dwm22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:58Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:58 crc kubenswrapper[5025]: I1007 08:16:58.277694 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90a24131-77fe-4045-9cea-eebd7e122243\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efea2132bc68971348829ebb4222793ae3ccf57d5eeee316aa81292cbb051594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1c01fb63c054d4e9e2b12ab9d335d5c6a79c3c62c96bcf3bca6a5f9a4b750\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://71ce5e9618132944289f6a22c7aeaee88704f6352fe3461c55de21681f406662\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://167a11cb83ad658e06fbfebaa8e1675cec48026b094037f552cb4b86f3fe364e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b579d694e2e51fb1d49b8d690d572518d7fcd3476d7e6941323b70987b8edfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 08:16:47.574279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 08:16:47.577103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2852886688/tls.crt::/tmp/serving-cert-2852886688/tls.key\\\\\\\"\\\\nI1007 08:16:53.183291 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 08:16:53.189611 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 08:16:53.189643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 08:16:53.189670 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 08:16:53.189676 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 08:16:53.204072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 08:16:53.204130 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204143 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204157 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 08:16:53.204166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 08:16:53.204173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 08:16:53.204182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 08:16:53.204715 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 08:16:53.207373 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52afc5d765ed4544c55e6e522b190cb1c6788fc270ef55c8242f42b63c4d2702\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:58Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:58 crc kubenswrapper[5025]: I1007 08:16:58.293945 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:58Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:58 crc kubenswrapper[5025]: I1007 08:16:58.309849 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l2k8t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l2k8t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:58Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:58 crc kubenswrapper[5025]: I1007 
08:16:58.327603 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e04a2eadbde2c6c8df1913f15a1efbebef80b3e34c256f7ff9249479f19eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:58Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:58 crc kubenswrapper[5025]: I1007 08:16:58.343315 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61571dd78435a0514b8674028c7c467678e6ee4d1c6b334a98c6d1ddbf4871d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:58Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:58 crc kubenswrapper[5025]: I1007 08:16:58.355506 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hc88w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2744919e-82fb-4c3c-8776-c2c9c44af6e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afa525d43c3e5aafdbdf20fdac7eb16579649323ad50ae564f0266fb2e69ffd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swvcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hc88w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:58Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:58 crc kubenswrapper[5025]: I1007 08:16:58.368639 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4849c41-22e1-400e-8e11-096da49ef1b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc23369548789187b7057cd9780703dbc892dcf8aaaac969f6049a15ab0656e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54b2a654066f14b0433198403ba42b83c61922de
1055c94ba1d38aad9e0c2821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2dj2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:58Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:58 crc kubenswrapper[5025]: I1007 08:16:58.381367 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jf557" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea42823f-6b19-439f-a280-80e5b1a816c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jf557\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:58Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:58 crc kubenswrapper[5025]: I1007 08:16:58.397871 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e04a2eadbde2c6c8df1913f15a1efbebef80b3e34c256f7ff9249479f19eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:58Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:58 crc kubenswrapper[5025]: I1007 08:16:58.410185 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61571dd78435a0514b8674028c7c467678e6ee4d1c6b334a98c6d1ddbf4871d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:58Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:58 crc kubenswrapper[5025]: I1007 08:16:58.421880 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hc88w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2744919e-82fb-4c3c-8776-c2c9c44af6e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afa525d43c3e5aafdbdf20fdac7eb16579649323ad50ae564f0266fb2e69ffd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-no
de-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swvcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hc88w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:58Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:58 crc kubenswrapper[5025]: I1007 08:16:58.435193 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4849c41-22e1-400e-8e11-096da49ef1b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc23369548789187b7057cd9780703dbc892dcf8aaaac969f6049a15ab0656e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54b2a654066f14b0433198403ba42b83c61922de
1055c94ba1d38aad9e0c2821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2dj2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:58Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:58 crc kubenswrapper[5025]: I1007 08:16:58.448884 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jf557" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea42823f-6b19-439f-a280-80e5b1a816c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e6ec573b8a9fb1fd6ab7a370488b1b68024a033b0cfc728cf7e8288aacb4c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jf557\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:58Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:58 crc kubenswrapper[5025]: I1007 08:16:58.464898 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:58Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:58 crc kubenswrapper[5025]: I1007 08:16:58.481317 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmhw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34b07a69-1bbf-4019-b824-7b5be0f9404d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aee164f46abc9772ed12e1ba5e768ee44e25cc7444a58000075819f45cb4ba11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx7qx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:58Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:58 crc kubenswrapper[5025]: I1007 08:16:58.496534 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:58Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:58 crc kubenswrapper[5025]: I1007 08:16:58.519527 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dwm22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:58Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:58 crc kubenswrapper[5025]: I1007 08:16:58.533957 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://473d5cdd405367504396c404f60bbdecda93dbe6edbc8ddbfc18bbcc2fdc4d2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f423a0c357e0279ca41d112d6091586ed91d3826f4ae250ec2aa4973554968\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:58Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:58 crc kubenswrapper[5025]: I1007 08:16:58.555024 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l2k8t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l2k8t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:58Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:58 crc kubenswrapper[5025]: I1007 08:16:58.570956 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90a24131-77fe-4045-9cea-eebd7e122243\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efea2132bc68971348829ebb4222793ae3ccf57d5eeee316aa81292cbb051594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1c01fb63c054d4e9e2b12ab9d335d5c6a79c3c62c96bcf3bca6a5f9a4b750\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://71ce5e9618132944289f6a22c7aeaee88704f6352fe3461c55de21681f406662\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://167a11cb83ad658e06fbfebaa8e1675cec48026b094037f552cb4b86f3fe364e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b579d694e2e51fb1d49b8d690d572518d7fcd3476d7e6941323b70987b8edfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 08:16:47.574279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 08:16:47.577103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2852886688/tls.crt::/tmp/serving-cert-2852886688/tls.key\\\\\\\"\\\\nI1007 08:16:53.183291 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 08:16:53.189611 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 08:16:53.189643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 08:16:53.189670 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 08:16:53.189676 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 08:16:53.204072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 08:16:53.204130 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204143 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204157 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 08:16:53.204166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 08:16:53.204173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 08:16:53.204182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 08:16:53.204715 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 08:16:53.207373 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52afc5d765ed4544c55e6e522b190cb1c6788fc270ef55c8242f42b63c4d2702\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:58Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:58 crc kubenswrapper[5025]: I1007 08:16:58.584531 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:58Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.181633 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" event={"ID":"8b6b9c75-ecfe-4815-b279-bb56f57a82a8","Type":"ContainerStarted","Data":"d26e633e0a41a5f44c577b8e1a9494d58bb84b49015a7612684fc81e5872fd20"} Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.185818 5025 generic.go:334] "Generic (PLEG): container finished" podID="4b0ce981-98f4-4e27-9f4d-f9905b78ec9c" containerID="5f66d58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7" 
exitCode=0 Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.185853 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l2k8t" event={"ID":"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c","Type":"ContainerDied","Data":"5f66d58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7"} Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.202102 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e04a2eadbde2c6c8df1913f15a1efbebef80b3e34c256f7ff9249479f19eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"m
ountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:59Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.222951 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61571dd78435a0514b8674028c7c467678e6ee4d1c6b334a98c6d1ddbf4871d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:59Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.238356 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hc88w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2744919e-82fb-4c3c-8776-c2c9c44af6e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afa525d43c3e5aafdbdf20fdac7eb16579649323ad50ae564f0266fb2e69ffd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swvcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hc88w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:59Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.255224 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4849c41-22e1-400e-8e11-096da49ef1b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc23369548789187b7057cd9780703dbc892dcf8aaaac969f6049a15ab0656e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54b2a654066f14b0433198403ba42b83c61922de1055c94ba1d38aad9e0c2821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2dj2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:59Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.267542 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jf557" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea42823f-6b19-439f-a280-80e5b1a816c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e6ec573b8a9fb1fd6ab7a370488b1b68024a033b0cfc728cf7e8288aacb4c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/e
tc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jf557\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:59Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.287824 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:59Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.305239 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:59Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.317603 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmhw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34b07a69-1bbf-4019-b824-7b5be0f9404d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aee164f46abc9772ed12e1ba5e768ee44e25cc7444a58000075819f45cb4ba11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx7qx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:59Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.334137 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://473d5cdd405367504396c404f60bbdecda93dbe6edbc8ddbfc18bbcc2fdc4d2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f423a0c357e0279ca41d112d6091586ed91d3826f4ae250ec2aa4973554968\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:59Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.354865 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dwm22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:59Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.372824 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90a24131-77fe-4045-9cea-eebd7e122243\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efea2132bc68971348829ebb4222793ae3ccf57d5eeee316aa81292cbb051594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1c01fb63c054d4e9e2b12ab9d335d5c6a79c3c62c96bcf3bca6a5f9a4b750\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://71ce5e9618132944289f6a22c7aeaee88704f6352fe3461c55de21681f406662\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://167a11cb83ad658e06fbfebaa8e1675cec48026b094037f552cb4b86f3fe364e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b579d694e2e51fb1d49b8d690d572518d7fcd3476d7e6941323b70987b8edfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 08:16:47.574279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 08:16:47.577103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2852886688/tls.crt::/tmp/serving-cert-2852886688/tls.key\\\\\\\"\\\\nI1007 08:16:53.183291 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 08:16:53.189611 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 08:16:53.189643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 08:16:53.189670 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 08:16:53.189676 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 08:16:53.204072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 08:16:53.204130 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204143 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204157 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 08:16:53.204166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 08:16:53.204173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 08:16:53.204182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 08:16:53.204715 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 08:16:53.207373 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52afc5d765ed4544c55e6e522b190cb1c6788fc270ef55c8242f42b63c4d2702\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:59Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.387327 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:59Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.414039 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l2k8t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f66d58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f66d58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-l2k8t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:59Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.567801 5025 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.571169 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.571212 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.571224 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.571908 5025 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.580409 5025 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.580685 5025 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.581715 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.581732 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.581742 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 
08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.581756 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.581767 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:16:59Z","lastTransitionTime":"2025-10-07T08:16:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:16:59 crc kubenswrapper[5025]: E1007 08:16:59.600529 5025 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"message\\\":\\\"kubelet 
has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbf
ff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},
{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3e
e8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\
"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"15feb688-c1d9-4e7b-b633-9a128b7afc98\\\",\\\"systemUUID\\\":\\\"18315730-39a6-4b53-82b9-587e1e3a7adc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:59Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.603968 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.603998 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.604009 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.604028 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.604042 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:16:59Z","lastTransitionTime":"2025-10-07T08:16:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:16:59 crc kubenswrapper[5025]: E1007 08:16:59.622318 5025 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"15feb688-c1d9-4e7b-b633-9a128b7afc98\\\",\\\"systemUUID\\\":\\\"18315730-39a6-4b53-82b9-587e1e3a7adc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:59Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.627983 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.628025 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.628038 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.628055 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.628069 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:16:59Z","lastTransitionTime":"2025-10-07T08:16:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:16:59 crc kubenswrapper[5025]: E1007 08:16:59.653207 5025 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"15feb688-c1d9-4e7b-b633-9a128b7afc98\\\",\\\"systemUUID\\\":\\\"18315730-39a6-4b53-82b9-587e1e3a7adc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:59Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.662464 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.662499 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.662510 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.662530 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.662546 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:16:59Z","lastTransitionTime":"2025-10-07T08:16:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:16:59 crc kubenswrapper[5025]: E1007 08:16:59.691802 5025 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"15feb688-c1d9-4e7b-b633-9a128b7afc98\\\",\\\"systemUUID\\\":\\\"18315730-39a6-4b53-82b9-587e1e3a7adc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:59Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.697531 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.697583 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.697593 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.697609 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.697622 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:16:59Z","lastTransitionTime":"2025-10-07T08:16:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:16:59 crc kubenswrapper[5025]: E1007 08:16:59.714807 5025 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"15feb688-c1d9-4e7b-b633-9a128b7afc98\\\",\\\"systemUUID\\\":\\\"18315730-39a6-4b53-82b9-587e1e3a7adc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:59Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:59 crc kubenswrapper[5025]: E1007 08:16:59.714977 5025 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.716445 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.716479 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.716499 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.716516 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.716527 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:16:59Z","lastTransitionTime":"2025-10-07T08:16:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.818837 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.818868 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.818878 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.818895 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.818905 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:16:59Z","lastTransitionTime":"2025-10-07T08:16:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.914247 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.914382 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 08:16:59 crc kubenswrapper[5025]: E1007 08:16:59.914488 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.914514 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:16:59 crc kubenswrapper[5025]: E1007 08:16:59.914674 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 08:16:59 crc kubenswrapper[5025]: E1007 08:16:59.914789 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.921838 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.922735 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.922758 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.922768 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.922782 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.922796 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:16:59Z","lastTransitionTime":"2025-10-07T08:16:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.925896 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.938969 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90a24131-77fe-4045-9cea-eebd7e122243\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efea2132bc68971348829ebb4222793ae3ccf57d5eeee316aa81292cbb051594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1c01fb63c054d4e9e2b12ab9d335d5c6a79c3c62c96bcf3bca6a5f9a4b750\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://71ce5e9618132944289f6a22c7aeaee88704f6352fe3461c55de21681f406662\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://167a11cb83ad658e06fbfebaa8e1675cec48026b094037f552cb4b86f3fe364e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b579d694e2e51fb1d49b8d690d572518d7fcd3476d7e6941323b70987b8edfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 08:16:47.574279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 08:16:47.577103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2852886688/tls.crt::/tmp/serving-cert-2852886688/tls.key\\\\\\\"\\\\nI1007 08:16:53.183291 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 08:16:53.189611 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 08:16:53.189643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 08:16:53.189670 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 08:16:53.189676 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 08:16:53.204072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 08:16:53.204130 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204143 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204157 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 08:16:53.204166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 08:16:53.204173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 08:16:53.204182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 08:16:53.204715 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 08:16:53.207373 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52afc5d765ed4544c55e6e522b190cb1c6788fc270ef55c8242f42b63c4d2702\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:59Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.939553 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.955195 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:59Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.967829 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l2k8t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f66d58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f66d58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-l2k8t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:59Z is after 2025-08-24T17:21:41Z" Oct 07 08:16:59 crc kubenswrapper[5025]: I1007 08:16:59.980903 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e04a2eadbde2c6c8df1913f15a1efbebef80b3e34c256f7ff9249479f19eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recur
siveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:16:59Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.019767 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61571dd78435a0514b8674028c7c467678e6ee4d1c6b334a98c6d1ddbf4871d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:00Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.027954 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.027982 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.027991 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.028008 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.028019 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:00Z","lastTransitionTime":"2025-10-07T08:17:00Z","reason":"KubeletNotReady","message":"container runtime network 
not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.029416 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hc88w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2744919e-82fb-4c3c-8776-c2c9c44af6e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afa525d43c3e5aafdbdf20fdac7eb16579649323ad50ae564f0266fb2e69ffd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\
\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swvcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hc88w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:00Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.041102 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4849c41-22e1-400e-8e11-096da49ef1b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc23369548789187b7057cd9780703dbc892dcf8aaaac969f6049a15ab0656e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54b2a654066f14b0433198403ba42b83c61922de
1055c94ba1d38aad9e0c2821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2dj2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:00Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.052026 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jf557" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea42823f-6b19-439f-a280-80e5b1a816c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e6ec573b8a9fb1fd6ab7a370488b1b68024a033b0cfc728cf7e8288aacb4c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jf557\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:00Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.066532 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:00Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.079510 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:00Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.091568 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmhw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34b07a69-1bbf-4019-b824-7b5be0f9404d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aee164f46abc9772ed12e1ba5e768ee44e25cc7444a58000075819f45cb4ba11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx7qx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:00Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.110884 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://473d5cdd405367504396c404f60bbdecda93dbe6edbc8ddbfc18bbcc2fdc4d2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f423a0c357e0279ca41d112d6091586ed91d3826f4ae250ec2aa4973554968\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:00Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.130896 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.130958 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.130973 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.130995 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.131039 5025 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:00Z","lastTransitionTime":"2025-10-07T08:17:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.131305 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dwm22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:00Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.152734 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dwm22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:00Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.172448 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://473d5cdd405367504396c404f60bbdecda93dbe6edbc8ddbfc18bbcc2fdc4d2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f423a0c357e0279ca41d112d6091586ed91d3826f4ae250ec2aa4973554968\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:00Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.189214 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l2k8t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f66d58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f66d58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l2k8t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:00Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.194103 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l2k8t" event={"ID":"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c","Type":"ContainerDied","Data":"4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31"} Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.194013 5025 generic.go:334] "Generic (PLEG): container finished" podID="4b0ce981-98f4-4e27-9f4d-f9905b78ec9c" containerID="4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31" exitCode=0 Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.221012 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90a24131-77fe-4045-9cea-eebd7e122243\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efea2132bc68971348829ebb4222793ae3ccf57d5eeee316aa81292cbb051594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1c01fb63c054d4e9e2b12ab9d335d5c6a79c3c62c96bcf3bca6a5f9a4b750\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ce5e9618132944289f6a22c7aeaee88704f6352fe3461c55de21681f406662\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://167a11cb83ad658e06fbfebaa8e1675cec48026b094037f552cb4b86f3fe364e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b579d694e2e51fb1d49b8d690d572518d7fcd3476d7e6941323b70987b8edfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 08:16:47.574279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 08:16:47.577103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2852886688/tls.crt::/tmp/serving-cert-2852886688/tls.key\\\\\\\"\\\\nI1007 08:16:53.183291 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 08:16:53.189611 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 08:16:53.189643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 08:16:53.189670 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 08:16:53.189676 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 08:16:53.204072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 08:16:53.204130 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204143 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204157 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 08:16:53.204166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 08:16:53.204173 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 08:16:53.204182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 08:16:53.204715 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 08:16:53.207373 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52afc5d765ed4544c55e6e522b190cb1c6788fc270ef55c8242f42b63c4d2702\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:00Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.234128 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.234180 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.234198 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.234223 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.234241 5025 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:00Z","lastTransitionTime":"2025-10-07T08:17:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.237390 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:00Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.257369 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e04a2eadbde2c6c8df1913f15a1efbebef80b3e34c256f7ff9249479f19eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:00Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.275219 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61571dd78435a0514b8674028c7c467678e6ee4d1c6b334a98c6d1ddbf4871d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:00Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.287217 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hc88w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2744919e-82fb-4c3c-8776-c2c9c44af6e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afa525d43c3e5aafdbdf20fdac7eb16579649323ad50ae564f0266fb2e69ffd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-no
de-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swvcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hc88w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:00Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.305293 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4849c41-22e1-400e-8e11-096da49ef1b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc23369548789187b7057cd9780703dbc892dcf8aaaac969f6049a15ab0656e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54b2a654066f14b0433198403ba42b83c61922de
1055c94ba1d38aad9e0c2821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2dj2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:00Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.320297 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jf557" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea42823f-6b19-439f-a280-80e5b1a816c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e6ec573b8a9fb1fd6ab7a370488b1b68024a033b0cfc728cf7e8288aacb4c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jf557\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:00Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.336002 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:00Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.337500 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.337528 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.337542 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.337595 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.337610 5025 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:00Z","lastTransitionTime":"2025-10-07T08:17:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.352171 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmhw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34b07a69-1bbf-4019-b824-7b5be0f9404d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aee164f46abc9772ed12e1ba5e768ee44e25cc7444a58000075819f45cb4ba11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx7qx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:
16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:00Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.370269 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2383c50-e82f-4445-8d28-ec90e8d95c47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f31fa57b4eaaccec309a01396ec164dc6bda3e12a1086eda7a2d0eb3233eb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6e7353bb02996d8525befd61dad46a459fca7713e536cec9b1fb9f47d99d08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ceb048af5928a4b50cec5e93738471dd356f20cf57363fe085df0bb4216100f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"
cri-o://848397bbf66b4edb13083c81defed30b9ddaff1a1636e2f1fe09b25c8ff99d96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:00Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.388003 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:00Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.403890 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90a24131-77fe-4045-9cea-eebd7e122243\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efea2132bc68971348829ebb4222793ae3ccf57d5eeee316aa81292cbb051594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1c01fb63c054d4e9e2b12ab9d335d5c6a79c3c62c96bcf3bca6a5f9a4b750\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://71ce5e9618132944289f6a22c7aeaee88704f6352fe3461c55de21681f406662\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://167a11cb83ad658e06fbfebaa8e1675cec48026b094037f552cb4b86f3fe364e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b579d694e2e51fb1d49b8d690d572518d7fcd3476d7e6941323b70987b8edfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 08:16:47.574279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 08:16:47.577103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2852886688/tls.crt::/tmp/serving-cert-2852886688/tls.key\\\\\\\"\\\\nI1007 08:16:53.183291 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 08:16:53.189611 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 08:16:53.189643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 08:16:53.189670 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 08:16:53.189676 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 08:16:53.204072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 08:16:53.204130 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204143 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204157 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 08:16:53.204166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 08:16:53.204173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 08:16:53.204182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 08:16:53.204715 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 08:16:53.207373 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52afc5d765ed4544c55e6e522b190cb1c6788fc270ef55c8242f42b63c4d2702\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:00Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.422237 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:00Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.439803 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l2k8t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f66d58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f66d58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l2k8t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:00Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.441738 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.441778 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.441794 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.441925 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.441962 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:00Z","lastTransitionTime":"2025-10-07T08:17:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.453272 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e04a2eadbde2c6c8df1913f15a1efbebef80b3e34c256f7ff9249479f19eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:00Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.465898 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61571dd78435a0514b8674028c7c467678e6ee4d1c6b334a98c6d1ddbf4871d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\
\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:00Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.477012 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hc88w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2744919e-82fb-4c3c-8776-c2c9c44af6e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afa525d43c3e5aafdbdf20fdac7eb16579649323ad50ae564f0266fb2e69ffd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swvcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hc88w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:00Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.491024 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4849c41-22e1-400e-8e11-096da49ef1b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc23369548789187b7057cd9780703dbc892dcf8aaaac969f6049a15ab0656e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54b2a654066f14b0433198403ba42b83c61922de
1055c94ba1d38aad9e0c2821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2dj2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:00Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.516870 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jf557" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea42823f-6b19-439f-a280-80e5b1a816c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e6ec573b8a9fb1fd6ab7a370488b1b68024a033b0cfc728cf7e8288aacb4c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jf557\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:00Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.544654 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.544680 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.544696 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.544715 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.544726 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:00Z","lastTransitionTime":"2025-10-07T08:17:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.562744 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2383c50-e82f-4445-8d28-ec90e8d95c47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f31fa57b4eaaccec309a01396ec164dc6bda3e12a1086eda7a2d0eb3233eb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6e7353bb0
2996d8525befd61dad46a459fca7713e536cec9b1fb9f47d99d08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ceb048af5928a4b50cec5e93738471dd356f20cf57363fe085df0bb4216100f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://848397bbf66b4edb13083c81defed30b9ddaff1a1636e2f1fe09b25c8ff99d96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:00Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.598795 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:00Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.639197 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:00Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.646857 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.646978 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.647051 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.647143 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.647225 5025 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:00Z","lastTransitionTime":"2025-10-07T08:17:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.680248 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmhw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34b07a69-1bbf-4019-b824-7b5be0f9404d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aee164f46abc9772ed12e1ba5e768ee44e25cc7444a58000075819f45cb4ba11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx7qx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:
16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:00Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.723836 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://473d5cdd405367504396c404f60bbdecda93dbe6edbc8ddbfc18bbcc2fdc4d2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-c
onfig\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f423a0c357e0279ca41d112d6091586ed91d3826f4ae250ec2aa4973554968\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:00Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.750253 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.750297 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.750318 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.750337 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.750349 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:00Z","lastTransitionTime":"2025-10-07T08:17:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.766365 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dwm22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:00Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.853388 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.853435 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.853444 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.853460 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.853470 5025 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:00Z","lastTransitionTime":"2025-10-07T08:17:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.956421 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.956770 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.956907 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.957067 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:00 crc kubenswrapper[5025]: I1007 08:17:00.957227 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:00Z","lastTransitionTime":"2025-10-07T08:17:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.059590 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.059655 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.059673 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.059699 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.059718 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:01Z","lastTransitionTime":"2025-10-07T08:17:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.162256 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.162773 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.162789 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.162811 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.162826 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:01Z","lastTransitionTime":"2025-10-07T08:17:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.201484 5025 generic.go:334] "Generic (PLEG): container finished" podID="4b0ce981-98f4-4e27-9f4d-f9905b78ec9c" containerID="7fff0f5b081b5a61c30757d6065a06224f7edcd0adfdab4ca6abfc6ea207a660" exitCode=0 Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.201604 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l2k8t" event={"ID":"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c","Type":"ContainerDied","Data":"7fff0f5b081b5a61c30757d6065a06224f7edcd0adfdab4ca6abfc6ea207a660"} Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.211948 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" event={"ID":"8b6b9c75-ecfe-4815-b279-bb56f57a82a8","Type":"ContainerStarted","Data":"8d2ef3761e7cca150079a4dc9bfad8f8a4f81792d388eca4fc6f539038b929ab"} Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.213429 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.213518 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.220932 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2383c50-e82f-4445-8d28-ec90e8d95c47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f31fa57b4eaaccec309a01396ec164dc6bda3e12a1086eda7a2d0eb3233eb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6e7353bb02996d8525befd61dad46a459fca7713e536cec9b1fb9f47d99d08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ceb048af5928a4b50cec5e93738471dd356f20cf57363fe085df0bb4216100f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://848397bbf66b4edb13083c81defed30b9ddaff1a1636e2f1fe09b25c8ff99d96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:01Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.241411 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:01Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.291058 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.291110 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.291128 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:01 crc 
kubenswrapper[5025]: I1007 08:17:01.291206 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.291224 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:01Z","lastTransitionTime":"2025-10-07T08:17:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.294493 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:01Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.301845 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.302221 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 
08:17:01.314010 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmhw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34b07a69-1bbf-4019-b824-7b5be0f9404d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aee164f46abc9772ed12e1ba5e768ee44e25cc7444a58000075819f45cb4ba11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/n
et.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx7qx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:01Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 
08:17:01.345373 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://473d5cdd405367504396c404f60bbdecda93dbe6edbc8ddbfc18bbcc2fdc4d2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f423a0c357e0279ca41d112d6091586ed91d3826f4ae250ec2aa4973554968\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:01Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.381389 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dwm22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:01Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.394159 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.394202 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.394212 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.394230 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.394241 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:01Z","lastTransitionTime":"2025-10-07T08:17:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.404798 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90a24131-77fe-4045-9cea-eebd7e122243\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efea2132bc68971348829ebb4222793ae3ccf57d5eeee316aa81292cbb051594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1c01fb63c054d4e9e2b12ab9d335d5c6a79c3c62c96bcf3bca6a5f9a4b750\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://71ce5e9618132944289f6a22c7aeaee88704f6352fe3461c55de21681f406662\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://167a11cb83ad658e06fbfebaa8e1675cec48026b094037f552cb4b86f3fe364e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b579d694e2e51fb1d49b8d690d572518d7fcd3476d7e6941323b70987b8edfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 08:16:47.574279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 08:16:47.577103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2852886688/tls.crt::/tmp/serving-cert-2852886688/tls.key\\\\\\\"\\\\nI1007 08:16:53.183291 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 08:16:53.189611 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 08:16:53.189643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 08:16:53.189670 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 08:16:53.189676 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 08:16:53.204072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 08:16:53.204130 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204143 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204157 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 08:16:53.204166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 08:16:53.204173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 08:16:53.204182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 08:16:53.204715 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 08:16:53.207373 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52afc5d765ed4544c55e6e522b190cb1c6788fc270ef55c8242f42b63c4d2702\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:01Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.432067 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:01Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.455599 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l2k8t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f66d58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f66d58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fff0f5b081b5a61c30757d6065a06224f7edcd0adfdab4ca6abfc6ea207a660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fff0f5b081b5a61c30757d6065a06224f7edcd0adfdab4ca6abfc6ea207a660\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:17:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:17:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l2k8t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:01Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.469769 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4849c41-22e1-400e-8e11-096da49ef1b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc23369548789187b7057cd9780703dbc892dcf8aaaac969f6049a15ab0656e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54b2a654066f14b0433198403ba42b83c61922de
1055c94ba1d38aad9e0c2821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2dj2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:01Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.486781 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jf557" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea42823f-6b19-439f-a280-80e5b1a816c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e6ec573b8a9fb1fd6ab7a370488b1b68024a033b0cfc728cf7e8288aacb4c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jf557\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:01Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.491844 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:17:01 crc kubenswrapper[5025]: E1007 08:17:01.492035 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:17:09.492002819 +0000 UTC m=+36.301316963 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.492098 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.492144 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:17:01 crc kubenswrapper[5025]: E1007 08:17:01.492251 5025 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 08:17:01 crc kubenswrapper[5025]: E1007 08:17:01.492281 5025 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 08:17:01 crc kubenswrapper[5025]: E1007 08:17:01.492375 5025 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 08:17:09.492356121 +0000 UTC m=+36.301670415 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 08:17:01 crc kubenswrapper[5025]: E1007 08:17:01.492397 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 08:17:09.492389192 +0000 UTC m=+36.301703326 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.496148 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.496169 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.496180 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.496197 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.496208 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:01Z","lastTransitionTime":"2025-10-07T08:17:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.501644 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e04a2eadbde2c6c8df1913f15a1efbebef80b3e34c256f7ff9249479f19eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T
08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:01Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.516023 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61571dd78435a0514b8674028c7c467678e6ee4d1c6b334a98c6d1ddbf4871d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T08:17:01Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.530064 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hc88w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2744919e-82fb-4c3c-8776-c2c9c44af6e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afa525d43c3e5aafdbdf20fdac7eb16579649323ad50ae564f0266fb2e69ffd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-swvcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hc88w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:01Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.550078 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://602fe6537fb9b526977c2d6ae30458382313c2b5f1506e6b7ffe684ffe5be2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d54834b1ba5650aa397affa83e56e871bbbea4b8c383627ae00bea648595864f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aca416ed3fc89df9948a2f7b02101dfb3bae8d8b22b7c4d270559a4cb6895dc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aca8d824b0935ba844137095ac849a38a15dc5c0dc808fae28afbff543f6429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e826406584133efe2a695c4c7ff16dc6919a1ba0cda077273f6e1eafbf0f5c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81fc1872da168050a2483e46a2ce55f2f745eca15c901a77783a830609f6c0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d2ef3761e7cca150079a4dc9bfad8f8a4f81792d388eca4fc6f539038b929ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26e633e0a41a5f44c577b8e1a9494d58bb84b49015a7612684fc81e5872fd20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dwm22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:01Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.565519 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://473d5cdd405367504396c404f60bbdecda93dbe6edbc8ddbfc18bbcc2fdc4d2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:5
4Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f423a0c357e0279ca41d112d6091586ed91d3826f4ae250ec2aa4973554968\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:01Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.581742 5025 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-l2k8t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f66d58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f66d58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fff0f5b081b5a61c30757d6065a06224f7edcd0adfdab4ca6abfc6ea207a660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fff0f5b081b5a61c30757d6065a06224f7edcd0adfdab4ca6abfc6ea207a660\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:17:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:17:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l2k8t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:01Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.593613 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.593656 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: 
\"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 08:17:01 crc kubenswrapper[5025]: E1007 08:17:01.593821 5025 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 08:17:01 crc kubenswrapper[5025]: E1007 08:17:01.593838 5025 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 08:17:01 crc kubenswrapper[5025]: E1007 08:17:01.593849 5025 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 08:17:01 crc kubenswrapper[5025]: E1007 08:17:01.593964 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 08:17:09.593948338 +0000 UTC m=+36.403262482 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 08:17:01 crc kubenswrapper[5025]: E1007 08:17:01.594009 5025 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 08:17:01 crc kubenswrapper[5025]: E1007 08:17:01.594068 5025 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 08:17:01 crc kubenswrapper[5025]: E1007 08:17:01.594092 5025 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 08:17:01 crc kubenswrapper[5025]: E1007 08:17:01.594201 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 08:17:09.594170075 +0000 UTC m=+36.403484259 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.598653 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.598698 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.598710 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.598728 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.598740 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:01Z","lastTransitionTime":"2025-10-07T08:17:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.599828 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90a24131-77fe-4045-9cea-eebd7e122243\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efea2132bc68971348829ebb4222793ae3ccf57d5eeee316aa81292cbb051594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1c01fb63c054d4e9e2b12ab9d335d5c6a79c3c62c96bcf3bca6a5f9a4b750\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://71ce5e9618132944289f6a22c7aeaee88704f6352fe3461c55de21681f406662\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://167a11cb83ad658e06fbfebaa8e1675cec48026b094037f552cb4b86f3fe364e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b579d694e2e51fb1d49b8d690d572518d7fcd3476d7e6941323b70987b8edfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 08:16:47.574279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 08:16:47.577103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2852886688/tls.crt::/tmp/serving-cert-2852886688/tls.key\\\\\\\"\\\\nI1007 08:16:53.183291 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 08:16:53.189611 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 08:16:53.189643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 08:16:53.189670 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 08:16:53.189676 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 08:16:53.204072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 08:16:53.204130 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204143 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204157 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 08:16:53.204166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 08:16:53.204173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 08:16:53.204182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 08:16:53.204715 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 08:16:53.207373 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52afc5d765ed4544c55e6e522b190cb1c6788fc270ef55c8242f42b63c4d2702\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:01Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.615510 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:01Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.640127 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e04a2eadbde2c6c8df1913f15a1efbebef80b3e34c256f7ff9249479f19eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:01Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.653660 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61571dd78435a0514b8674028c7c467678e6ee4d1c6b334a98c6d1ddbf4871d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:01Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.665909 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hc88w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2744919e-82fb-4c3c-8776-c2c9c44af6e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afa525d43c3e5aafdbdf20fdac7eb16579649323ad50ae564f0266fb2e69ffd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-no
de-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swvcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hc88w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:01Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.677837 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4849c41-22e1-400e-8e11-096da49ef1b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc23369548789187b7057cd9780703dbc892dcf8aaaac969f6049a15ab0656e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54b2a654066f14b0433198403ba42b83c61922de
1055c94ba1d38aad9e0c2821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2dj2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:01Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.701737 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.701810 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.701824 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:01 crc 
kubenswrapper[5025]: I1007 08:17:01.701850 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.701866 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:01Z","lastTransitionTime":"2025-10-07T08:17:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.719371 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jf557" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea42823f-6b19-439f-a280-80e5b1a816c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e6ec573b8a9fb1fd6ab7a370488b1b68024a033b0cfc728cf7e8288aacb4c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1
e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jf557\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:01Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.760617 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:01Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.801470 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmhw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34b07a69-1bbf-4019-b824-7b5be0f9404d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aee164f46abc9772ed12e1ba5e768ee44e25cc7444a58000075819f45cb4ba11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx7qx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:01Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.805060 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:01 crc 
kubenswrapper[5025]: I1007 08:17:01.805150 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.805177 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.805210 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.805232 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:01Z","lastTransitionTime":"2025-10-07T08:17:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.843947 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2383c50-e82f-4445-8d28-ec90e8d95c47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f31fa57b4eaaccec309a01396ec164dc6bda3e12a1086eda7a2d0eb3233eb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6e7353bb02996d8525befd61dad46a459fca7713e536cec9b1fb9f47d99d08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ceb048af5928a4b50cec5e93738471dd356f20cf57363fe085df0bb4216100f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://848397bbf66b4edb13083c81defed30b9ddaff1a1636e2f1fe09b25c8ff99d96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:01Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.882776 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:01Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.908471 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.908529 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.908561 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:01 crc 
kubenswrapper[5025]: I1007 08:17:01.908582 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.908610 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:01Z","lastTransitionTime":"2025-10-07T08:17:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.914302 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.914357 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:17:01 crc kubenswrapper[5025]: I1007 08:17:01.914357 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:17:01 crc kubenswrapper[5025]: E1007 08:17:01.914453 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 08:17:01 crc kubenswrapper[5025]: E1007 08:17:01.914607 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 08:17:01 crc kubenswrapper[5025]: E1007 08:17:01.914893 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.012029 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.012107 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.012128 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.012158 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.012178 5025 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:02Z","lastTransitionTime":"2025-10-07T08:17:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.114979 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.115032 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.115044 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.115060 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.115072 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:02Z","lastTransitionTime":"2025-10-07T08:17:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.218941 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.218979 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.218994 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.219013 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.219025 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:02Z","lastTransitionTime":"2025-10-07T08:17:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.226493 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l2k8t" event={"ID":"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c","Type":"ContainerStarted","Data":"ab742e4ca12e3a7400e7592d95be4f82046c78d1056d6e1f06d261704286f013"} Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.226583 5025 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.242254 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4849c41-22e1-400e-8e11-096da49ef1b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc23369548789187b7057cd9780703dbc892dcf8aaaac969f6049a15ab0656e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54b2a654066f14b0433198403ba42b83c61922de1055c94ba1d38aad9e0c2821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2dj2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:02Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.252290 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jf557" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea42823f-6b19-439f-a280-80e5b1a816c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e6ec573b8a9fb1fd6ab7a370488b1b68024a033b0cfc728cf7e8288aacb4c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\
",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jf557\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:02Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.264701 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e04a2eadbde2c6c8df1913f15a1efbebef80b3e34c256f7ff9249479f19eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:02Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.274950 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61571dd78435a0514b8674028c7c467678e6ee4d1c6b334a98c6d1ddbf4871d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:02Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.283805 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hc88w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2744919e-82fb-4c3c-8776-c2c9c44af6e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afa525d43c3e5aafdbdf20fdac7eb16579649323ad50ae564f0266fb2e69ffd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-no
de-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swvcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hc88w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:02Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.294912 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2383c50-e82f-4445-8d28-ec90e8d95c47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f31fa57b4eaaccec309a01396ec164dc6bda3e12a1086eda7a2d0eb3233eb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6e7353bb02996d8525befd61dad46a459fca7713e536cec9b1fb9f47d99d08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ceb048af5928a4b50cec5e93738471dd356f20cf57363fe085df0bb4216100f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://848397bbf66b4edb13083c81defed30b9ddaff1a1636e2f1fe09b25c8ff99d96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:02Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.308718 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:02Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.321499 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.321566 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.321593 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:02 crc 
kubenswrapper[5025]: I1007 08:17:02.321614 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.321629 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:02Z","lastTransitionTime":"2025-10-07T08:17:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.324835 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:02Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.342191 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmhw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34b07a69-1bbf-4019-b824-7b5be0f9404d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aee164f46abc9772ed12e1ba5e768ee44e25cc7444a58000075819f45cb4ba11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx7qx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:02Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.355743 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://473d5cdd405367504396c404f60bbdecda93dbe6edbc8ddbfc18bbcc2fdc4d2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f423a0c357e0279ca41d112d6091586ed91d3826f4ae250ec2aa4973554968\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:02Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.376366 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://602fe6537fb9b526977c2d6ae30458382313c2b5f1506e6b7ffe684ffe5be2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d54834b1ba5650aa397affa83e56e871bbbea4b8c383627ae00bea648595864f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aca416ed3fc89df9948a2f7b02101dfb3bae8d8b22b7c4d270559a4cb6895dc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aca8d824b0935ba844137095ac849a38a15dc5c0dc808fae28afbff543f6429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e826406584133efe2a695c4c7ff16dc6919a1ba0cda077273f6e1eafbf0f5c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81fc1872da168050a2483e46a2ce55f2f745eca15c901a77783a830609f6c0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d2ef3761e7cca150079a4dc9bfad8f8a4f81792d388eca4fc6f539038b929ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26e633e0a41a5f44c577b8e1a9494d58bb84b49015a7612684fc81e5872fd20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dwm22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:02Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.393108 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90a24131-77fe-4045-9cea-eebd7e122243\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efea2132bc68971348829ebb4222793ae3ccf57d5eeee316aa81292cbb051594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1c01fb63c054d4e9e2b12ab9d335d5c6a79c3c62c96bcf3bca6a5f9a4b750\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://71ce5e9618132944289f6a22c7aeaee88704f6352fe3461c55de21681f406662\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://167a11cb83ad658e06fbfebaa8e1675cec48026b094037f552cb4b86f3fe364e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b579d694e2e51fb1d49b8d690d572518d7fcd3476d7e6941323b70987b8edfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 08:16:47.574279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 08:16:47.577103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2852886688/tls.crt::/tmp/serving-cert-2852886688/tls.key\\\\\\\"\\\\nI1007 08:16:53.183291 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 08:16:53.189611 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 08:16:53.189643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 08:16:53.189670 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 08:16:53.189676 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 08:16:53.204072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 08:16:53.204130 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204143 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204157 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 08:16:53.204166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 08:16:53.204173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 08:16:53.204182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 08:16:53.204715 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 08:16:53.207373 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52afc5d765ed4544c55e6e522b190cb1c6788fc270ef55c8242f42b63c4d2702\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:02Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.407943 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:02Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.424938 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.424991 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.425011 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 
08:17:02.425036 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.425054 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:02Z","lastTransitionTime":"2025-10-07T08:17:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.449582 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l2k8t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab742e4ca12e3a7400e7592d95be4f82046c78d1056d6e1f06d261704286f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcab
b9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f66d58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f66d58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fff0f5b081b5a61c30757d6065a06224f7edcd0adfdab4ca6abfc6ea207a660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fff0f5b081b5a61c30757d6065a0
6224f7edcd0adfdab4ca6abfc6ea207a660\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:17:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:17:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l2k8t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:02Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.527627 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.527679 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.527695 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.527718 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.527735 5025 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:02Z","lastTransitionTime":"2025-10-07T08:17:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.630595 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.630635 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.630650 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.630670 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.630686 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:02Z","lastTransitionTime":"2025-10-07T08:17:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.733960 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.734019 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.734029 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.734046 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.734057 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:02Z","lastTransitionTime":"2025-10-07T08:17:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.836430 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.836491 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.836507 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.836530 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.836591 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:02Z","lastTransitionTime":"2025-10-07T08:17:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.938467 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.938513 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.938526 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.938562 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:02 crc kubenswrapper[5025]: I1007 08:17:02.938574 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:02Z","lastTransitionTime":"2025-10-07T08:17:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.041460 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.041498 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.041507 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.041520 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.041529 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:03Z","lastTransitionTime":"2025-10-07T08:17:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.144395 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.144460 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.144483 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.144511 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.144532 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:03Z","lastTransitionTime":"2025-10-07T08:17:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.229190 5025 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.246939 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.247026 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.247042 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.247061 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.247075 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:03Z","lastTransitionTime":"2025-10-07T08:17:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.350469 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.350518 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.350533 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.350582 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.350600 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:03Z","lastTransitionTime":"2025-10-07T08:17:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.453034 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.453070 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.453079 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.453093 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.453106 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:03Z","lastTransitionTime":"2025-10-07T08:17:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.556213 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.556277 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.556297 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.556335 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.556354 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:03Z","lastTransitionTime":"2025-10-07T08:17:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.659941 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.660701 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.660733 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.660772 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.660794 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:03Z","lastTransitionTime":"2025-10-07T08:17:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.765052 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.765121 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.765142 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.765170 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.765190 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:03Z","lastTransitionTime":"2025-10-07T08:17:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.869987 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.870008 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.870018 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.870031 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.870041 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:03Z","lastTransitionTime":"2025-10-07T08:17:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.914809 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.915016 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:17:03 crc kubenswrapper[5025]: E1007 08:17:03.915329 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.915473 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 08:17:03 crc kubenswrapper[5025]: E1007 08:17:03.915652 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 08:17:03 crc kubenswrapper[5025]: E1007 08:17:03.915795 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.930179 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90a24131-77fe-4045-9cea-eebd7e122243\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efea2132bc68971348829ebb4222793ae3ccf57d5eeee316aa81292cbb051594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1c01fb63c054d4e9e2b12ab9d335d5c6a79c3c62c96bcf3bca6a5f9a4b750\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://71ce5e9618132944289f6a22c7aeaee88704f6352fe3461c55de21681f406662\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://167a11cb83ad658e06fbfebaa8e1675cec48026b094037f552cb4b86f3fe364e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b579d694e2e51fb1d49b8d690d572518d7fcd3476d7e6941323b70987b8edfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 08:16:47.574279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 08:16:47.577103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2852886688/tls.crt::/tmp/serving-cert-2852886688/tls.key\\\\\\\"\\\\nI1007 08:16:53.183291 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 08:16:53.189611 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 08:16:53.189643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 08:16:53.189670 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 08:16:53.189676 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 08:16:53.204072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 08:16:53.204130 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204143 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204157 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 08:16:53.204166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 08:16:53.204173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 08:16:53.204182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 08:16:53.204715 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 08:16:53.207373 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52afc5d765ed4544c55e6e522b190cb1c6788fc270ef55c8242f42b63c4d2702\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:03Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.943855 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:03Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.960498 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l2k8t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab742e4ca12e3a7400e7592d95be4f82046c78d1056d6e1f06d261704286f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f66d
58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f66d58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fff0f5b081b5a61c30757d6065a06224f7edcd0adfdab4ca6abfc6ea207a660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fff0f5b081b5a61c30757d6065a06224f7edcd0adfdab4ca6abfc6ea207a660\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:17:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:17:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l2k8t\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:03Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.973001 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.973047 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.973058 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.973075 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.973085 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:03Z","lastTransitionTime":"2025-10-07T08:17:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.973711 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jf557" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea42823f-6b19-439f-a280-80e5b1a816c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e6ec573b8a9fb1fd6ab7a370488b1b68024a033b0cfc728cf7e8288aacb4c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jf557\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:03Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.985764 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e04a2eadbde2c6c8df1913f15a1efbebef80b3e34c256f7ff9249479f19eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c
04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:03Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:03 crc kubenswrapper[5025]: I1007 08:17:03.997799 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61571dd78435a0514b8674028c7c467678e6ee4d1c6b334a98c6d1ddbf4871d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T08:17:03Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.011195 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hc88w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2744919e-82fb-4c3c-8776-c2c9c44af6e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afa525d43c3e5aafdbdf20fdac7eb16579649323ad50ae564f0266fb2e69ffd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-swvcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hc88w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:04Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.022244 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4849c41-22e1-400e-8e11-096da49ef1b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc23369548789187b7057cd9780703
dbc892dcf8aaaac969f6049a15ab0656e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54b2a654066f14b0433198403ba42b83c61922de1055c94ba1d38aad9e0c2821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
5-10-07T08:16:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2dj2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:04Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.036685 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2383c50-e82f-4445-8d28-ec90e8d95c47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f31fa57b4eaaccec309a01396ec164dc6bda3e12a1086eda7a2d0eb3233eb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6e7353bb02996d8525befd61dad46a459fca7713e536cec9b1fb9f47d99d08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ceb048af5928a4b50cec5e93738471dd356f20cf57363fe085df0bb4216100f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\
\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://848397bbf66b4edb13083c81defed30b9ddaff1a1636e2f1fe09b25c8ff99d96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:04Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.051776 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:04Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.064781 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:04Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.076322 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.076413 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.076436 5025 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.076470 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.076491 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:04Z","lastTransitionTime":"2025-10-07T08:17:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.084545 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmhw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34b07a69-1bbf-4019-b824-7b5be0f9404d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aee164f46abc9772ed12e1ba5e768ee44e25cc7444a
58000075819f45cb4ba11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx7qx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:04Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.102896 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://473d5cdd405367504396c404f60bbdecda93dbe6edbc8ddbfc18bbcc2fdc4d2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f423a0c357e0279ca41d112d6091586ed91d3826f4ae250ec2aa4973554968\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-07T08:17:04Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.119679 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://602fe6537fb9b526977c2d6ae30458382313c2b5f1506e6b7ffe684ffe5be2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d54834b1ba5650aa397affa83e56e871bbbea4b8c383627ae00bea648595864f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aca416ed3fc89df9948a2f7b02101dfb3bae8d8b22b7c4d270559a4cb6895dc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aca8d824b0935ba844137095ac849a38a15dc5c0dc808fae28afbff543f6429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e826406584133efe2a695c4c7ff16dc6919a1ba0cda077273f6e1eafbf0f5c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81fc1872da168050a2483e46a2ce55f2f745eca15c901a77783a830609f6c0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d2ef3761e7cca150079a4dc9bfad8f8a4f81792d388eca4fc6f539038b929ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26e633e0a41a5f44c577b8e1a9494d58bb84b49015a7612684fc81e5872fd20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dwm22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:04Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.179436 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.179496 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.179512 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.179541 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.179590 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:04Z","lastTransitionTime":"2025-10-07T08:17:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.234596 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dwm22_8b6b9c75-ecfe-4815-b279-bb56f57a82a8/ovnkube-controller/0.log" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.238438 5025 generic.go:334] "Generic (PLEG): container finished" podID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerID="8d2ef3761e7cca150079a4dc9bfad8f8a4f81792d388eca4fc6f539038b929ab" exitCode=1 Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.238500 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" event={"ID":"8b6b9c75-ecfe-4815-b279-bb56f57a82a8","Type":"ContainerDied","Data":"8d2ef3761e7cca150079a4dc9bfad8f8a4f81792d388eca4fc6f539038b929ab"} Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.239796 5025 scope.go:117] "RemoveContainer" containerID="8d2ef3761e7cca150079a4dc9bfad8f8a4f81792d388eca4fc6f539038b929ab" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.263349 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90a24131-77fe-4045-9cea-eebd7e122243\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efea2132bc68971348829ebb4222793ae3ccf57d5eeee316aa81292cbb051594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1c01fb63c054d4e9e2b12ab9d335d5c6a79c3c62c96bcf3bca6a5f9a4b750\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ce5e9618132944289f6a22c7aeaee88704f6352fe3461c55de21681f406662\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://167a11cb83ad658e06fbfebaa8e1675cec48026b094037f552cb4b86f3fe364e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b579d694e2e51fb1d49b8d690d572518d7fcd3476d7e6941323b70987b8edfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 08:16:47.574279 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 08:16:47.577103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2852886688/tls.crt::/tmp/serving-cert-2852886688/tls.key\\\\\\\"\\\\nI1007 08:16:53.183291 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 08:16:53.189611 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 08:16:53.189643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 08:16:53.189670 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 08:16:53.189676 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 08:16:53.204072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 08:16:53.204130 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204143 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204157 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 08:16:53.204166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 08:16:53.204173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 08:16:53.204182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 08:16:53.204715 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 08:16:53.207373 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52afc5d765ed4544c55e6e522b190cb1c6788fc270ef55c8242f42b63c4d2702\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:04Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.284152 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.284205 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.284223 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.284247 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.284266 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:04Z","lastTransitionTime":"2025-10-07T08:17:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.284810 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:04Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.303503 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l2k8t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab742e4ca12e3a7400e7592d95be4f82046c78d1056d6e1f06d261704286f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f66d
58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f66d58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fff0f5b081b5a61c30757d6065a06224f7edcd0adfdab4ca6abfc6ea207a660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fff0f5b081b5a61c30757d6065a06224f7edcd0adfdab4ca6abfc6ea207a660\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:17:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:17:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l2k8t\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:04Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.315992 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4849c41-22e1-400e-8e11-096da49ef1b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc23369548789187b7057cd9780703dbc892dcf8aaaac969f6049a15ab0656e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54b2a654066f14b0433198403ba42b83c61922de1055c94ba1d38aad9e0c2821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2dj2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:04Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:04 crc kubenswrapper[5025]: 
I1007 08:17:04.333089 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jf557" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea42823f-6b19-439f-a280-80e5b1a816c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e6ec573b8a9fb1fd6ab7a370488b1b68024a033b0cfc728cf7e8288aacb4c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrdl\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jf557\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:04Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.349085 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e04a2eadbde2c6c8df1913f15a1efbebef80b3e34c256f7ff9249479f19eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{}
,\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:04Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.364380 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61571dd78435a0514b8674028c7c467678e6ee4d1c6b334a98c6d1ddbf4871d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T08:17:04Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.375794 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hc88w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2744919e-82fb-4c3c-8776-c2c9c44af6e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afa525d43c3e5aafdbdf20fdac7eb16579649323ad50ae564f0266fb2e69ffd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-swvcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hc88w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:04Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.386935 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.386967 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.386977 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.386992 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.387002 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:04Z","lastTransitionTime":"2025-10-07T08:17:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.390577 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2383c50-e82f-4445-8d28-ec90e8d95c47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f31fa57b4eaaccec309a01396ec164dc6bda3e12a1086eda7a2d0eb3233eb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6e7353bb0
2996d8525befd61dad46a459fca7713e536cec9b1fb9f47d99d08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ceb048af5928a4b50cec5e93738471dd356f20cf57363fe085df0bb4216100f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://848397bbf66b4edb13083c81defed30b9ddaff1a1636e2f1fe09b25c8ff99d96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:04Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.426479 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:04Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.454309 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:04Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.476995 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmhw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34b07a69-1bbf-4019-b824-7b5be0f9404d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aee164f46abc9772ed12e1ba5e768ee44e25cc7444a58000075819f45cb4ba11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx7qx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:04Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.489776 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:04 crc 
kubenswrapper[5025]: I1007 08:17:04.489804 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.489812 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.489824 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.489833 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:04Z","lastTransitionTime":"2025-10-07T08:17:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.501854 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://473d5cdd405367504396c404f60bbdecda93dbe6edbc8ddbfc18bbcc2fdc4d2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f423a0c357e0279ca41d112d6091586ed91d3826f4ae250ec2aa4973554968\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:04Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.525282 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://602fe6537fb9b526977c2d6ae30458382313c2b5f1506e6b7ffe684ffe5be2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d54834b1ba5650aa397affa83e56e871bbbea4b8c383627ae00bea648595864f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aca416ed3fc89df9948a2f7b02101dfb3bae8d8b22b7c4d270559a4cb6895dc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aca8d824b0935ba844137095ac849a38a15dc5c0dc808fae28afbff543f6429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e826406584133efe2a695c4c7ff16dc6919a1ba0cda077273f6e1eafbf0f5c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81fc1872da168050a2483e46a2ce55f2f745eca15c901a77783a830609f6c0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d2ef3761e7cca150079a4dc9bfad8f8a4f81792d388eca4fc6f539038b929ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d2ef3761e7cca150079a4dc9bfad8f8a4f81792d388eca4fc6f539038b929ab\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T08:17:04Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 08:17:04.009043 6381 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 08:17:04.009983 6381 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1007 08:17:04.010131 6381 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1007 08:17:04.010160 6381 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 08:17:04.010383 6381 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1007 08:17:04.010437 6381 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1007 08:17:04.010454 6381 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1007 08:17:04.010461 6381 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1007 08:17:04.010478 6381 factory.go:656] Stopping watch factory\\\\nI1007 08:17:04.010498 6381 ovnkube.go:599] Stopped ovnkube\\\\nI1007 08:17:04.010530 6381 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 08:17:04.010564 6381 handler.go:208] Removed *v1.Node event handler 7\\\\nI1007 08:17:04.010573 6381 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1007 08:17:04.010589 6381 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1007 
08\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:17:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26e633e0a41a5f44c577b8e1a9494d58bb84b49015a7612684fc81e5872fd20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dwm22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:04Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.592514 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.592580 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.592591 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.592606 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.592616 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:04Z","lastTransitionTime":"2025-10-07T08:17:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.694931 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.694985 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.695003 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.695029 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.695047 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:04Z","lastTransitionTime":"2025-10-07T08:17:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.797482 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.797667 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.797703 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.797735 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.797757 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:04Z","lastTransitionTime":"2025-10-07T08:17:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.901453 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.901511 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.901531 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.901599 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:04 crc kubenswrapper[5025]: I1007 08:17:04.901620 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:04Z","lastTransitionTime":"2025-10-07T08:17:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.004725 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.004793 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.004813 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.004845 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.004871 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:05Z","lastTransitionTime":"2025-10-07T08:17:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.107377 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.107614 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.107628 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.107641 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.107649 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:05Z","lastTransitionTime":"2025-10-07T08:17:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.210236 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.210501 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.210626 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.210752 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.210826 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:05Z","lastTransitionTime":"2025-10-07T08:17:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.243059 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dwm22_8b6b9c75-ecfe-4815-b279-bb56f57a82a8/ovnkube-controller/0.log" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.249366 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" event={"ID":"8b6b9c75-ecfe-4815-b279-bb56f57a82a8","Type":"ContainerStarted","Data":"15f7224ba03bfadb0d7b4d4023f427da920f47b6d5f8d0d69db5b96baaedb0c5"} Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.249500 5025 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.272023 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:05Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.291751 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmhw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34b07a69-1bbf-4019-b824-7b5be0f9404d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aee164f46abc9772ed12e1ba5e768ee44e25cc7444a58000075819f45cb4ba11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx7qx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:05Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.302621 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2383c50-e82f-4445-8d28-ec90e8d95c47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f31fa57b4eaaccec309a01396ec164dc6bda3e12a1086eda7a2d0eb3233eb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6e7353bb02996d8525befd61dad46a459fca7713e536cec9b1fb9f47d99d08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ceb048af5928a4b50cec5e93738471dd356f20cf57363fe085df0bb4216100f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://848397bbf66b4edb13083c81defed30b9ddaff1a1636e2f1fe09b25c8ff99d96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:05Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.312974 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.312998 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.313006 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.313020 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.313029 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:05Z","lastTransitionTime":"2025-10-07T08:17:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.314620 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:05Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.333034 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://602fe6537fb9b526977c2d6ae30458382313c2b5f1506e6b7ffe684ffe5be2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d54834b1ba5650aa397affa83e56e871bbbea4b8c383627ae00bea648595864f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aca416ed3fc89df9948a2f7b02101dfb3bae8d8b22b7c4d270559a4cb6895dc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aca8d824b0935ba844137095ac849a38a15dc5c0dc808fae28afbff543f6429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e826406584133efe2a695c4c7ff16dc6919a1ba0cda077273f6e1eafbf0f5c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81fc1872da168050a2483e46a2ce55f2f745eca15c901a77783a830609f6c0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15f7224ba03bfadb0d7b4d4023f427da920f47b6d5f8d0d69db5b96baaedb0c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d2ef3761e7cca150079a4dc9bfad8f8a4f81792d388eca4fc6f539038b929ab\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T08:17:04Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 08:17:04.009043 6381 
reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 08:17:04.009983 6381 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1007 08:17:04.010131 6381 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1007 08:17:04.010160 6381 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 08:17:04.010383 6381 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1007 08:17:04.010437 6381 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1007 08:17:04.010454 6381 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1007 08:17:04.010461 6381 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1007 08:17:04.010478 6381 factory.go:656] Stopping watch factory\\\\nI1007 08:17:04.010498 6381 ovnkube.go:599] Stopped ovnkube\\\\nI1007 08:17:04.010530 6381 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 08:17:04.010564 6381 handler.go:208] Removed *v1.Node event handler 7\\\\nI1007 08:17:04.010573 6381 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1007 08:17:04.010589 6381 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1007 
08\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:17:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26e633e0a41a5f44c577b8e1a9494d58bb84b49015a7612684fc81e5872fd20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dwm22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:05Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.348899 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://473d5cdd405367504396c404f60bbdecda93dbe6edbc8ddbfc18bbcc2fdc4d2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f423a0c357e0279ca41d112d6091586ed91d3826f4ae250ec2aa4973554968\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:05Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.367325 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l2k8t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab742e4ca12e3a7400e7592d95be4f82046c78d1056d6e1f06d261704286f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f66d
58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f66d58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fff0f5b081b5a61c30757d6065a06224f7edcd0adfdab4ca6abfc6ea207a660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fff0f5b081b5a61c30757d6065a06224f7edcd0adfdab4ca6abfc6ea207a660\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:17:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:17:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l2k8t\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:05Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.383422 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90a24131-77fe-4045-9cea-eebd7e122243\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efea2132bc68971348829ebb4222793ae3ccf57d5eeee316aa81292cbb051594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1c01fb63c054d4e9e2b12ab9d335d5c6a79c3c62c96bcf3bca6a5f9a4b750\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://71ce5e9618132944289f6a22c7aeaee88704f6352fe3461c55de21681f406662\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://167a11cb83ad658e06fbfebaa8e1675cec48026b094037f552cb4b86f3fe364e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b579d694e2e51fb1d49b8d690d572518d7fcd3476d7e6941323b70987b8edfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 08:16:47.574279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 08:16:47.577103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2852886688/tls.crt::/tmp/serving-cert-2852886688/tls.key\\\\\\\"\\\\nI1007 08:16:53.183291 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 08:16:53.189611 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 08:16:53.189643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 08:16:53.189670 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 08:16:53.189676 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 08:16:53.204072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 08:16:53.204130 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204143 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204157 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 08:16:53.204166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 08:16:53.204173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 08:16:53.204182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 08:16:53.204715 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 08:16:53.207373 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52afc5d765ed4544c55e6e522b190cb1c6788fc270ef55c8242f42b63c4d2702\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:05Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.396002 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:05Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.408186 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e04a2eadbde2c6c8df1913f15a1efbebef80b3e34c256f7ff9249479f19eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:05Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.415462 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.415718 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.415754 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.415775 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.415786 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:05Z","lastTransitionTime":"2025-10-07T08:17:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.420586 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61571dd78435a0514b8674028c7c467678e6ee4d1c6b334a98c6d1ddbf4871d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:05Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.430866 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hc88w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2744919e-82fb-4c3c-8776-c2c9c44af6e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afa525d43c3e5aafdbdf20fdac7eb16579649323ad50ae564f0266fb2e69ffd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swvcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hc88w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:05Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.442778 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4849c41-22e1-400e-8e11-096da49ef1b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc23369548789187b7057cd9780703dbc892dcf8aaaac969f6049a15ab0656e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54b2a654066f14b0433198403ba42b83c61922de
1055c94ba1d38aad9e0c2821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2dj2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:05Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.456264 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jf557" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea42823f-6b19-439f-a280-80e5b1a816c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e6ec573b8a9fb1fd6ab7a370488b1b68024a033b0cfc728cf7e8288aacb4c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jf557\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:05Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.518686 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.518738 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.518757 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.518781 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.518800 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:05Z","lastTransitionTime":"2025-10-07T08:17:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.621454 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.621797 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.621892 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.621966 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.622051 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:05Z","lastTransitionTime":"2025-10-07T08:17:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.725646 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.725687 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.725698 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.725718 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.725729 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:05Z","lastTransitionTime":"2025-10-07T08:17:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.828189 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.828260 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.828274 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.828319 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.828335 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:05Z","lastTransitionTime":"2025-10-07T08:17:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.913898 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.913980 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.913910 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:17:05 crc kubenswrapper[5025]: E1007 08:17:05.914100 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 08:17:05 crc kubenswrapper[5025]: E1007 08:17:05.914205 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 08:17:05 crc kubenswrapper[5025]: E1007 08:17:05.914279 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.929870 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.929907 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.929917 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.929933 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:05 crc kubenswrapper[5025]: I1007 08:17:05.929944 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:05Z","lastTransitionTime":"2025-10-07T08:17:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.033634 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.033704 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.033728 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.033758 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.033786 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:06Z","lastTransitionTime":"2025-10-07T08:17:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.137108 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.137364 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.137387 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.137420 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.137443 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:06Z","lastTransitionTime":"2025-10-07T08:17:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.241075 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.241126 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.241138 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.241155 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.241337 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:06Z","lastTransitionTime":"2025-10-07T08:17:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.255384 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dwm22_8b6b9c75-ecfe-4815-b279-bb56f57a82a8/ovnkube-controller/1.log" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.256230 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dwm22_8b6b9c75-ecfe-4815-b279-bb56f57a82a8/ovnkube-controller/0.log" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.259250 5025 generic.go:334] "Generic (PLEG): container finished" podID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerID="15f7224ba03bfadb0d7b4d4023f427da920f47b6d5f8d0d69db5b96baaedb0c5" exitCode=1 Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.259314 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" event={"ID":"8b6b9c75-ecfe-4815-b279-bb56f57a82a8","Type":"ContainerDied","Data":"15f7224ba03bfadb0d7b4d4023f427da920f47b6d5f8d0d69db5b96baaedb0c5"} Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.259398 5025 scope.go:117] "RemoveContainer" containerID="8d2ef3761e7cca150079a4dc9bfad8f8a4f81792d388eca4fc6f539038b929ab" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.261644 5025 scope.go:117] "RemoveContainer" containerID="15f7224ba03bfadb0d7b4d4023f427da920f47b6d5f8d0d69db5b96baaedb0c5" Oct 07 08:17:06 crc kubenswrapper[5025]: E1007 08:17:06.261976 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-dwm22_openshift-ovn-kubernetes(8b6b9c75-ecfe-4815-b279-bb56f57a82a8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.280894 5025 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://473d5cdd405367504396c404f60bbdecda93dbe6edbc8ddbfc18bbcc2fdc4d2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f423a0c357e0279ca41d112d6091586ed91d3826f4ae250ec2aa4973554968\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:06Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.306814 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://602fe6537fb9b526977c2d6ae30458382313c2b5f1506e6b7ffe684ffe5be2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d54834b1ba5650aa397affa83e56e871bbbea4b8c383627ae00bea648595864f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aca416ed3fc89df9948a2f7b02101dfb3bae8d8b22b7c4d270559a4cb6895dc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aca8d824b0935ba844137095ac849a38a15dc5c0dc808fae28afbff543f6429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e826406584133efe2a695c4c7ff16dc6919a1ba0cda077273f6e1eafbf0f5c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81fc1872da168050a2483e46a2ce55f2f745eca15c901a77783a830609f6c0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15f7224ba03bfadb0d7b4d4023f427da920f47b6d5f8d0d69db5b96baaedb0c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d2ef3761e7cca150079a4dc9bfad8f8a4f81792d388eca4fc6f539038b929ab\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T08:17:04Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 08:17:04.009043 6381 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 08:17:04.009983 6381 handler.go:190] Sending *v1.EgressIP event handler 8 for 
removal\\\\nI1007 08:17:04.010131 6381 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1007 08:17:04.010160 6381 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 08:17:04.010383 6381 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1007 08:17:04.010437 6381 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1007 08:17:04.010454 6381 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1007 08:17:04.010461 6381 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1007 08:17:04.010478 6381 factory.go:656] Stopping watch factory\\\\nI1007 08:17:04.010498 6381 ovnkube.go:599] Stopped ovnkube\\\\nI1007 08:17:04.010530 6381 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 08:17:04.010564 6381 handler.go:208] Removed *v1.Node event handler 7\\\\nI1007 08:17:04.010573 6381 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1007 08:17:04.010589 6381 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1007 08\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:17:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15f7224ba03bfadb0d7b4d4023f427da920f47b6d5f8d0d69db5b96baaedb0c5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T08:17:05Z\\\",\\\"message\\\":\\\"== {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 08:17:05.353289 6518 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-controller-manager/kube-controller-manager]} name:Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 
neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.36:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ba175bbe-5cc4-47e6-a32d-57693e1320bd}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 08:17:05.353308 6518 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1007 08:17:05.351793 6518 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26e633e0a41a5f44c577b8e1a9494d58bb84b49015a7612684fc81e5872fd20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3
c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dwm22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:06Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.326294 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90a24131-77fe-4045-9cea-eebd7e122243\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efea2132bc68971348829ebb4222793ae3ccf57d5eeee316aa81292cbb051594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1c01fb63c054d4e9e2b12ab9d335d5c6a79c3c62c96bcf3bca6a5f9a4b750\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://71ce5e9618132944289f6a22c7aeaee88704f6352fe3461c55de21681f406662\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://167a11cb83ad658e06fbfebaa8e1675cec48026b094037f552cb4b86f3fe364e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b579d694e2e51fb1d49b8d690d572518d7fcd3476d7e6941323b70987b8edfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 08:16:47.574279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 08:16:47.577103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2852886688/tls.crt::/tmp/serving-cert-2852886688/tls.key\\\\\\\"\\\\nI1007 08:16:53.183291 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 08:16:53.189611 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 08:16:53.189643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 08:16:53.189670 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 08:16:53.189676 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 08:16:53.204072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 08:16:53.204130 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204143 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204157 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 08:16:53.204166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 08:16:53.204173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 08:16:53.204182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 08:16:53.204715 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 08:16:53.207373 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52afc5d765ed4544c55e6e522b190cb1c6788fc270ef55c8242f42b63c4d2702\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:06Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.344360 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.344401 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.344413 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.344433 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.344448 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:06Z","lastTransitionTime":"2025-10-07T08:17:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.345755 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:06Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.375387 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l2k8t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab742e4ca12e3a7400e7592d95be4f82046c78d1056d6e1f06d261704286f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f66d
58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f66d58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fff0f5b081b5a61c30757d6065a06224f7edcd0adfdab4ca6abfc6ea207a660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fff0f5b081b5a61c30757d6065a06224f7edcd0adfdab4ca6abfc6ea207a660\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:17:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:17:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l2k8t\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:06Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.390894 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e04a2eadbde2c6c8df1913f15a1efbebef80b3e34c256f7ff9249479f19eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:06Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.403989 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61571dd78435a0514b8674028c7c467678e6ee4d1c6b334a98c6d1ddbf4871d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"sta
rted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:06Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.418048 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hc88w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2744919e-82fb-4c3c-8776-c2c9c44af6e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afa525d43c3e5aafdbdf20fdac7eb16579649323ad50ae564f0266fb2e69ffd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swvcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hc88w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:06Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.431420 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4849c41-22e1-400e-8e11-096da49ef1b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc23369548789187b7057cd9780703dbc892dcf8aaaac969f6049a15ab0656e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54b2a654066f14b0433198403ba42b83c61922de1055c94ba1d38aad9e0c2821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2dj2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:06Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.443811 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jf557" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea42823f-6b19-439f-a280-80e5b1a816c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e6ec573b8a9fb1fd6ab7a370488b1b68024a033b0cfc728cf7e8288aacb4c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/e
tc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jf557\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:06Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.446670 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.446710 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.446722 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.446739 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.446752 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:06Z","lastTransitionTime":"2025-10-07T08:17:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.457757 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2383c50-e82f-4445-8d28-ec90e8d95c47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f31fa57b4eaaccec309a01396ec164dc6bda3e12a1086eda7a2d0eb3233eb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID
\\\":\\\"cri-o://5c6e7353bb02996d8525befd61dad46a459fca7713e536cec9b1fb9f47d99d08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ceb048af5928a4b50cec5e93738471dd356f20cf57363fe085df0bb4216100f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://848397bbf66b4edb13083c81defed30b9ddaff1a1636e2f1fe09b25c8ff99d96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller
-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:06Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.475139 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:06Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.488407 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:06Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.506931 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmhw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34b07a69-1bbf-4019-b824-7b5be0f9404d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aee164f46abc9772ed12e1ba5e768ee44e25cc7444a58000075819f45cb4ba11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx7qx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:06Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.552216 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:06 crc 
kubenswrapper[5025]: I1007 08:17:06.552252 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.552262 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.552277 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.552286 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:06Z","lastTransitionTime":"2025-10-07T08:17:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.654236 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.654635 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.654813 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.654987 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.655123 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:06Z","lastTransitionTime":"2025-10-07T08:17:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.757638 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.757688 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.757701 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.757717 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.757728 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:06Z","lastTransitionTime":"2025-10-07T08:17:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.860868 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.860925 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.860941 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.860962 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.860978 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:06Z","lastTransitionTime":"2025-10-07T08:17:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.863130 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.964384 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.964442 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.964458 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.964480 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:06 crc kubenswrapper[5025]: I1007 08:17:06.964496 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:06Z","lastTransitionTime":"2025-10-07T08:17:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.067245 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.067299 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.067318 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.067340 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.067356 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:07Z","lastTransitionTime":"2025-10-07T08:17:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.170037 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.170074 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.170084 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.170100 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.170111 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:07Z","lastTransitionTime":"2025-10-07T08:17:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.264963 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dwm22_8b6b9c75-ecfe-4815-b279-bb56f57a82a8/ovnkube-controller/1.log" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.268949 5025 scope.go:117] "RemoveContainer" containerID="15f7224ba03bfadb0d7b4d4023f427da920f47b6d5f8d0d69db5b96baaedb0c5" Oct 07 08:17:07 crc kubenswrapper[5025]: E1007 08:17:07.269127 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-dwm22_openshift-ovn-kubernetes(8b6b9c75-ecfe-4815-b279-bb56f57a82a8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.272136 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.272168 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.272201 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.272218 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.272229 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:07Z","lastTransitionTime":"2025-10-07T08:17:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.289998 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61571dd78435a0514b8674028c7c467678e6ee4d1c6b334a98c6d1ddbf4871d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" 
for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:07Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.308105 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hc88w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2744919e-82fb-4c3c-8776-c2c9c44af6e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afa525d43c3e5aafdbdf20fdac7eb16579649323ad50ae564f0266fb2e69ffd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swvcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hc88w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:07Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.323752 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4849c41-22e1-400e-8e11-096da49ef1b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc23369548789187b7057cd9780703dbc892dcf8aaaac969f6049a15ab0656e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54b2a654066f14b0433198403ba42b83c61922de
1055c94ba1d38aad9e0c2821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2dj2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:07Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.338208 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jf557" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea42823f-6b19-439f-a280-80e5b1a816c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e6ec573b8a9fb1fd6ab7a370488b1b68024a033b0cfc728cf7e8288aacb4c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jf557\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:07Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.358945 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e04a2eadbde2c6c8df1913f15a1efbebef80b3e34c256f7ff9249479f19eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:07Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.375137 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.375200 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.375224 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.375255 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.375272 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:07Z","lastTransitionTime":"2025-10-07T08:17:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.375458 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmhw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34b07a69-1bbf-4019-b824-7b5be0f9404d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aee164f46abc9772ed12e1ba5e768ee44e25cc7444a58000075819f45cb4ba11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\
":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx7qx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-07T08:17:07Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.393692 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2383c50-e82f-4445-8d28-ec90e8d95c47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f31fa57b4eaaccec309a01396ec164dc6bda3e12a1086eda7a2d0eb3233eb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID
\\\":\\\"cri-o://5c6e7353bb02996d8525befd61dad46a459fca7713e536cec9b1fb9f47d99d08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ceb048af5928a4b50cec5e93738471dd356f20cf57363fe085df0bb4216100f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://848397bbf66b4edb13083c81defed30b9ddaff1a1636e2f1fe09b25c8ff99d96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller
-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:07Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.409822 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:07Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.427180 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:07Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.449124 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://473d5cdd405367504396c404f60bbdecda93dbe6edbc8ddbfc18bbcc2fdc4d2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f423a0c357e0279ca41d112d6091586ed91d3826f4ae250ec2aa4973554968\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:07Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.477506 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.477570 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.477584 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.477602 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.477616 5025 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:07Z","lastTransitionTime":"2025-10-07T08:17:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.484702 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://602fe6537fb9b526977c2d6ae30458382313c2b5f1506e6b7ffe684ffe5be2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d54834b1ba5650aa397affa83e56e871bbbea4b8c383627ae00bea648595864f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aca416ed3fc89df9948a2f7b02101dfb3bae8d8b22b7c4d270559a4cb6895dc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aca8d824b0935ba844137095ac849a38a15dc5c0dc808fae28afbff543f6429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e826406584133efe2a695c4c7ff16dc6919a1ba0cda077273f6e1eafbf0f5c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81fc1872da168050a2483e46a2ce55f2f745eca15c901a77783a830609f6c0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15f7224ba03bfadb0d7b4d4023f427da920f47b6d5f8d0d69db5b96baaedb0c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15f7224ba03bfadb0d7b4d4023f427da920f47b6d5f8d0d69db5b96baaedb0c5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T08:17:05Z\\\",\\\"message\\\":\\\"== {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 08:17:05.353289 6518 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service 
k8s.ovn.org/owner:openshift-kube-controller-manager/kube-controller-manager]} name:Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.36:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ba175bbe-5cc4-47e6-a32d-57693e1320bd}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 08:17:05.353308 6518 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1007 08:17:05.351793 6518 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:17:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dwm22_openshift-ovn-kubernetes(8b6b9c75-ecfe-4815-b279-bb56f57a82a8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26e633e0a41a5f44c577b8e1a9494d58bb84b49015a7612684fc81e5872fd20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9
e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dwm22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:07Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.505558 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90a24131-77fe-4045-9cea-eebd7e122243\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efea2132bc68971348829ebb4222793ae3ccf57d5eeee316aa81292cbb051594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1c01fb63c054d4e9e2b12ab9d335d5c6a79c3c62c96bcf3bca6a5f9a4b750\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ce5e9618132944289f6a22c7aeaee88704f6352fe3461c55de21681f406662\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://167a11cb83ad658e06fbfebaa8e1675cec48026b094037f552cb4b86f3fe364e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b579d694e2e51fb1d49b8d690d572518d7fcd3476d7e6941323b70987b8edfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 08:16:47.574279 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 08:16:47.577103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2852886688/tls.crt::/tmp/serving-cert-2852886688/tls.key\\\\\\\"\\\\nI1007 08:16:53.183291 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 08:16:53.189611 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 08:16:53.189643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 08:16:53.189670 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 08:16:53.189676 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 08:16:53.204072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 08:16:53.204130 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204143 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204157 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 08:16:53.204166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 08:16:53.204173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 08:16:53.204182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 08:16:53.204715 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 08:16:53.207373 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52afc5d765ed4544c55e6e522b190cb1c6788fc270ef55c8242f42b63c4d2702\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:07Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.524079 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:07Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.544472 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l2k8t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab742e4ca12e3a7400e7592d95be4f82046c78d1056d6e1f06d261704286f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f66d
58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f66d58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fff0f5b081b5a61c30757d6065a06224f7edcd0adfdab4ca6abfc6ea207a660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fff0f5b081b5a61c30757d6065a06224f7edcd0adfdab4ca6abfc6ea207a660\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:17:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:17:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l2k8t\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:07Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.580030 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.580348 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.580444 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.580569 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.580669 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:07Z","lastTransitionTime":"2025-10-07T08:17:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.683844 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.684143 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.684215 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.684304 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.684388 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:07Z","lastTransitionTime":"2025-10-07T08:17:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.787075 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.787119 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.787153 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.787170 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.787180 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:07Z","lastTransitionTime":"2025-10-07T08:17:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.890049 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.890107 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.890123 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.890149 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.890166 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:07Z","lastTransitionTime":"2025-10-07T08:17:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.910369 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w8khj"] Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.911490 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w8khj" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.913221 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.914698 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.914823 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:17:07 crc kubenswrapper[5025]: E1007 08:17:07.915621 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.915370 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:17:07 crc kubenswrapper[5025]: E1007 08:17:07.915686 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.914942 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 08:17:07 crc kubenswrapper[5025]: E1007 08:17:07.915926 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.931608 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90a24131-77fe-4045-9cea-eebd7e122243\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efea2132bc68971348829ebb4222793ae3ccf57d5eeee316aa81292cbb051594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1c01fb63c054d4e9e2b12ab9d335d5c6a79c3c62c96bcf3bca6a5f9a4b750\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://71ce5e9618132944289f6a22c7aeaee88704f6352fe3461c55de21681f406662\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://167a11cb83ad658e06fbfebaa8e1675cec48026b094037f552cb4b86f3fe364e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b579d694e2e51fb1d49b8d690d572518d7fcd3476d7e6941323b70987b8edfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 08:16:47.574279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 08:16:47.577103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2852886688/tls.crt::/tmp/serving-cert-2852886688/tls.key\\\\\\\"\\\\nI1007 08:16:53.183291 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 08:16:53.189611 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 08:16:53.189643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 08:16:53.189670 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 08:16:53.189676 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 08:16:53.204072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 08:16:53.204130 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204143 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204157 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 08:16:53.204166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 08:16:53.204173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 08:16:53.204182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 08:16:53.204715 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 08:16:53.207373 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52afc5d765ed4544c55e6e522b190cb1c6788fc270ef55c8242f42b63c4d2702\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:07Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.948625 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:07Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.963259 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l2k8t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab742e4ca12e3a7400e7592d95be4f82046c78d1056d6e1f06d261704286f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f66d
58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f66d58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fff0f5b081b5a61c30757d6065a06224f7edcd0adfdab4ca6abfc6ea207a660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fff0f5b081b5a61c30757d6065a06224f7edcd0adfdab4ca6abfc6ea207a660\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:17:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:17:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l2k8t\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:07Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.975583 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61571dd78435a0514b8674028c7c467678e6ee4d1c6b334a98c6d1ddbf4871d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:07Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.987008 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hc88w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2744919e-82fb-4c3c-8776-c2c9c44af6e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afa525d43c3e5aafdbdf20fdac7eb16579649323ad50ae564f0266fb2e69ffd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swvcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hc88w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:07Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.992445 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.992472 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.992482 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.992495 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:07 crc kubenswrapper[5025]: I1007 08:17:07.992503 5025 setters.go:603] 
"Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:07Z","lastTransitionTime":"2025-10-07T08:17:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.000744 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4849c41-22e1-400e-8e11-096da49ef1b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc23369548789187b7057cd9780703dbc892dcf8aaaac969f6049a15ab0656e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54b2a654066f14b0433198403ba42b83c61922de1055c94ba1d38aad9e0c2821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2dj2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T08:17:07Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.011188 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jf557" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea42823f-6b19-439f-a280-80e5b1a816c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e6ec573b8a9fb1fd6ab7a370488b1b68024a033b0cfc728cf7e8288aacb4c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jf557\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:08Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.028362 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w8khj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee70c47a-d783-4a9f-8e94-5fc68eda69fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ckns7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ckns7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:17:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-w8khj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:08Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.045580 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e04a2eadbde2c6c8df1913f15a1efbebef80b3e34c256f7ff9249479f19eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"st
artedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:08Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.062773 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmhw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34b07a69-1bbf-4019-b824-7b5be0f9404d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aee164f46abc9772ed12e1ba5e768ee44e25cc7444a58000075819f45cb4ba11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx7qx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:08Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.064477 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/ee70c47a-d783-4a9f-8e94-5fc68eda69fb-env-overrides\") pod \"ovnkube-control-plane-749d76644c-w8khj\" (UID: \"ee70c47a-d783-4a9f-8e94-5fc68eda69fb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w8khj" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.064630 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckns7\" (UniqueName: \"kubernetes.io/projected/ee70c47a-d783-4a9f-8e94-5fc68eda69fb-kube-api-access-ckns7\") pod \"ovnkube-control-plane-749d76644c-w8khj\" (UID: \"ee70c47a-d783-4a9f-8e94-5fc68eda69fb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w8khj" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.064721 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ee70c47a-d783-4a9f-8e94-5fc68eda69fb-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-w8khj\" (UID: \"ee70c47a-d783-4a9f-8e94-5fc68eda69fb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w8khj" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.064841 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ee70c47a-d783-4a9f-8e94-5fc68eda69fb-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-w8khj\" (UID: \"ee70c47a-d783-4a9f-8e94-5fc68eda69fb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w8khj" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.075438 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2383c50-e82f-4445-8d28-ec90e8d95c47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f31fa57b4eaaccec309a01396ec164dc6bda3e12a1086eda7a2d0eb3233eb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6e7353bb02996d8525befd61dad46a459fca7713e536cec9b1fb9f47d99d08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ceb048af5928a4b50cec5e93738471dd356f20cf57363fe085df0bb4216100f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://848397bbf66b4edb13083c81defed30b9ddaff1a1636e2f1fe09b25c8ff99d96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:08Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.089081 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:08Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.094126 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.094159 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.094168 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:08 crc 
kubenswrapper[5025]: I1007 08:17:08.094183 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.094195 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:08Z","lastTransitionTime":"2025-10-07T08:17:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.103743 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:08Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.116223 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://473d5cdd405367504396c404f60bbdecda93dbe6edbc8ddbfc18bbcc2fdc4d2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f423a0c357e0279ca41d112d6091586ed91d3826f4ae250ec2aa4973554968\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:08Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.133869 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://602fe6537fb9b526977c2d6ae30458382313c2b5f1506e6b7ffe684ffe5be2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d54834b1ba5650aa397affa83e56e871bbbea4b8c383627ae00bea648595864f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aca416ed3fc89df9948a2f7b02101dfb3bae8d8b22b7c4d270559a4cb6895dc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aca8d824b0935ba844137095ac849a38a15dc5c0dc808fae28afbff543f6429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e826406584133efe2a695c4c7ff16dc6919a1ba0cda077273f6e1eafbf0f5c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81fc1872da168050a2483e46a2ce55f2f745eca15c901a77783a830609f6c0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15f7224ba03bfadb0d7b4d4023f427da920f47b6d5f8d0d69db5b96baaedb0c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15f7224ba03bfadb0d7b4d4023f427da920f47b6d5f8d0d69db5b96baaedb0c5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T08:17:05Z\\\",\\\"message\\\":\\\"== {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 08:17:05.353289 6518 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service 
k8s.ovn.org/owner:openshift-kube-controller-manager/kube-controller-manager]} name:Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.36:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ba175bbe-5cc4-47e6-a32d-57693e1320bd}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 08:17:05.353308 6518 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1007 08:17:05.351793 6518 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:17:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dwm22_openshift-ovn-kubernetes(8b6b9c75-ecfe-4815-b279-bb56f57a82a8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26e633e0a41a5f44c577b8e1a9494d58bb84b49015a7612684fc81e5872fd20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9
e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dwm22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:08Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.166390 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ee70c47a-d783-4a9f-8e94-5fc68eda69fb-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-w8khj\" (UID: \"ee70c47a-d783-4a9f-8e94-5fc68eda69fb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w8khj" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.166771 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ee70c47a-d783-4a9f-8e94-5fc68eda69fb-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-w8khj\" (UID: \"ee70c47a-d783-4a9f-8e94-5fc68eda69fb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w8khj" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.167853 5025 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ee70c47a-d783-4a9f-8e94-5fc68eda69fb-env-overrides\") pod \"ovnkube-control-plane-749d76644c-w8khj\" (UID: \"ee70c47a-d783-4a9f-8e94-5fc68eda69fb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w8khj" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.167956 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckns7\" (UniqueName: \"kubernetes.io/projected/ee70c47a-d783-4a9f-8e94-5fc68eda69fb-kube-api-access-ckns7\") pod \"ovnkube-control-plane-749d76644c-w8khj\" (UID: \"ee70c47a-d783-4a9f-8e94-5fc68eda69fb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w8khj" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.167182 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ee70c47a-d783-4a9f-8e94-5fc68eda69fb-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-w8khj\" (UID: \"ee70c47a-d783-4a9f-8e94-5fc68eda69fb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w8khj" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.168184 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ee70c47a-d783-4a9f-8e94-5fc68eda69fb-env-overrides\") pod \"ovnkube-control-plane-749d76644c-w8khj\" (UID: \"ee70c47a-d783-4a9f-8e94-5fc68eda69fb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w8khj" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.172885 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ee70c47a-d783-4a9f-8e94-5fc68eda69fb-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-w8khj\" (UID: \"ee70c47a-d783-4a9f-8e94-5fc68eda69fb\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w8khj" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.187866 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckns7\" (UniqueName: \"kubernetes.io/projected/ee70c47a-d783-4a9f-8e94-5fc68eda69fb-kube-api-access-ckns7\") pod \"ovnkube-control-plane-749d76644c-w8khj\" (UID: \"ee70c47a-d783-4a9f-8e94-5fc68eda69fb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w8khj" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.196505 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.196557 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.196566 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.196579 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.196588 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:08Z","lastTransitionTime":"2025-10-07T08:17:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.236024 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w8khj" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.273739 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w8khj" event={"ID":"ee70c47a-d783-4a9f-8e94-5fc68eda69fb","Type":"ContainerStarted","Data":"e4d3e0d64f330695031d112ee9fd94bffb33c5b57babd2b24a3eb5d53a7baf82"} Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.298983 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.299312 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.299454 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.299659 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.299778 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:08Z","lastTransitionTime":"2025-10-07T08:17:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.402946 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.403079 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.403107 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.403145 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.403170 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:08Z","lastTransitionTime":"2025-10-07T08:17:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.507375 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.507438 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.507458 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.507484 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.507504 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:08Z","lastTransitionTime":"2025-10-07T08:17:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.610475 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.610543 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.610588 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.610606 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.610617 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:08Z","lastTransitionTime":"2025-10-07T08:17:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.713788 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.713862 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.713881 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.713917 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.713936 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:08Z","lastTransitionTime":"2025-10-07T08:17:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.816397 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.816478 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.816498 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.816532 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.816579 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:08Z","lastTransitionTime":"2025-10-07T08:17:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.920107 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.920155 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.920169 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.920185 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:08 crc kubenswrapper[5025]: I1007 08:17:08.920201 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:08Z","lastTransitionTime":"2025-10-07T08:17:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.002767 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-f4ls7"] Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.003205 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4ls7" Oct 07 08:17:09 crc kubenswrapper[5025]: E1007 08:17:09.003280 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f4ls7" podUID="93fdeab4-b5d2-42d8-97ca-d5d61032e19f" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.020091 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w8khj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee70c47a-d783-4a9f-8e94-5fc68eda69fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ckns7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ckns7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:17:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-w8khj\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:09Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.023090 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.023145 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.023164 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.023188 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.023218 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:09Z","lastTransitionTime":"2025-10-07T08:17:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.042246 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e04a2eadbde2c6c8df1913f15a1efbebef80b3e34c256f7ff9249479f19eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:09Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.060936 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61571dd78435a0514b8674028c7c467678e6ee4d1c6b334a98c6d1ddbf4871d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\
\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:09Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.074605 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hc88w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2744919e-82fb-4c3c-8776-c2c9c44af6e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afa525d43c3e5aafdbdf20fdac7eb16579649323ad50ae564f0266fb2e69ffd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swvcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hc88w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:09Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.089715 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4849c41-22e1-400e-8e11-096da49ef1b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc23369548789187b7057cd9780703dbc892dcf8aaaac969f6049a15ab0656e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54b2a654066f14b0433198403ba42b83c61922de
1055c94ba1d38aad9e0c2821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2dj2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:09Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.105662 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jf557" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea42823f-6b19-439f-a280-80e5b1a816c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e6ec573b8a9fb1fd6ab7a370488b1b68024a033b0cfc728cf7e8288aacb4c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jf557\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:09Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.120705 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-f4ls7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93fdeab4-b5d2-42d8-97ca-d5d61032e19f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:17:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-f4ls7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:09Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:09 crc 
kubenswrapper[5025]: I1007 08:17:09.126331 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.126394 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.126410 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.126435 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.126460 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:09Z","lastTransitionTime":"2025-10-07T08:17:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.143516 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2383c50-e82f-4445-8d28-ec90e8d95c47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f31fa57b4eaaccec309a01396ec164dc6bda3e12a1086eda7a2d0eb3233eb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6e7353bb0
2996d8525befd61dad46a459fca7713e536cec9b1fb9f47d99d08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ceb048af5928a4b50cec5e93738471dd356f20cf57363fe085df0bb4216100f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://848397bbf66b4edb13083c81defed30b9ddaff1a1636e2f1fe09b25c8ff99d96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:09Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.163337 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:09Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.178533 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfw77\" (UniqueName: \"kubernetes.io/projected/93fdeab4-b5d2-42d8-97ca-d5d61032e19f-kube-api-access-pfw77\") pod \"network-metrics-daemon-f4ls7\" (UID: 
\"93fdeab4-b5d2-42d8-97ca-d5d61032e19f\") " pod="openshift-multus/network-metrics-daemon-f4ls7" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.178816 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93fdeab4-b5d2-42d8-97ca-d5d61032e19f-metrics-certs\") pod \"network-metrics-daemon-f4ls7\" (UID: \"93fdeab4-b5d2-42d8-97ca-d5d61032e19f\") " pod="openshift-multus/network-metrics-daemon-f4ls7" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.180746 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:09Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.195941 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmhw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34b07a69-1bbf-4019-b824-7b5be0f9404d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aee164f46abc9772ed12e1ba5e768ee44e25cc7444a58000075819f45cb4ba11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx7qx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:09Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.216906 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://473d5cdd405367504396c404f60bbdecda93dbe6edbc8ddbfc18bbcc2fdc4d2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f423a0c357e0279ca41d112d6091586ed91d3826f4ae250ec2aa4973554968\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:09Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.229865 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.229919 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.229935 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.229953 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.229965 5025 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:09Z","lastTransitionTime":"2025-10-07T08:17:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.240508 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://602fe6537fb9b526977c2d6ae30458382313c2b5f1506e6b7ffe684ffe5be2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d54834b1ba5650aa397affa83e56e871bbbea4b8c383627ae00bea648595864f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aca416ed3fc89df9948a2f7b02101dfb3bae8d8b22b7c4d270559a4cb6895dc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aca8d824b0935ba844137095ac849a38a15dc5c0dc808fae28afbff543f6429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e826406584133efe2a695c4c7ff16dc6919a1ba0cda077273f6e1eafbf0f5c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81fc1872da168050a2483e46a2ce55f2f745eca15c901a77783a830609f6c0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15f7224ba03bfadb0d7b4d4023f427da920f47b6d5f8d0d69db5b96baaedb0c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15f7224ba03bfadb0d7b4d4023f427da920f47b6d5f8d0d69db5b96baaedb0c5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T08:17:05Z\\\",\\\"message\\\":\\\"== {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 08:17:05.353289 6518 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service 
k8s.ovn.org/owner:openshift-kube-controller-manager/kube-controller-manager]} name:Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.36:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ba175bbe-5cc4-47e6-a32d-57693e1320bd}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 08:17:05.353308 6518 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1007 08:17:05.351793 6518 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:17:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dwm22_openshift-ovn-kubernetes(8b6b9c75-ecfe-4815-b279-bb56f57a82a8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26e633e0a41a5f44c577b8e1a9494d58bb84b49015a7612684fc81e5872fd20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9
e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dwm22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:09Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.257623 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90a24131-77fe-4045-9cea-eebd7e122243\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efea2132bc68971348829ebb4222793ae3ccf57d5eeee316aa81292cbb051594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1c01fb63c054d4e9e2b12ab9d335d5c6a79c3c62c96bcf3bca6a5f9a4b750\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ce5e9618132944289f6a22c7aeaee88704f6352fe3461c55de21681f406662\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://167a11cb83ad658e06fbfebaa8e1675cec48026b094037f552cb4b86f3fe364e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b579d694e2e51fb1d49b8d690d572518d7fcd3476d7e6941323b70987b8edfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 08:16:47.574279 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 08:16:47.577103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2852886688/tls.crt::/tmp/serving-cert-2852886688/tls.key\\\\\\\"\\\\nI1007 08:16:53.183291 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 08:16:53.189611 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 08:16:53.189643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 08:16:53.189670 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 08:16:53.189676 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 08:16:53.204072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 08:16:53.204130 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204143 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204157 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 08:16:53.204166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 08:16:53.204173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 08:16:53.204182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 08:16:53.204715 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 08:16:53.207373 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52afc5d765ed4544c55e6e522b190cb1c6788fc270ef55c8242f42b63c4d2702\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:09Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.275370 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:09Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.280239 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfw77\" (UniqueName: \"kubernetes.io/projected/93fdeab4-b5d2-42d8-97ca-d5d61032e19f-kube-api-access-pfw77\") pod \"network-metrics-daemon-f4ls7\" (UID: \"93fdeab4-b5d2-42d8-97ca-d5d61032e19f\") " pod="openshift-multus/network-metrics-daemon-f4ls7" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.280499 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/93fdeab4-b5d2-42d8-97ca-d5d61032e19f-metrics-certs\") pod \"network-metrics-daemon-f4ls7\" (UID: \"93fdeab4-b5d2-42d8-97ca-d5d61032e19f\") " pod="openshift-multus/network-metrics-daemon-f4ls7" Oct 07 08:17:09 crc kubenswrapper[5025]: E1007 08:17:09.280870 5025 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 08:17:09 crc kubenswrapper[5025]: E1007 08:17:09.280996 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93fdeab4-b5d2-42d8-97ca-d5d61032e19f-metrics-certs podName:93fdeab4-b5d2-42d8-97ca-d5d61032e19f nodeName:}" failed. No retries permitted until 2025-10-07 08:17:09.780944481 +0000 UTC m=+36.590258665 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/93fdeab4-b5d2-42d8-97ca-d5d61032e19f-metrics-certs") pod "network-metrics-daemon-f4ls7" (UID: "93fdeab4-b5d2-42d8-97ca-d5d61032e19f") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.284753 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w8khj" event={"ID":"ee70c47a-d783-4a9f-8e94-5fc68eda69fb","Type":"ContainerStarted","Data":"ff2f25f9baf4b018c73a5655ca3e595d123cd0ffaae9060ea0a34148711b1be0"} Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.284827 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w8khj" event={"ID":"ee70c47a-d783-4a9f-8e94-5fc68eda69fb","Type":"ContainerStarted","Data":"e1540ed9540ce18c4ad3c5c7d82a3916528d100cc9d7ec1ec5758511f03e6f10"} Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.302762 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l2k8t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab742e4ca12e3a7400e7592d95be4f82046c78d1056d6e1f06d261704286f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f66d
58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f66d58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fff0f5b081b5a61c30757d6065a06224f7edcd0adfdab4ca6abfc6ea207a660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fff0f5b081b5a61c30757d6065a06224f7edcd0adfdab4ca6abfc6ea207a660\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:17:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:17:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l2k8t\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:09Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.314598 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfw77\" (UniqueName: \"kubernetes.io/projected/93fdeab4-b5d2-42d8-97ca-d5d61032e19f-kube-api-access-pfw77\") pod \"network-metrics-daemon-f4ls7\" (UID: \"93fdeab4-b5d2-42d8-97ca-d5d61032e19f\") " pod="openshift-multus/network-metrics-daemon-f4ls7" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.324405 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2383c50-e82f-4445-8d28-ec90e8d95c47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f31fa57b4eaaccec309a01396ec164dc6bda3e12a1086eda7a2d0eb3233eb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491c
d10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6e7353bb02996d8525befd61dad46a459fca7713e536cec9b1fb9f47d99d08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ceb048af5928a4b50cec5e93738471dd356f20cf57363fe085df0bb4216100f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://848397bbf66b4edb13083c81defed30b9ddaff1a1636e2f1fe09b25c8ff99d96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:09Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.337680 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 
08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.337734 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.337749 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.337774 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.337791 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:09Z","lastTransitionTime":"2025-10-07T08:17:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.348793 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:09Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.366533 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:09Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.388810 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmhw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34b07a69-1bbf-4019-b824-7b5be0f9404d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aee164f46abc9772ed12e1ba5e768ee44e25cc7444a58000075819f45cb4ba11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx7qx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:09Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.410823 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://473d5cdd405367504396c404f60bbdecda93dbe6edbc8ddbfc18bbcc2fdc4d2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f423a0c357e0279ca41d112d6091586ed91d3826f4ae250ec2aa4973554968\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:09Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.441792 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.441860 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.441878 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.441906 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.441925 5025 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:09Z","lastTransitionTime":"2025-10-07T08:17:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.446253 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://602fe6537fb9b526977c2d6ae30458382313c2b5f1506e6b7ffe684ffe5be2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d54834b1ba5650aa397affa83e56e871bbbea4b8c383627ae00bea648595864f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aca416ed3fc89df9948a2f7b02101dfb3bae8d8b22b7c4d270559a4cb6895dc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aca8d824b0935ba844137095ac849a38a15dc5c0dc808fae28afbff543f6429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e826406584133efe2a695c4c7ff16dc6919a1ba0cda077273f6e1eafbf0f5c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81fc1872da168050a2483e46a2ce55f2f745eca15c901a77783a830609f6c0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15f7224ba03bfadb0d7b4d4023f427da920f47b6d5f8d0d69db5b96baaedb0c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15f7224ba03bfadb0d7b4d4023f427da920f47b6d5f8d0d69db5b96baaedb0c5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T08:17:05Z\\\",\\\"message\\\":\\\"== {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 08:17:05.353289 6518 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service 
k8s.ovn.org/owner:openshift-kube-controller-manager/kube-controller-manager]} name:Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.36:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ba175bbe-5cc4-47e6-a32d-57693e1320bd}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 08:17:05.353308 6518 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1007 08:17:05.351793 6518 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:17:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dwm22_openshift-ovn-kubernetes(8b6b9c75-ecfe-4815-b279-bb56f57a82a8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26e633e0a41a5f44c577b8e1a9494d58bb84b49015a7612684fc81e5872fd20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9
e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dwm22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:09Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.469872 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90a24131-77fe-4045-9cea-eebd7e122243\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efea2132bc68971348829ebb4222793ae3ccf57d5eeee316aa81292cbb051594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1c01fb63c054d4e9e2b12ab9d335d5c6a79c3c62c96bcf3bca6a5f9a4b750\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ce5e9618132944289f6a22c7aeaee88704f6352fe3461c55de21681f406662\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://167a11cb83ad658e06fbfebaa8e1675cec48026b094037f552cb4b86f3fe364e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b579d694e2e51fb1d49b8d690d572518d7fcd3476d7e6941323b70987b8edfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 08:16:47.574279 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 08:16:47.577103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2852886688/tls.crt::/tmp/serving-cert-2852886688/tls.key\\\\\\\"\\\\nI1007 08:16:53.183291 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 08:16:53.189611 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 08:16:53.189643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 08:16:53.189670 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 08:16:53.189676 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 08:16:53.204072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 08:16:53.204130 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204143 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204157 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 08:16:53.204166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 08:16:53.204173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 08:16:53.204182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 08:16:53.204715 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 08:16:53.207373 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52afc5d765ed4544c55e6e522b190cb1c6788fc270ef55c8242f42b63c4d2702\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:09Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.486404 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:09Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.505493 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l2k8t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab742e4ca12e3a7400e7592d95be4f82046c78d1056d6e1f06d261704286f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f66d
58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f66d58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fff0f5b081b5a61c30757d6065a06224f7edcd0adfdab4ca6abfc6ea207a660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fff0f5b081b5a61c30757d6065a06224f7edcd0adfdab4ca6abfc6ea207a660\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:17:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:17:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l2k8t\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:09Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.525323 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w8khj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee70c47a-d783-4a9f-8e94-5fc68eda69fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1540ed9540ce18c4ad3c5c7d82a3916528d100cc9d7ec1ec5758511f03e6f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ckns7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2f25f9baf4b018c73a5655ca3e595d123cd0ffaae9060ea0a34148711b1be0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ckns7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:17:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-w8khj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-07T08:17:09Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.545758 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.545824 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.545841 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.545873 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.545893 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:09Z","lastTransitionTime":"2025-10-07T08:17:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.552256 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e04a2eadbde2c6c8df1913f15a1efbebef80b3e34c256f7ff9249479f19eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:09Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.572175 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61571dd78435a0514b8674028c7c467678e6ee4d1c6b334a98c6d1ddbf4871d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\
\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:09Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.583815 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.584021 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.584059 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:17:09 crc 
kubenswrapper[5025]: E1007 08:17:09.584180 5025 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 08:17:09 crc kubenswrapper[5025]: E1007 08:17:09.584249 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 08:17:25.584229841 +0000 UTC m=+52.393543985 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 08:17:09 crc kubenswrapper[5025]: E1007 08:17:09.584695 5025 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 08:17:09 crc kubenswrapper[5025]: E1007 08:17:09.584756 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 08:17:25.584748399 +0000 UTC m=+52.394062533 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 08:17:09 crc kubenswrapper[5025]: E1007 08:17:09.584909 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:17:25.584850682 +0000 UTC m=+52.394164866 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.599439 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hc88w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2744919e-82fb-4c3c-8776-c2c9c44af6e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afa525d43c3e5aafdbdf20fdac7eb16579649323ad50ae564f0266fb2e69ffd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swvcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hc88w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:09Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.619187 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4849c41-22e1-400e-8e11-096da49ef1b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc23369548789187b7057cd9780703dbc892dcf8aaaac969f6049a15ab0656e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54b2a654066f14b0433198403ba42b83c61922de1055c94ba1d38aad9e0c2821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2dj2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:09Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.637002 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jf557" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea42823f-6b19-439f-a280-80e5b1a816c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e6ec573b8a9fb1fd6ab7a370488b1b68024a033b0cfc728cf7e8288aacb4c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/e
tc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jf557\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:09Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.649588 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.649676 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.649699 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.649760 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.649784 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:09Z","lastTransitionTime":"2025-10-07T08:17:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.654457 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-f4ls7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93fdeab4-b5d2-42d8-97ca-d5d61032e19f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:17:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-f4ls7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:09Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:09 crc 
kubenswrapper[5025]: I1007 08:17:09.685242 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.685301 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:17:09 crc kubenswrapper[5025]: E1007 08:17:09.685543 5025 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 08:17:09 crc kubenswrapper[5025]: E1007 08:17:09.685598 5025 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 08:17:09 crc kubenswrapper[5025]: E1007 08:17:09.685592 5025 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 08:17:09 crc kubenswrapper[5025]: E1007 08:17:09.685675 5025 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 08:17:09 crc kubenswrapper[5025]: E1007 08:17:09.685696 5025 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 08:17:09 crc kubenswrapper[5025]: E1007 08:17:09.685624 5025 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 08:17:09 crc kubenswrapper[5025]: E1007 08:17:09.685782 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 08:17:25.685753256 +0000 UTC m=+52.495067599 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 08:17:09 crc kubenswrapper[5025]: E1007 08:17:09.685928 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 08:17:25.68587817 +0000 UTC m=+52.495192344 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.752971 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.753045 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.753066 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.753097 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.753120 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:09Z","lastTransitionTime":"2025-10-07T08:17:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.787005 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93fdeab4-b5d2-42d8-97ca-d5d61032e19f-metrics-certs\") pod \"network-metrics-daemon-f4ls7\" (UID: \"93fdeab4-b5d2-42d8-97ca-d5d61032e19f\") " pod="openshift-multus/network-metrics-daemon-f4ls7" Oct 07 08:17:09 crc kubenswrapper[5025]: E1007 08:17:09.787331 5025 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 08:17:09 crc kubenswrapper[5025]: E1007 08:17:09.787492 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93fdeab4-b5d2-42d8-97ca-d5d61032e19f-metrics-certs podName:93fdeab4-b5d2-42d8-97ca-d5d61032e19f nodeName:}" failed. No retries permitted until 2025-10-07 08:17:10.787454507 +0000 UTC m=+37.596768851 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/93fdeab4-b5d2-42d8-97ca-d5d61032e19f-metrics-certs") pod "network-metrics-daemon-f4ls7" (UID: "93fdeab4-b5d2-42d8-97ca-d5d61032e19f") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.856354 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.856483 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.856504 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.856533 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.856613 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:09Z","lastTransitionTime":"2025-10-07T08:17:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.906431 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.906483 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.906497 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.906517 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.906530 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:09Z","lastTransitionTime":"2025-10-07T08:17:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.913917 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.914014 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:17:09 crc kubenswrapper[5025]: E1007 08:17:09.914061 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.914187 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 08:17:09 crc kubenswrapper[5025]: E1007 08:17:09.914227 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 08:17:09 crc kubenswrapper[5025]: E1007 08:17:09.914502 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 08:17:09 crc kubenswrapper[5025]: E1007 08:17:09.926572 5025 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"15feb688-c1d9-4e7b-b633-9a128b7afc98\\\",\\\"systemUUID\\\":\\\"18315730-39a6-4b53-82b9-587e1e3a7adc\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:09Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.932501 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.932594 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.932613 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.932670 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.932690 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:09Z","lastTransitionTime":"2025-10-07T08:17:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:09 crc kubenswrapper[5025]: E1007 08:17:09.950110 5025 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"15feb688-c1d9-4e7b-b633-9a128b7afc98\\\",\\\"systemUUID\\\":\\\"18315730-39a6-4b53-82b9-587e1e3a7adc\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:09Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.963000 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.963073 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.963091 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.963118 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.963141 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:09Z","lastTransitionTime":"2025-10-07T08:17:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:09 crc kubenswrapper[5025]: E1007 08:17:09.987063 5025 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"15feb688-c1d9-4e7b-b633-9a128b7afc98\\\",\\\"systemUUID\\\":\\\"18315730-39a6-4b53-82b9-587e1e3a7adc\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:09Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.991929 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.991961 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.991973 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.991989 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:09 crc kubenswrapper[5025]: I1007 08:17:09.992002 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:09Z","lastTransitionTime":"2025-10-07T08:17:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:10 crc kubenswrapper[5025]: E1007 08:17:10.009669 5025 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"15feb688-c1d9-4e7b-b633-9a128b7afc98\\\",\\\"systemUUID\\\":\\\"18315730-39a6-4b53-82b9-587e1e3a7adc\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:10Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.013972 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.014169 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.014296 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.014454 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.014629 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:10Z","lastTransitionTime":"2025-10-07T08:17:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:10 crc kubenswrapper[5025]: E1007 08:17:10.029091 5025 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"15feb688-c1d9-4e7b-b633-9a128b7afc98\\\",\\\"systemUUID\\\":\\\"18315730-39a6-4b53-82b9-587e1e3a7adc\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:10Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:10 crc kubenswrapper[5025]: E1007 08:17:10.029243 5025 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.031648 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.031717 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.031730 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.031756 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.031776 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:10Z","lastTransitionTime":"2025-10-07T08:17:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.136984 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.137076 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.137099 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.137129 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.137149 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:10Z","lastTransitionTime":"2025-10-07T08:17:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.241605 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.241683 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.241710 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.241738 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.241756 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:10Z","lastTransitionTime":"2025-10-07T08:17:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.346282 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.346336 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.346349 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.346371 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.346383 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:10Z","lastTransitionTime":"2025-10-07T08:17:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.450896 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.451902 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.452063 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.452265 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.452453 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:10Z","lastTransitionTime":"2025-10-07T08:17:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.555793 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.555850 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.555864 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.555888 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.555907 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:10Z","lastTransitionTime":"2025-10-07T08:17:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.659948 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.660020 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.660037 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.660067 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.660085 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:10Z","lastTransitionTime":"2025-10-07T08:17:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.764800 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.764877 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.764901 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.764932 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.764951 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:10Z","lastTransitionTime":"2025-10-07T08:17:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.804046 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93fdeab4-b5d2-42d8-97ca-d5d61032e19f-metrics-certs\") pod \"network-metrics-daemon-f4ls7\" (UID: \"93fdeab4-b5d2-42d8-97ca-d5d61032e19f\") " pod="openshift-multus/network-metrics-daemon-f4ls7" Oct 07 08:17:10 crc kubenswrapper[5025]: E1007 08:17:10.804350 5025 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 08:17:10 crc kubenswrapper[5025]: E1007 08:17:10.804459 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93fdeab4-b5d2-42d8-97ca-d5d61032e19f-metrics-certs podName:93fdeab4-b5d2-42d8-97ca-d5d61032e19f nodeName:}" failed. No retries permitted until 2025-10-07 08:17:12.804427391 +0000 UTC m=+39.613741565 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/93fdeab4-b5d2-42d8-97ca-d5d61032e19f-metrics-certs") pod "network-metrics-daemon-f4ls7" (UID: "93fdeab4-b5d2-42d8-97ca-d5d61032e19f") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.868663 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.868737 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.868753 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.868789 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.868807 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:10Z","lastTransitionTime":"2025-10-07T08:17:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.914283 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4ls7" Oct 07 08:17:10 crc kubenswrapper[5025]: E1007 08:17:10.914625 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f4ls7" podUID="93fdeab4-b5d2-42d8-97ca-d5d61032e19f" Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.972707 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.973084 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.973303 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.973532 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:10 crc kubenswrapper[5025]: I1007 08:17:10.973836 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:10Z","lastTransitionTime":"2025-10-07T08:17:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:11 crc kubenswrapper[5025]: I1007 08:17:11.077500 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:11 crc kubenswrapper[5025]: I1007 08:17:11.077611 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:11 crc kubenswrapper[5025]: I1007 08:17:11.077636 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:11 crc kubenswrapper[5025]: I1007 08:17:11.077667 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:11 crc kubenswrapper[5025]: I1007 08:17:11.077690 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:11Z","lastTransitionTime":"2025-10-07T08:17:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:11 crc kubenswrapper[5025]: I1007 08:17:11.180240 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:11 crc kubenswrapper[5025]: I1007 08:17:11.180301 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:11 crc kubenswrapper[5025]: I1007 08:17:11.180318 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:11 crc kubenswrapper[5025]: I1007 08:17:11.180337 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:11 crc kubenswrapper[5025]: I1007 08:17:11.180352 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:11Z","lastTransitionTime":"2025-10-07T08:17:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:11 crc kubenswrapper[5025]: I1007 08:17:11.284853 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:11 crc kubenswrapper[5025]: I1007 08:17:11.284970 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:11 crc kubenswrapper[5025]: I1007 08:17:11.284996 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:11 crc kubenswrapper[5025]: I1007 08:17:11.285038 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:11 crc kubenswrapper[5025]: I1007 08:17:11.285062 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:11Z","lastTransitionTime":"2025-10-07T08:17:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:11 crc kubenswrapper[5025]: I1007 08:17:11.388280 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:11 crc kubenswrapper[5025]: I1007 08:17:11.388347 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:11 crc kubenswrapper[5025]: I1007 08:17:11.388371 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:11 crc kubenswrapper[5025]: I1007 08:17:11.388396 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:11 crc kubenswrapper[5025]: I1007 08:17:11.388414 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:11Z","lastTransitionTime":"2025-10-07T08:17:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:11 crc kubenswrapper[5025]: I1007 08:17:11.491460 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:11 crc kubenswrapper[5025]: I1007 08:17:11.491496 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:11 crc kubenswrapper[5025]: I1007 08:17:11.491506 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:11 crc kubenswrapper[5025]: I1007 08:17:11.491519 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:11 crc kubenswrapper[5025]: I1007 08:17:11.491528 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:11Z","lastTransitionTime":"2025-10-07T08:17:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:11 crc kubenswrapper[5025]: I1007 08:17:11.594042 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:11 crc kubenswrapper[5025]: I1007 08:17:11.594153 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:11 crc kubenswrapper[5025]: I1007 08:17:11.594180 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:11 crc kubenswrapper[5025]: I1007 08:17:11.594212 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:11 crc kubenswrapper[5025]: I1007 08:17:11.594233 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:11Z","lastTransitionTime":"2025-10-07T08:17:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:11 crc kubenswrapper[5025]: I1007 08:17:11.703236 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:11 crc kubenswrapper[5025]: I1007 08:17:11.703326 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:11 crc kubenswrapper[5025]: I1007 08:17:11.703349 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:11 crc kubenswrapper[5025]: I1007 08:17:11.703379 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:11 crc kubenswrapper[5025]: I1007 08:17:11.703409 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:11Z","lastTransitionTime":"2025-10-07T08:17:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:11 crc kubenswrapper[5025]: I1007 08:17:11.807748 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:11 crc kubenswrapper[5025]: I1007 08:17:11.807815 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:11 crc kubenswrapper[5025]: I1007 08:17:11.807841 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:11 crc kubenswrapper[5025]: I1007 08:17:11.807874 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:11 crc kubenswrapper[5025]: I1007 08:17:11.807897 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:11Z","lastTransitionTime":"2025-10-07T08:17:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:11 crc kubenswrapper[5025]: I1007 08:17:11.914400 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:11 crc kubenswrapper[5025]: I1007 08:17:11.914646 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:11 crc kubenswrapper[5025]: I1007 08:17:11.914677 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:11 crc kubenswrapper[5025]: I1007 08:17:11.914714 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:11 crc kubenswrapper[5025]: I1007 08:17:11.914736 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:11Z","lastTransitionTime":"2025-10-07T08:17:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:11 crc kubenswrapper[5025]: I1007 08:17:11.915538 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 08:17:11 crc kubenswrapper[5025]: E1007 08:17:11.915738 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 08:17:11 crc kubenswrapper[5025]: I1007 08:17:11.916269 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:17:11 crc kubenswrapper[5025]: E1007 08:17:11.916398 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 08:17:11 crc kubenswrapper[5025]: I1007 08:17:11.916742 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:17:11 crc kubenswrapper[5025]: E1007 08:17:11.916911 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.018260 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.018310 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.018326 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.018352 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.018370 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:12Z","lastTransitionTime":"2025-10-07T08:17:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.121270 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.121326 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.121341 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.121358 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.121371 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:12Z","lastTransitionTime":"2025-10-07T08:17:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.225339 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.225425 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.225452 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.225496 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.225521 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:12Z","lastTransitionTime":"2025-10-07T08:17:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.292504 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.318435 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l2k8t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab742e4ca12e3a7400e7592d95be4f82046c78d1056d6e1f06d261704286f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:01
Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2da
ed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"m
ountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f66d58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f66d58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fff0f5b081b5a61c30757d6065a06224f7edcd0adfdab4ca6abfc6ea207a660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fff0f5b081b5a61c30757d6065a06224f7edcd0adfdab4ca6abfc6ea207a660\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:17:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:17:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l2k8t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:12Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.329292 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.329346 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.329359 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.329378 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.329393 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:12Z","lastTransitionTime":"2025-10-07T08:17:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.344964 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90a24131-77fe-4045-9cea-eebd7e122243\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efea2132bc68971348829ebb4222793ae3ccf57d5eeee316aa81292cbb051594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1c01fb63c054d4e9e2b12ab9d335d5c6a79c3c62c96bcf3bca6a5f9a4b750\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ce5e9618132944289f6a22c7aeaee88704f6352fe3461c55de21681f406662\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://167a11cb83ad658e06fbfebaa8e1675cec48026b094037f552cb4b86f3fe364e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b579d694e2e51fb1d49b8d690d572518d7fcd3476d7e6941323b70987b8edfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 08:16:47.574279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 08:16:47.577103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2852886688/tls.crt::/tmp/serving-cert-2852886688/tls.key\\\\\\\"\\\\nI1007 08:16:53.183291 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 08:16:53.189611 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 08:16:53.189643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 08:16:53.189670 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 08:16:53.189676 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 08:16:53.204072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 08:16:53.204130 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204143 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204157 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 08:16:53.204166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 08:16:53.204173 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 08:16:53.204182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 08:16:53.204715 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 08:16:53.207373 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52afc5d765ed4544c55e6e522b190cb1c6788fc270ef55c8242f42b63c4d2702\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:12Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.369422 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:12Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.394891 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e04a2eadbde2c6c8df1913f15a1efbebef80b3e34c256f7ff9249479f19eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:12Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.416873 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61571dd78435a0514b8674028c7c467678e6ee4d1c6b334a98c6d1ddbf4871d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:12Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.433036 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.433094 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.433114 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.433145 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.433167 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:12Z","lastTransitionTime":"2025-10-07T08:17:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.435402 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hc88w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2744919e-82fb-4c3c-8776-c2c9c44af6e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afa525d43c3e5aafdbdf20fdac7eb16579649323ad50ae564f0266fb2e69ffd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swvcf\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hc88w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:12Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.457591 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4849c41-22e1-400e-8e11-096da49ef1b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc23369548789187b7057cd9780703dbc892dcf8aaaac969f6049a15ab0656e\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54b2a654066f14b0433198403ba42b83c61922de1055c94ba1d38aad9e0c2821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-2dj2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:12Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.477164 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jf557" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea42823f-6b19-439f-a280-80e5b1a816c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e6ec573b8a9fb1fd6ab7a370488b1b68024a033b0cfc728cf7e8288aacb4c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jf557\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:12Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.498099 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w8khj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee70c47a-d783-4a9f-8e94-5fc68eda69fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1540ed9540ce18c4ad3c5c7d82a3916528d100cc9d7ec1ec5758511f03e6f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ckns7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2f25f9baf4b018c73a5655ca3e595d123cd
0ffaae9060ea0a34148711b1be0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ckns7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:17:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-w8khj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:12Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.518070 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-f4ls7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93fdeab4-b5d2-42d8-97ca-d5d61032e19f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:17:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-f4ls7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:12Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:12 crc 
kubenswrapper[5025]: I1007 08:17:12.536424 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.536481 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.536500 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.536525 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.536582 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:12Z","lastTransitionTime":"2025-10-07T08:17:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.543614 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:12Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.569107 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmhw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34b07a69-1bbf-4019-b824-7b5be0f9404d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aee164f46abc9772ed12e1ba5e768ee44e25cc7444a58000075819f45cb4ba11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx7qx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:12Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.592027 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2383c50-e82f-4445-8d28-ec90e8d95c47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f31fa57b4eaaccec309a01396ec164dc6bda3e12a1086eda7a2d0eb3233eb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6e7353bb02996d8525befd61dad46a459fca7713e536cec9b1fb9f47d99d08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ceb048af5928a4b50cec5e93738471dd356f20cf57363fe085df0bb4216100f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://848397bbf66b4edb13083c81defed30b9ddaff1a1636e2f1fe09b25c8ff99d96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:12Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.613336 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:12Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.639166 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.639227 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.639244 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:12 crc 
kubenswrapper[5025]: I1007 08:17:12.639271 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.639290 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:12Z","lastTransitionTime":"2025-10-07T08:17:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.647897 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://602fe6537fb9b526977c2d6ae30458382313c2b5f1506e6b7ffe684ffe5be2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d54834b1ba5650aa397affa83e56e871bbbea4b8c383627ae00bea648595864f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aca416ed3fc89df9948a2f7b02101dfb3bae8d8b22b7c4d270559a4cb6895dc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aca8d824b0935ba844137095ac849a38a15dc5c0dc808fae28afbff543f6429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e826406584133efe2a695c4c7ff16dc6919a1ba0cda077273f6e1eafbf0f5c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81fc1872da168050a2483e46a2ce55f2f745eca15c901a77783a830609f6c0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15f7224ba03bfadb0d7b4d4023f427da920f47b6d5f8d0d69db5b96baaedb0c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15f7224ba03bfadb0d7b4d4023f427da920f47b6d5f8d0d69db5b96baaedb0c5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T08:17:05Z\\\",\\\"message\\\":\\\"== {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 08:17:05.353289 6518 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service 
k8s.ovn.org/owner:openshift-kube-controller-manager/kube-controller-manager]} name:Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.36:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ba175bbe-5cc4-47e6-a32d-57693e1320bd}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 08:17:05.353308 6518 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1007 08:17:05.351793 6518 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:17:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dwm22_openshift-ovn-kubernetes(8b6b9c75-ecfe-4815-b279-bb56f57a82a8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26e633e0a41a5f44c577b8e1a9494d58bb84b49015a7612684fc81e5872fd20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9
e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dwm22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:12Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.670560 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://473d5cdd405367504396c404f60bbdecda93dbe6edbc8ddbfc18bbcc2fdc4d2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f423a0c357e0279ca41d112d6091586ed91d3826f4ae250ec2aa4973554968\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:12Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.742463 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.742532 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.742600 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.742635 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.742658 5025 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:12Z","lastTransitionTime":"2025-10-07T08:17:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.827291 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93fdeab4-b5d2-42d8-97ca-d5d61032e19f-metrics-certs\") pod \"network-metrics-daemon-f4ls7\" (UID: \"93fdeab4-b5d2-42d8-97ca-d5d61032e19f\") " pod="openshift-multus/network-metrics-daemon-f4ls7" Oct 07 08:17:12 crc kubenswrapper[5025]: E1007 08:17:12.827480 5025 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 08:17:12 crc kubenswrapper[5025]: E1007 08:17:12.827604 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93fdeab4-b5d2-42d8-97ca-d5d61032e19f-metrics-certs podName:93fdeab4-b5d2-42d8-97ca-d5d61032e19f nodeName:}" failed. No retries permitted until 2025-10-07 08:17:16.827530585 +0000 UTC m=+43.636844769 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/93fdeab4-b5d2-42d8-97ca-d5d61032e19f-metrics-certs") pod "network-metrics-daemon-f4ls7" (UID: "93fdeab4-b5d2-42d8-97ca-d5d61032e19f") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.846325 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.846378 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.846391 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.846408 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.846419 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:12Z","lastTransitionTime":"2025-10-07T08:17:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.914060 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4ls7" Oct 07 08:17:12 crc kubenswrapper[5025]: E1007 08:17:12.914376 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f4ls7" podUID="93fdeab4-b5d2-42d8-97ca-d5d61032e19f" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.949536 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.949629 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.949646 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.949674 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:12 crc kubenswrapper[5025]: I1007 08:17:12.949695 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:12Z","lastTransitionTime":"2025-10-07T08:17:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.053323 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.053383 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.053425 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.053444 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.053458 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:13Z","lastTransitionTime":"2025-10-07T08:17:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.158036 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.158098 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.158110 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.158133 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.158146 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:13Z","lastTransitionTime":"2025-10-07T08:17:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.262403 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.262489 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.262507 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.262542 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.262586 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:13Z","lastTransitionTime":"2025-10-07T08:17:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.366711 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.366788 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.366810 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.366841 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.366860 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:13Z","lastTransitionTime":"2025-10-07T08:17:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.470821 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.470921 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.470967 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.470992 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.471037 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:13Z","lastTransitionTime":"2025-10-07T08:17:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.574773 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.574835 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.574854 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.574880 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.574898 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:13Z","lastTransitionTime":"2025-10-07T08:17:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.677458 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.677496 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.677506 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.677518 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.677528 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:13Z","lastTransitionTime":"2025-10-07T08:17:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.781164 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.781211 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.781223 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.781242 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.781255 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:13Z","lastTransitionTime":"2025-10-07T08:17:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.884406 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.884465 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.884475 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.884488 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.884500 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:13Z","lastTransitionTime":"2025-10-07T08:17:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.914926 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.915086 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:17:13 crc kubenswrapper[5025]: E1007 08:17:13.915211 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.915226 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 08:17:13 crc kubenswrapper[5025]: E1007 08:17:13.915431 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 08:17:13 crc kubenswrapper[5025]: E1007 08:17:13.915530 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.932581 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hc88w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2744919e-82fb-4c3c-8776-c2c9c44af6e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afa525d43c3e5aafdbdf20fdac7eb16579649323ad50ae564f0266fb2e69ffd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secret
s/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swvcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hc88w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:13Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.947987 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4849c41-22e1-400e-8e11-096da49ef1b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc2
3369548789187b7057cd9780703dbc892dcf8aaaac969f6049a15ab0656e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54b2a654066f14b0433198403ba42b83c61922de1055c94ba1d38aad9e0c2821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}
],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2dj2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:13Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.964201 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jf557" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea42823f-6b19-439f-a280-80e5b1a816c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e6ec573b8a9fb1fd6ab7a370488b1b68024a033b0cfc728cf7e8288aacb4c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\
"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jf557\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:13Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.981960 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w8khj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee70c47a-d783-4a9f-8e94-5fc68eda69fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1540ed9540ce18c4ad3c5c7d82a3916528d100cc9d7ec1ec5758511f03e6f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ckns7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2f25f9baf4b018c73a5655ca3e595d123cd
0ffaae9060ea0a34148711b1be0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ckns7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:17:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-w8khj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:13Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.986532 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.986579 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.986589 5025 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.986607 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:13 crc kubenswrapper[5025]: I1007 08:17:13.986618 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:13Z","lastTransitionTime":"2025-10-07T08:17:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.004110 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e04a2eadbde2c6c8df1913f15a1efbebef80b3e34c256f7ff9249479f19eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf8
6d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:14Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.028060 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61571dd78435a0514b8674028c7c467678e6ee4d1c6b334a98c6d1ddbf4871d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T08:17:14Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.043154 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-f4ls7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93fdeab4-b5d2-42d8-97ca-d5d61032e19f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:17:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-f4ls7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:14Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:14 crc 
kubenswrapper[5025]: I1007 08:17:14.060855 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2383c50-e82f-4445-8d28-ec90e8d95c47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f31fa57b4eaaccec309a01396ec164dc6bda3e12a1086eda7a2d0eb3233eb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6e7353bb02996d8525befd61dad46a459fca7713e536cec9b1fb9f47d99d08\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ceb048af5928a4b50cec5e93738471dd356f20cf57363fe085df0bb4216100f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://848397bbf66b4edb13083c81defed30b9ddaff1a1636e2f1fe09b25c8ff99d96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:14Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.075595 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:14Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.091060 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.091135 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.091154 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.091184 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.091202 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:14Z","lastTransitionTime":"2025-10-07T08:17:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.093927 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:14Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.111591 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmhw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34b07a69-1bbf-4019-b824-7b5be0f9404d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aee164f46abc9772ed12e1ba5e768ee44e25cc7444a58000075819f45cb4ba11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx7qx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:14Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.129537 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://473d5cdd405367504396c404f60bbdecda93dbe6edbc8ddbfc18bbcc2fdc4d2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f423a0c357e0279ca41d112d6091586ed91d3826f4ae250ec2aa4973554968\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:14Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.148871 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://602fe6537fb9b526977c2d6ae30458382313c2b5f1506e6b7ffe684ffe5be2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d54834b1ba5650aa397affa83e56e871bbbea4b8c383627ae00bea648595864f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aca416ed3fc89df9948a2f7b02101dfb3bae8d8b22b7c4d270559a4cb6895dc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aca8d824b0935ba844137095ac849a38a15dc5c0dc808fae28afbff543f6429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e826406584133efe2a695c4c7ff16dc6919a1ba0cda077273f6e1eafbf0f5c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81fc1872da168050a2483e46a2ce55f2f745eca15c901a77783a830609f6c0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15f7224ba03bfadb0d7b4d4023f427da920f47b6d5f8d0d69db5b96baaedb0c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15f7224ba03bfadb0d7b4d4023f427da920f47b6d5f8d0d69db5b96baaedb0c5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T08:17:05Z\\\",\\\"message\\\":\\\"== {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 08:17:05.353289 6518 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service 
k8s.ovn.org/owner:openshift-kube-controller-manager/kube-controller-manager]} name:Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.36:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ba175bbe-5cc4-47e6-a32d-57693e1320bd}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 08:17:05.353308 6518 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1007 08:17:05.351793 6518 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:17:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dwm22_openshift-ovn-kubernetes(8b6b9c75-ecfe-4815-b279-bb56f57a82a8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26e633e0a41a5f44c577b8e1a9494d58bb84b49015a7612684fc81e5872fd20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9
e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dwm22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:14Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.169179 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90a24131-77fe-4045-9cea-eebd7e122243\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efea2132bc68971348829ebb4222793ae3ccf57d5eeee316aa81292cbb051594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1c01fb63c054d4e9e2b12ab9d335d5c6a79c3c62c96bcf3bca6a5f9a4b750\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ce5e9618132944289f6a22c7aeaee88704f6352fe3461c55de21681f406662\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://167a11cb83ad658e06fbfebaa8e1675cec48026b094037f552cb4b86f3fe364e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b579d694e2e51fb1d49b8d690d572518d7fcd3476d7e6941323b70987b8edfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T08:16:53Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 08:16:47.574279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 08:16:47.577103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2852886688/tls.crt::/tmp/serving-cert-2852886688/tls.key\\\\\\\"\\\\nI1007 08:16:53.183291 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 08:16:53.189611 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 08:16:53.189643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 08:16:53.189670 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 08:16:53.189676 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 08:16:53.204072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 08:16:53.204130 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204143 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204157 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 08:16:53.204166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 08:16:53.204173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 08:16:53.204182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 08:16:53.204715 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1007 08:16:53.207373 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52afc5d765ed4544c55e6e522b190cb1c6788fc270ef55c8242f42b63c4d2702\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297
e58dd224b8cdb5927e39370c816fd40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:14Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.181796 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:14Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.194202 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.194228 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.194236 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 
08:17:14.194249 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.194257 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:14Z","lastTransitionTime":"2025-10-07T08:17:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.194608 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l2k8t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab742e4ca12e3a7400e7592d95be4f82046c78d1056d6e1f06d261704286f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcab
b9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f66d58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f66d58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fff0f5b081b5a61c30757d6065a06224f7edcd0adfdab4ca6abfc6ea207a660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fff0f5b081b5a61c30757d6065a0
6224f7edcd0adfdab4ca6abfc6ea207a660\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:17:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:17:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l2k8t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:14Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.297101 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.297165 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.297186 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.297216 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.297236 5025 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:14Z","lastTransitionTime":"2025-10-07T08:17:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.400662 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.400753 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.400775 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.400807 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.400829 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:14Z","lastTransitionTime":"2025-10-07T08:17:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.503925 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.503983 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.504000 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.504027 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.504045 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:14Z","lastTransitionTime":"2025-10-07T08:17:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.607242 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.607310 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.607324 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.607345 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.607361 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:14Z","lastTransitionTime":"2025-10-07T08:17:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.711434 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.711482 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.711496 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.711514 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.711526 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:14Z","lastTransitionTime":"2025-10-07T08:17:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.814287 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.814355 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.814373 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.814397 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.814414 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:14Z","lastTransitionTime":"2025-10-07T08:17:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.914343 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4ls7" Oct 07 08:17:14 crc kubenswrapper[5025]: E1007 08:17:14.914587 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f4ls7" podUID="93fdeab4-b5d2-42d8-97ca-d5d61032e19f" Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.918000 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.918083 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.918097 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.918118 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:14 crc kubenswrapper[5025]: I1007 08:17:14.918473 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:14Z","lastTransitionTime":"2025-10-07T08:17:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:15 crc kubenswrapper[5025]: I1007 08:17:15.021820 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:15 crc kubenswrapper[5025]: I1007 08:17:15.021895 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:15 crc kubenswrapper[5025]: I1007 08:17:15.021915 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:15 crc kubenswrapper[5025]: I1007 08:17:15.021943 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:15 crc kubenswrapper[5025]: I1007 08:17:15.021965 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:15Z","lastTransitionTime":"2025-10-07T08:17:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:15 crc kubenswrapper[5025]: I1007 08:17:15.125874 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:15 crc kubenswrapper[5025]: I1007 08:17:15.125935 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:15 crc kubenswrapper[5025]: I1007 08:17:15.125959 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:15 crc kubenswrapper[5025]: I1007 08:17:15.125989 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:15 crc kubenswrapper[5025]: I1007 08:17:15.126007 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:15Z","lastTransitionTime":"2025-10-07T08:17:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:15 crc kubenswrapper[5025]: I1007 08:17:15.229231 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:15 crc kubenswrapper[5025]: I1007 08:17:15.229299 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:15 crc kubenswrapper[5025]: I1007 08:17:15.229318 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:15 crc kubenswrapper[5025]: I1007 08:17:15.229342 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:15 crc kubenswrapper[5025]: I1007 08:17:15.229362 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:15Z","lastTransitionTime":"2025-10-07T08:17:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:15 crc kubenswrapper[5025]: I1007 08:17:15.332520 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:15 crc kubenswrapper[5025]: I1007 08:17:15.332627 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:15 crc kubenswrapper[5025]: I1007 08:17:15.332652 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:15 crc kubenswrapper[5025]: I1007 08:17:15.332676 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:15 crc kubenswrapper[5025]: I1007 08:17:15.332697 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:15Z","lastTransitionTime":"2025-10-07T08:17:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:15 crc kubenswrapper[5025]: I1007 08:17:15.436266 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:15 crc kubenswrapper[5025]: I1007 08:17:15.436319 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:15 crc kubenswrapper[5025]: I1007 08:17:15.436336 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:15 crc kubenswrapper[5025]: I1007 08:17:15.436359 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:15 crc kubenswrapper[5025]: I1007 08:17:15.436379 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:15Z","lastTransitionTime":"2025-10-07T08:17:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:15 crc kubenswrapper[5025]: I1007 08:17:15.539304 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:15 crc kubenswrapper[5025]: I1007 08:17:15.539377 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:15 crc kubenswrapper[5025]: I1007 08:17:15.539388 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:15 crc kubenswrapper[5025]: I1007 08:17:15.539406 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:15 crc kubenswrapper[5025]: I1007 08:17:15.539419 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:15Z","lastTransitionTime":"2025-10-07T08:17:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:15 crc kubenswrapper[5025]: I1007 08:17:15.642768 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:15 crc kubenswrapper[5025]: I1007 08:17:15.642825 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:15 crc kubenswrapper[5025]: I1007 08:17:15.642844 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:15 crc kubenswrapper[5025]: I1007 08:17:15.642867 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:15 crc kubenswrapper[5025]: I1007 08:17:15.642885 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:15Z","lastTransitionTime":"2025-10-07T08:17:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:15 crc kubenswrapper[5025]: I1007 08:17:15.747122 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:15 crc kubenswrapper[5025]: I1007 08:17:15.747213 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:15 crc kubenswrapper[5025]: I1007 08:17:15.747671 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:15 crc kubenswrapper[5025]: I1007 08:17:15.747738 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:15 crc kubenswrapper[5025]: I1007 08:17:15.747757 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:15Z","lastTransitionTime":"2025-10-07T08:17:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:15 crc kubenswrapper[5025]: I1007 08:17:15.851116 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:15 crc kubenswrapper[5025]: I1007 08:17:15.851172 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:15 crc kubenswrapper[5025]: I1007 08:17:15.851194 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:15 crc kubenswrapper[5025]: I1007 08:17:15.851221 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:15 crc kubenswrapper[5025]: I1007 08:17:15.851243 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:15Z","lastTransitionTime":"2025-10-07T08:17:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:15 crc kubenswrapper[5025]: I1007 08:17:15.914083 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:17:15 crc kubenswrapper[5025]: E1007 08:17:15.914280 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 08:17:15 crc kubenswrapper[5025]: I1007 08:17:15.914307 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 08:17:15 crc kubenswrapper[5025]: I1007 08:17:15.914386 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:17:15 crc kubenswrapper[5025]: E1007 08:17:15.914422 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 08:17:15 crc kubenswrapper[5025]: E1007 08:17:15.914630 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 08:17:15 crc kubenswrapper[5025]: I1007 08:17:15.954079 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:15 crc kubenswrapper[5025]: I1007 08:17:15.954156 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:15 crc kubenswrapper[5025]: I1007 08:17:15.954174 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:15 crc kubenswrapper[5025]: I1007 08:17:15.954199 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:15 crc kubenswrapper[5025]: I1007 08:17:15.954217 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:15Z","lastTransitionTime":"2025-10-07T08:17:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:16 crc kubenswrapper[5025]: I1007 08:17:16.057908 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:16 crc kubenswrapper[5025]: I1007 08:17:16.057955 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:16 crc kubenswrapper[5025]: I1007 08:17:16.057972 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:16 crc kubenswrapper[5025]: I1007 08:17:16.057996 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:16 crc kubenswrapper[5025]: I1007 08:17:16.058014 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:16Z","lastTransitionTime":"2025-10-07T08:17:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:16 crc kubenswrapper[5025]: I1007 08:17:16.161277 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:16 crc kubenswrapper[5025]: I1007 08:17:16.161329 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:16 crc kubenswrapper[5025]: I1007 08:17:16.161343 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:16 crc kubenswrapper[5025]: I1007 08:17:16.161364 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:16 crc kubenswrapper[5025]: I1007 08:17:16.161378 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:16Z","lastTransitionTime":"2025-10-07T08:17:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:16 crc kubenswrapper[5025]: I1007 08:17:16.264716 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:16 crc kubenswrapper[5025]: I1007 08:17:16.264772 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:16 crc kubenswrapper[5025]: I1007 08:17:16.264789 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:16 crc kubenswrapper[5025]: I1007 08:17:16.264813 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:16 crc kubenswrapper[5025]: I1007 08:17:16.264829 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:16Z","lastTransitionTime":"2025-10-07T08:17:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:16 crc kubenswrapper[5025]: I1007 08:17:16.367035 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:16 crc kubenswrapper[5025]: I1007 08:17:16.367095 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:16 crc kubenswrapper[5025]: I1007 08:17:16.367123 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:16 crc kubenswrapper[5025]: I1007 08:17:16.367152 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:16 crc kubenswrapper[5025]: I1007 08:17:16.367176 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:16Z","lastTransitionTime":"2025-10-07T08:17:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:16 crc kubenswrapper[5025]: I1007 08:17:16.469788 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:16 crc kubenswrapper[5025]: I1007 08:17:16.469841 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:16 crc kubenswrapper[5025]: I1007 08:17:16.469858 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:16 crc kubenswrapper[5025]: I1007 08:17:16.469882 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:16 crc kubenswrapper[5025]: I1007 08:17:16.469900 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:16Z","lastTransitionTime":"2025-10-07T08:17:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:16 crc kubenswrapper[5025]: I1007 08:17:16.572247 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:16 crc kubenswrapper[5025]: I1007 08:17:16.572293 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:16 crc kubenswrapper[5025]: I1007 08:17:16.572312 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:16 crc kubenswrapper[5025]: I1007 08:17:16.572334 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:16 crc kubenswrapper[5025]: I1007 08:17:16.572354 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:16Z","lastTransitionTime":"2025-10-07T08:17:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:16 crc kubenswrapper[5025]: I1007 08:17:16.675327 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:16 crc kubenswrapper[5025]: I1007 08:17:16.675383 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:16 crc kubenswrapper[5025]: I1007 08:17:16.675399 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:16 crc kubenswrapper[5025]: I1007 08:17:16.675421 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:16 crc kubenswrapper[5025]: I1007 08:17:16.675440 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:16Z","lastTransitionTime":"2025-10-07T08:17:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:16 crc kubenswrapper[5025]: I1007 08:17:16.777995 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:16 crc kubenswrapper[5025]: I1007 08:17:16.778065 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:16 crc kubenswrapper[5025]: I1007 08:17:16.778084 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:16 crc kubenswrapper[5025]: I1007 08:17:16.778108 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:16 crc kubenswrapper[5025]: I1007 08:17:16.778124 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:16Z","lastTransitionTime":"2025-10-07T08:17:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:16 crc kubenswrapper[5025]: I1007 08:17:16.871316 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93fdeab4-b5d2-42d8-97ca-d5d61032e19f-metrics-certs\") pod \"network-metrics-daemon-f4ls7\" (UID: \"93fdeab4-b5d2-42d8-97ca-d5d61032e19f\") " pod="openshift-multus/network-metrics-daemon-f4ls7" Oct 07 08:17:16 crc kubenswrapper[5025]: E1007 08:17:16.871616 5025 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 08:17:16 crc kubenswrapper[5025]: E1007 08:17:16.871782 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93fdeab4-b5d2-42d8-97ca-d5d61032e19f-metrics-certs podName:93fdeab4-b5d2-42d8-97ca-d5d61032e19f nodeName:}" failed. No retries permitted until 2025-10-07 08:17:24.871754249 +0000 UTC m=+51.681068433 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/93fdeab4-b5d2-42d8-97ca-d5d61032e19f-metrics-certs") pod "network-metrics-daemon-f4ls7" (UID: "93fdeab4-b5d2-42d8-97ca-d5d61032e19f") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 08:17:16 crc kubenswrapper[5025]: I1007 08:17:16.881573 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:16 crc kubenswrapper[5025]: I1007 08:17:16.881630 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:16 crc kubenswrapper[5025]: I1007 08:17:16.881653 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:16 crc kubenswrapper[5025]: I1007 08:17:16.881683 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:16 crc kubenswrapper[5025]: I1007 08:17:16.881706 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:16Z","lastTransitionTime":"2025-10-07T08:17:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:16 crc kubenswrapper[5025]: I1007 08:17:16.914169 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4ls7" Oct 07 08:17:16 crc kubenswrapper[5025]: E1007 08:17:16.914392 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f4ls7" podUID="93fdeab4-b5d2-42d8-97ca-d5d61032e19f" Oct 07 08:17:16 crc kubenswrapper[5025]: I1007 08:17:16.985053 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:16 crc kubenswrapper[5025]: I1007 08:17:16.985114 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:16 crc kubenswrapper[5025]: I1007 08:17:16.985130 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:16 crc kubenswrapper[5025]: I1007 08:17:16.985153 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:16 crc kubenswrapper[5025]: I1007 08:17:16.985173 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:16Z","lastTransitionTime":"2025-10-07T08:17:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:17 crc kubenswrapper[5025]: I1007 08:17:17.087228 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:17 crc kubenswrapper[5025]: I1007 08:17:17.087294 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:17 crc kubenswrapper[5025]: I1007 08:17:17.087311 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:17 crc kubenswrapper[5025]: I1007 08:17:17.087338 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:17 crc kubenswrapper[5025]: I1007 08:17:17.087355 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:17Z","lastTransitionTime":"2025-10-07T08:17:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:17 crc kubenswrapper[5025]: I1007 08:17:17.189645 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:17 crc kubenswrapper[5025]: I1007 08:17:17.189699 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:17 crc kubenswrapper[5025]: I1007 08:17:17.189715 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:17 crc kubenswrapper[5025]: I1007 08:17:17.189738 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:17 crc kubenswrapper[5025]: I1007 08:17:17.189755 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:17Z","lastTransitionTime":"2025-10-07T08:17:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:17 crc kubenswrapper[5025]: I1007 08:17:17.292860 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:17 crc kubenswrapper[5025]: I1007 08:17:17.292903 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:17 crc kubenswrapper[5025]: I1007 08:17:17.292916 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:17 crc kubenswrapper[5025]: I1007 08:17:17.292935 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:17 crc kubenswrapper[5025]: I1007 08:17:17.292947 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:17Z","lastTransitionTime":"2025-10-07T08:17:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:17 crc kubenswrapper[5025]: I1007 08:17:17.396453 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:17 crc kubenswrapper[5025]: I1007 08:17:17.396513 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:17 crc kubenswrapper[5025]: I1007 08:17:17.396531 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:17 crc kubenswrapper[5025]: I1007 08:17:17.396589 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:17 crc kubenswrapper[5025]: I1007 08:17:17.396618 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:17Z","lastTransitionTime":"2025-10-07T08:17:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:17 crc kubenswrapper[5025]: I1007 08:17:17.500125 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:17 crc kubenswrapper[5025]: I1007 08:17:17.500203 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:17 crc kubenswrapper[5025]: I1007 08:17:17.500226 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:17 crc kubenswrapper[5025]: I1007 08:17:17.500259 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:17 crc kubenswrapper[5025]: I1007 08:17:17.500279 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:17Z","lastTransitionTime":"2025-10-07T08:17:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:17 crc kubenswrapper[5025]: I1007 08:17:17.606212 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:17 crc kubenswrapper[5025]: I1007 08:17:17.606254 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:17 crc kubenswrapper[5025]: I1007 08:17:17.606280 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:17 crc kubenswrapper[5025]: I1007 08:17:17.606304 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:17 crc kubenswrapper[5025]: I1007 08:17:17.606320 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:17Z","lastTransitionTime":"2025-10-07T08:17:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:17 crc kubenswrapper[5025]: I1007 08:17:17.708853 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:17 crc kubenswrapper[5025]: I1007 08:17:17.708902 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:17 crc kubenswrapper[5025]: I1007 08:17:17.708920 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:17 crc kubenswrapper[5025]: I1007 08:17:17.708946 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:17 crc kubenswrapper[5025]: I1007 08:17:17.708963 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:17Z","lastTransitionTime":"2025-10-07T08:17:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:17 crc kubenswrapper[5025]: I1007 08:17:17.811383 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:17 crc kubenswrapper[5025]: I1007 08:17:17.811425 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:17 crc kubenswrapper[5025]: I1007 08:17:17.811437 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:17 crc kubenswrapper[5025]: I1007 08:17:17.811453 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:17 crc kubenswrapper[5025]: I1007 08:17:17.811463 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:17Z","lastTransitionTime":"2025-10-07T08:17:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:17 crc kubenswrapper[5025]: I1007 08:17:17.914062 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:17:17 crc kubenswrapper[5025]: I1007 08:17:17.914155 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:17:17 crc kubenswrapper[5025]: I1007 08:17:17.914070 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 08:17:17 crc kubenswrapper[5025]: E1007 08:17:17.914347 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 08:17:17 crc kubenswrapper[5025]: E1007 08:17:17.914486 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 08:17:17 crc kubenswrapper[5025]: E1007 08:17:17.914644 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 08:17:17 crc kubenswrapper[5025]: I1007 08:17:17.915640 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:17 crc kubenswrapper[5025]: I1007 08:17:17.915708 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:17 crc kubenswrapper[5025]: I1007 08:17:17.915732 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:17 crc kubenswrapper[5025]: I1007 08:17:17.915762 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:17 crc kubenswrapper[5025]: I1007 08:17:17.915786 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:17Z","lastTransitionTime":"2025-10-07T08:17:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:18 crc kubenswrapper[5025]: I1007 08:17:18.018154 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:18 crc kubenswrapper[5025]: I1007 08:17:18.018216 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:18 crc kubenswrapper[5025]: I1007 08:17:18.018235 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:18 crc kubenswrapper[5025]: I1007 08:17:18.018258 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:18 crc kubenswrapper[5025]: I1007 08:17:18.018275 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:18Z","lastTransitionTime":"2025-10-07T08:17:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:18 crc kubenswrapper[5025]: I1007 08:17:18.121134 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:18 crc kubenswrapper[5025]: I1007 08:17:18.121200 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:18 crc kubenswrapper[5025]: I1007 08:17:18.121220 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:18 crc kubenswrapper[5025]: I1007 08:17:18.121251 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:18 crc kubenswrapper[5025]: I1007 08:17:18.121276 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:18Z","lastTransitionTime":"2025-10-07T08:17:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:18 crc kubenswrapper[5025]: I1007 08:17:18.224746 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:18 crc kubenswrapper[5025]: I1007 08:17:18.224814 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:18 crc kubenswrapper[5025]: I1007 08:17:18.224832 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:18 crc kubenswrapper[5025]: I1007 08:17:18.224858 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:18 crc kubenswrapper[5025]: I1007 08:17:18.224882 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:18Z","lastTransitionTime":"2025-10-07T08:17:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:18 crc kubenswrapper[5025]: I1007 08:17:18.327301 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:18 crc kubenswrapper[5025]: I1007 08:17:18.327371 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:18 crc kubenswrapper[5025]: I1007 08:17:18.327389 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:18 crc kubenswrapper[5025]: I1007 08:17:18.327421 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:18 crc kubenswrapper[5025]: I1007 08:17:18.327443 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:18Z","lastTransitionTime":"2025-10-07T08:17:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:18 crc kubenswrapper[5025]: I1007 08:17:18.430478 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:18 crc kubenswrapper[5025]: I1007 08:17:18.430581 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:18 crc kubenswrapper[5025]: I1007 08:17:18.430604 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:18 crc kubenswrapper[5025]: I1007 08:17:18.430637 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:18 crc kubenswrapper[5025]: I1007 08:17:18.430661 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:18Z","lastTransitionTime":"2025-10-07T08:17:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:18 crc kubenswrapper[5025]: I1007 08:17:18.534038 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:18 crc kubenswrapper[5025]: I1007 08:17:18.534107 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:18 crc kubenswrapper[5025]: I1007 08:17:18.534131 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:18 crc kubenswrapper[5025]: I1007 08:17:18.534165 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:18 crc kubenswrapper[5025]: I1007 08:17:18.534190 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:18Z","lastTransitionTime":"2025-10-07T08:17:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:18 crc kubenswrapper[5025]: I1007 08:17:18.637712 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:18 crc kubenswrapper[5025]: I1007 08:17:18.637790 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:18 crc kubenswrapper[5025]: I1007 08:17:18.637818 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:18 crc kubenswrapper[5025]: I1007 08:17:18.637849 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:18 crc kubenswrapper[5025]: I1007 08:17:18.637874 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:18Z","lastTransitionTime":"2025-10-07T08:17:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:18 crc kubenswrapper[5025]: I1007 08:17:18.741125 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:18 crc kubenswrapper[5025]: I1007 08:17:18.741185 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:18 crc kubenswrapper[5025]: I1007 08:17:18.741201 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:18 crc kubenswrapper[5025]: I1007 08:17:18.741226 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:18 crc kubenswrapper[5025]: I1007 08:17:18.741242 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:18Z","lastTransitionTime":"2025-10-07T08:17:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:18 crc kubenswrapper[5025]: I1007 08:17:18.844032 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:18 crc kubenswrapper[5025]: I1007 08:17:18.844069 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:18 crc kubenswrapper[5025]: I1007 08:17:18.844077 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:18 crc kubenswrapper[5025]: I1007 08:17:18.844091 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:18 crc kubenswrapper[5025]: I1007 08:17:18.844100 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:18Z","lastTransitionTime":"2025-10-07T08:17:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:18 crc kubenswrapper[5025]: I1007 08:17:18.914409 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4ls7" Oct 07 08:17:18 crc kubenswrapper[5025]: E1007 08:17:18.914691 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f4ls7" podUID="93fdeab4-b5d2-42d8-97ca-d5d61032e19f" Oct 07 08:17:18 crc kubenswrapper[5025]: I1007 08:17:18.916024 5025 scope.go:117] "RemoveContainer" containerID="15f7224ba03bfadb0d7b4d4023f427da920f47b6d5f8d0d69db5b96baaedb0c5" Oct 07 08:17:18 crc kubenswrapper[5025]: I1007 08:17:18.947047 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:18 crc kubenswrapper[5025]: I1007 08:17:18.947255 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:18 crc kubenswrapper[5025]: I1007 08:17:18.947354 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:18 crc kubenswrapper[5025]: I1007 08:17:18.947454 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:18 crc kubenswrapper[5025]: I1007 08:17:18.947564 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:18Z","lastTransitionTime":"2025-10-07T08:17:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.051110 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.051170 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.051187 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.051215 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.051236 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:19Z","lastTransitionTime":"2025-10-07T08:17:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.153688 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.153724 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.153733 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.153748 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.153757 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:19Z","lastTransitionTime":"2025-10-07T08:17:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.256897 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.256958 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.256980 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.257009 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.257030 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:19Z","lastTransitionTime":"2025-10-07T08:17:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.329570 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dwm22_8b6b9c75-ecfe-4815-b279-bb56f57a82a8/ovnkube-controller/1.log" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.333889 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" event={"ID":"8b6b9c75-ecfe-4815-b279-bb56f57a82a8","Type":"ContainerStarted","Data":"18be63a068d62a2b2d02653ce552ec9e0b8db3cf5b0be40238b373087e88b61e"} Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.334694 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.360029 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.360067 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.360077 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.360093 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.360105 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:19Z","lastTransitionTime":"2025-10-07T08:17:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.361806 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2383c50-e82f-4445-8d28-ec90e8d95c47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f31fa57b4eaaccec309a01396ec164dc6bda3e12a1086eda7a2d0eb3233eb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6e7353bb0
2996d8525befd61dad46a459fca7713e536cec9b1fb9f47d99d08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ceb048af5928a4b50cec5e93738471dd356f20cf57363fe085df0bb4216100f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://848397bbf66b4edb13083c81defed30b9ddaff1a1636e2f1fe09b25c8ff99d96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:19Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.387364 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:19Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.402472 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:19Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.416376 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmhw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34b07a69-1bbf-4019-b824-7b5be0f9404d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aee164f46abc9772ed12e1ba5e768ee44e25cc7444a58000075819f45cb4ba11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx7qx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:19Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.431202 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://473d5cdd405367504396c404f60bbdecda93dbe6edbc8ddbfc18bbcc2fdc4d2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f423a0c357e0279ca41d112d6091586ed91d3826f4ae250ec2aa4973554968\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:19Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.448862 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://602fe6537fb9b526977c2d6ae30458382313c2b5f1506e6b7ffe684ffe5be2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d54834b1ba5650aa397affa83e56e871bbbea4b8c383627ae00bea648595864f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aca416ed3fc89df9948a2f7b02101dfb3bae8d8b22b7c4d270559a4cb6895dc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aca8d824b0935ba844137095ac849a38a15dc5c0dc808fae28afbff543f6429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e826406584133efe2a695c4c7ff16dc6919a1ba0cda077273f6e1eafbf0f5c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81fc1872da168050a2483e46a2ce55f2f745eca15c901a77783a830609f6c0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18be63a068d62a2b2d02653ce552ec9e0b8db3cf5b0be40238b373087e88b61e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15f7224ba03bfadb0d7b4d4023f427da920f47b6d5f8d0d69db5b96baaedb0c5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T08:17:05Z\\\",\\\"message\\\":\\\"== {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 08:17:05.353289 6518 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service 
k8s.ovn.org/owner:openshift-kube-controller-manager/kube-controller-manager]} name:Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.36:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ba175bbe-5cc4-47e6-a32d-57693e1320bd}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 08:17:05.353308 6518 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1007 08:17:05.351793 6518 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:17:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26e633e0a41a5f44c577b8e1a9494d58bb84b49015a7612684fc81e5872fd20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dwm22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:19Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.462134 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.462188 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.462204 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.462226 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.462241 5025 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:19Z","lastTransitionTime":"2025-10-07T08:17:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.465473 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90a24131-77fe-4045-9cea-eebd7e122243\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efea2132bc68971348829ebb4222793ae3ccf57d5eeee316aa81292cbb051594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1c01fb63c054d4e9e2b12ab9d335d5c6a79c3c62c96bcf3bca6a5f9a4b750\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ce5e9618132944289f6a22c7aeaee88704f6352fe3461c55de21681f406662\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1
67a11cb83ad658e06fbfebaa8e1675cec48026b094037f552cb4b86f3fe364e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b579d694e2e51fb1d49b8d690d572518d7fcd3476d7e6941323b70987b8edfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 08:16:47.574279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 08:16:47.577103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2852886688/tls.crt::/tmp/serving-cert-2852886688/tls.key\\\\\\\"\\\\nI1007 08:16:53.183291 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 08:16:53.189611 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 08:16:53.189643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 08:16:53.189670 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 08:16:53.189676 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 08:16:53.204072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 08:16:53.204130 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204143 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204157 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 08:16:53.204166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 08:16:53.204173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 08:16:53.204182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 08:16:53.204715 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 08:16:53.207373 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52afc5d765ed4544c55e6e522b190cb1c6788fc270ef55c8242f42b63c4d2702\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:19Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.479271 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:19Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.499146 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l2k8t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab742e4ca12e3a7400e7592d95be4f82046c78d1056d6e1f06d261704286f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f66d
58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f66d58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fff0f5b081b5a61c30757d6065a06224f7edcd0adfdab4ca6abfc6ea207a660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fff0f5b081b5a61c30757d6065a06224f7edcd0adfdab4ca6abfc6ea207a660\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:17:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:17:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l2k8t\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:19Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.513390 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4849c41-22e1-400e-8e11-096da49ef1b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc23369548789187b7057cd9780703dbc892dcf8aaaac969f6049a15ab0656e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54b2a654066f14b0433198403ba42b83c61922de1055c94ba1d38aad9e0c2821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2dj2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:19Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:19 crc kubenswrapper[5025]: 
I1007 08:17:19.530766 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jf557" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea42823f-6b19-439f-a280-80e5b1a816c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e6ec573b8a9fb1fd6ab7a370488b1b68024a033b0cfc728cf7e8288aacb4c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrdl\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jf557\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:19Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.544770 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w8khj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee70c47a-d783-4a9f-8e94-5fc68eda69fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1540ed9540ce18c4ad3c5c7d82a3916528d100cc9d7ec1ec5758511f03e6f10\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ckns7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2f25f9baf4b018c73a5655ca3e595d123cd0ffaae9060ea0a34148711b1be0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ckns7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.
126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:17:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-w8khj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:19Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.564488 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e04a2eadbde2c6c8df1913f15a1efbebef80b3e34c256f7ff9249479f19eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/et
c/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:19Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.565080 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.565110 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.565118 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.565131 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.565140 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:19Z","lastTransitionTime":"2025-10-07T08:17:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.579177 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61571dd78435a0514b8674028c7c467678e6ee4d1c6b334a98c6d1ddbf4871d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:19Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.602728 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hc88w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2744919e-82fb-4c3c-8776-c2c9c44af6e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afa525d43c3e5aafdbdf20fdac7eb16579649323ad50ae564f0266fb2e69ffd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swvcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hc88w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:19Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.620300 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-f4ls7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93fdeab4-b5d2-42d8-97ca-d5d61032e19f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:17:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-f4ls7\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:19Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.666852 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.666909 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.666927 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.666951 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.666968 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:19Z","lastTransitionTime":"2025-10-07T08:17:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.769566 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.769599 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.769607 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.769620 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.769630 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:19Z","lastTransitionTime":"2025-10-07T08:17:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.872805 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.872870 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.872888 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.872918 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.872940 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:19Z","lastTransitionTime":"2025-10-07T08:17:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.913786 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 08:17:19 crc kubenswrapper[5025]: E1007 08:17:19.914148 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.914756 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:17:19 crc kubenswrapper[5025]: E1007 08:17:19.914860 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.914961 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:17:19 crc kubenswrapper[5025]: E1007 08:17:19.915049 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.976857 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.976928 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.976948 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.976975 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:19 crc kubenswrapper[5025]: I1007 08:17:19.977030 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:19Z","lastTransitionTime":"2025-10-07T08:17:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.080223 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.080287 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.080305 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.080329 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.080346 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:20Z","lastTransitionTime":"2025-10-07T08:17:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.123970 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.124022 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.124041 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.124068 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.124086 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:20Z","lastTransitionTime":"2025-10-07T08:17:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:20 crc kubenswrapper[5025]: E1007 08:17:20.146693 5025 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"15feb688-c1d9-4e7b-b633-9a128b7afc98\\\",\\\"systemUUID\\\":\\\"18315730-39a6-4b53-82b9-587e1e3a7adc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:20Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.151803 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.151883 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.151903 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.151925 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.151980 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:20Z","lastTransitionTime":"2025-10-07T08:17:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:20 crc kubenswrapper[5025]: E1007 08:17:20.175578 5025 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"15feb688-c1d9-4e7b-b633-9a128b7afc98\\\",\\\"systemUUID\\\":\\\"18315730-39a6-4b53-82b9-587e1e3a7adc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:20Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.179699 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.179748 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.179766 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.179787 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.179807 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:20Z","lastTransitionTime":"2025-10-07T08:17:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:20 crc kubenswrapper[5025]: E1007 08:17:20.195009 5025 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"15feb688-c1d9-4e7b-b633-9a128b7afc98\\\",\\\"systemUUID\\\":\\\"18315730-39a6-4b53-82b9-587e1e3a7adc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:20Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.200319 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.200402 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.200419 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.200473 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.200490 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:20Z","lastTransitionTime":"2025-10-07T08:17:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:20 crc kubenswrapper[5025]: E1007 08:17:20.221476 5025 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"15feb688-c1d9-4e7b-b633-9a128b7afc98\\\",\\\"systemUUID\\\":\\\"18315730-39a6-4b53-82b9-587e1e3a7adc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:20Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.226998 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.227080 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.227097 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.227151 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.227167 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:20Z","lastTransitionTime":"2025-10-07T08:17:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:20 crc kubenswrapper[5025]: E1007 08:17:20.250517 5025 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"15feb688-c1d9-4e7b-b633-9a128b7afc98\\\",\\\"systemUUID\\\":\\\"18315730-39a6-4b53-82b9-587e1e3a7adc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:20Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:20 crc kubenswrapper[5025]: E1007 08:17:20.255906 5025 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.258185 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.258239 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.258257 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.258282 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.258301 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:20Z","lastTransitionTime":"2025-10-07T08:17:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.340071 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dwm22_8b6b9c75-ecfe-4815-b279-bb56f57a82a8/ovnkube-controller/2.log" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.341283 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dwm22_8b6b9c75-ecfe-4815-b279-bb56f57a82a8/ovnkube-controller/1.log" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.345110 5025 generic.go:334] "Generic (PLEG): container finished" podID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerID="18be63a068d62a2b2d02653ce552ec9e0b8db3cf5b0be40238b373087e88b61e" exitCode=1 Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.345176 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" event={"ID":"8b6b9c75-ecfe-4815-b279-bb56f57a82a8","Type":"ContainerDied","Data":"18be63a068d62a2b2d02653ce552ec9e0b8db3cf5b0be40238b373087e88b61e"} Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.345246 5025 scope.go:117] "RemoveContainer" containerID="15f7224ba03bfadb0d7b4d4023f427da920f47b6d5f8d0d69db5b96baaedb0c5" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.346594 5025 scope.go:117] "RemoveContainer" containerID="18be63a068d62a2b2d02653ce552ec9e0b8db3cf5b0be40238b373087e88b61e" Oct 07 08:17:20 crc kubenswrapper[5025]: E1007 08:17:20.346918 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-dwm22_openshift-ovn-kubernetes(8b6b9c75-ecfe-4815-b279-bb56f57a82a8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.361984 5025 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.362091 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.362112 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.362176 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.362197 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:20Z","lastTransitionTime":"2025-10-07T08:17:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.367761 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2383c50-e82f-4445-8d28-ec90e8d95c47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f31fa57b4eaaccec309a01396ec164dc6bda3e12a1086eda7a2d0eb3233eb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6e7353bb0
2996d8525befd61dad46a459fca7713e536cec9b1fb9f47d99d08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ceb048af5928a4b50cec5e93738471dd356f20cf57363fe085df0bb4216100f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://848397bbf66b4edb13083c81defed30b9ddaff1a1636e2f1fe09b25c8ff99d96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:20Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.388978 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:20Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.408925 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:20Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.430615 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmhw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34b07a69-1bbf-4019-b824-7b5be0f9404d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aee164f46abc9772ed12e1ba5e768ee44e25cc7444a58000075819f45cb4ba11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx7qx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:20Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.451133 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://473d5cdd405367504396c404f60bbdecda93dbe6edbc8ddbfc18bbcc2fdc4d2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f423a0c357e0279ca41d112d6091586ed91d3826f4ae250ec2aa4973554968\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:20Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.465373 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.465431 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.465447 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.465472 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.465489 5025 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:20Z","lastTransitionTime":"2025-10-07T08:17:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.483676 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://602fe6537fb9b526977c2d6ae30458382313c2b5f1506e6b7ffe684ffe5be2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d54834b1ba5650aa397affa83e56e871bbbea4b8c383627ae00bea648595864f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aca416ed3fc89df9948a2f7b02101dfb3bae8d8b22b7c4d270559a4cb6895dc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aca8d824b0935ba844137095ac849a38a15dc5c0dc808fae28afbff543f6429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e826406584133efe2a695c4c7ff16dc6919a1ba0cda077273f6e1eafbf0f5c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81fc1872da168050a2483e46a2ce55f2f745eca15c901a77783a830609f6c0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18be63a068d62a2b2d02653ce552ec9e0b8db3cf5b0be40238b373087e88b61e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15f7224ba03bfadb0d7b4d4023f427da920f47b6d5f8d0d69db5b96baaedb0c5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T08:17:05Z\\\",\\\"message\\\":\\\"== {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 08:17:05.353289 6518 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service 
k8s.ovn.org/owner:openshift-kube-controller-manager/kube-controller-manager]} name:Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.36:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ba175bbe-5cc4-47e6-a32d-57693e1320bd}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 08:17:05.353308 6518 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1007 08:17:05.351793 6518 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:17:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18be63a068d62a2b2d02653ce552ec9e0b8db3cf5b0be40238b373087e88b61e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T08:17:19Z\\\",\\\"message\\\":\\\"lt network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node 
network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:19Z is after 2025-08-24T17:21:41Z]\\\\nI1007 08:17:19.781116 6730 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/control-plane-machine-set-operator\\\\\\\"}\\\\nI1007 08:17:19.781864 6730 services_controller.go:434] Service openshift-console/downloads retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{downloads openshift-console bbc81ad7-5d87-40bf-82c5-a4db2311cff9 12322 0 2025-02-23 05:39:22 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[] map[operator.openshift.io/spec-hash:41d6e4f36bf41ab5be57dec2289f1f8807bbed4b0f642342f213a53bb3ff4d6d] [] [] 
[]},Spec:ServiceSpec{Ports:[]ServicePort{ServicePor\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:17:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26e633e0a41a5f44c577b8e1a9494d58bb84b49015a7612684fc81e5872fd20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9
b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dwm22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:20Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.509910 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90a24131-77fe-4045-9cea-eebd7e122243\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efea2132bc68971348829ebb4222793ae3ccf57d5eeee316aa81292cbb051594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1c01fb63c054d4e9e2b12ab9d335d5c6a79c3c62c96bcf3bca6a5f9a4b750\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ce5e9618132944289f6a22c7aeaee88704f6352fe3461c55de21681f406662\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://167a11cb83ad658e06fbfebaa8e1675cec48026b094037f552cb4b86f3fe364e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b579d694e2e51fb1d49b8d690d572518d7fcd3476d7e6941323b70987b8edfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T08:16:53Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 08:16:47.574279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 08:16:47.577103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2852886688/tls.crt::/tmp/serving-cert-2852886688/tls.key\\\\\\\"\\\\nI1007 08:16:53.183291 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 08:16:53.189611 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 08:16:53.189643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 08:16:53.189670 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 08:16:53.189676 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 08:16:53.204072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 08:16:53.204130 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204143 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204157 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 08:16:53.204166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 08:16:53.204173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 08:16:53.204182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 08:16:53.204715 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1007 08:16:53.207373 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52afc5d765ed4544c55e6e522b190cb1c6788fc270ef55c8242f42b63c4d2702\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297
e58dd224b8cdb5927e39370c816fd40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:20Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.532118 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:20Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.555826 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l2k8t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab742e4ca12e3a7400e7592d95be4f82046c78d1056d6e1f06d261704286f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f66d
58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f66d58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fff0f5b081b5a61c30757d6065a06224f7edcd0adfdab4ca6abfc6ea207a660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fff0f5b081b5a61c30757d6065a06224f7edcd0adfdab4ca6abfc6ea207a660\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:17:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:17:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l2k8t\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:20Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.568591 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.568660 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.568672 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.568692 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.568703 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:20Z","lastTransitionTime":"2025-10-07T08:17:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.573243 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hc88w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2744919e-82fb-4c3c-8776-c2c9c44af6e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afa525d43c3e5aafdbdf20fdac7eb16579649323ad50ae564f0266fb2e69ffd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swvcf\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hc88w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:20Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.590327 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4849c41-22e1-400e-8e11-096da49ef1b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc23369548789187b7057cd9780703dbc892dcf8aaaac969f6049a15ab0656e\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54b2a654066f14b0433198403ba42b83c61922de1055c94ba1d38aad9e0c2821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-2dj2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:20Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.606887 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jf557" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea42823f-6b19-439f-a280-80e5b1a816c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e6ec573b8a9fb1fd6ab7a370488b1b68024a033b0cfc728cf7e8288aacb4c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jf557\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:20Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.625673 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w8khj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee70c47a-d783-4a9f-8e94-5fc68eda69fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1540ed9540ce18c4ad3c5c7d82a3916528d100cc9d7ec1ec5758511f03e6f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ckns7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2f25f9baf4b018c73a5655ca3e595d123cd
0ffaae9060ea0a34148711b1be0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ckns7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:17:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-w8khj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:20Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.642788 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e04a2eadbde2c6c8df1913f15a1efbebef80b3e34c256f7ff9249479f19eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:20Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.658699 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61571dd78435a0514b8674028c7c467678e6ee4d1c6b334a98c6d1ddbf4871d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:20Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.671721 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.671817 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.671839 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.671905 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.671927 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:20Z","lastTransitionTime":"2025-10-07T08:17:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.675618 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-f4ls7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93fdeab4-b5d2-42d8-97ca-d5d61032e19f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:17:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-f4ls7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:20Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:20 crc 
kubenswrapper[5025]: I1007 08:17:20.775508 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.775588 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.775598 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.775618 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.775629 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:20Z","lastTransitionTime":"2025-10-07T08:17:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.878301 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.878383 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.878408 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.878438 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.878459 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:20Z","lastTransitionTime":"2025-10-07T08:17:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.913943 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4ls7" Oct 07 08:17:20 crc kubenswrapper[5025]: E1007 08:17:20.914189 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f4ls7" podUID="93fdeab4-b5d2-42d8-97ca-d5d61032e19f" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.981738 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.981798 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.981817 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.981843 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:20 crc kubenswrapper[5025]: I1007 08:17:20.981862 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:20Z","lastTransitionTime":"2025-10-07T08:17:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.084171 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.084230 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.084246 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.084305 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.084322 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:21Z","lastTransitionTime":"2025-10-07T08:17:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.188000 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.188038 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.188049 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.188071 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.188083 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:21Z","lastTransitionTime":"2025-10-07T08:17:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.291490 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.291529 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.291566 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.291625 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.291641 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:21Z","lastTransitionTime":"2025-10-07T08:17:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.350505 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dwm22_8b6b9c75-ecfe-4815-b279-bb56f57a82a8/ovnkube-controller/2.log" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.354032 5025 scope.go:117] "RemoveContainer" containerID="18be63a068d62a2b2d02653ce552ec9e0b8db3cf5b0be40238b373087e88b61e" Oct 07 08:17:21 crc kubenswrapper[5025]: E1007 08:17:21.354303 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-dwm22_openshift-ovn-kubernetes(8b6b9c75-ecfe-4815-b279-bb56f57a82a8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.373413 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e04a2eadbde2c6c8df1913f15a1efbebef80b3e34c256f7ff9249479f19eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:21Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.387053 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61571dd78435a0514b8674028c7c467678e6ee4d1c6b334a98c6d1ddbf4871d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:21Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.394707 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.394789 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.394811 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.395300 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.395356 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:21Z","lastTransitionTime":"2025-10-07T08:17:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.402873 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hc88w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2744919e-82fb-4c3c-8776-c2c9c44af6e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afa525d43c3e5aafdbdf20fdac7eb16579649323ad50ae564f0266fb2e69ffd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swvcf\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hc88w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:21Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.414626 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4849c41-22e1-400e-8e11-096da49ef1b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc23369548789187b7057cd9780703dbc892dcf8aaaac969f6049a15ab0656e\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54b2a654066f14b0433198403ba42b83c61922de1055c94ba1d38aad9e0c2821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-2dj2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:21Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.430093 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jf557" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea42823f-6b19-439f-a280-80e5b1a816c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e6ec573b8a9fb1fd6ab7a370488b1b68024a033b0cfc728cf7e8288aacb4c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jf557\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:21Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.448788 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w8khj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee70c47a-d783-4a9f-8e94-5fc68eda69fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1540ed9540ce18c4ad3c5c7d82a3916528d100cc9d7ec1ec5758511f03e6f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ckns7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2f25f9baf4b018c73a5655ca3e595d123cd
0ffaae9060ea0a34148711b1be0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ckns7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:17:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-w8khj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:21Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.463965 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-f4ls7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93fdeab4-b5d2-42d8-97ca-d5d61032e19f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:17:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-f4ls7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:21Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:21 crc 
kubenswrapper[5025]: I1007 08:17:21.483986 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:21Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.499660 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.499716 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.499739 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.499828 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.499855 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:21Z","lastTransitionTime":"2025-10-07T08:17:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.501947 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:21Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.521135 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmhw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34b07a69-1bbf-4019-b824-7b5be0f9404d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aee164f46abc9772ed12e1ba5e768ee44e25cc7444a58000075819f45cb4ba11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx7qx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:21Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.539629 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2383c50-e82f-4445-8d28-ec90e8d95c47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f31fa57b4eaaccec309a01396ec164dc6bda3e12a1086eda7a2d0eb3233eb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6e7353bb02996d8525befd61dad46a459fca7713e536cec9b1fb9f47d99d08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ceb048af5928a4b50cec5e93738471dd356f20cf57363fe085df0bb4216100f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://848397bbf66b4edb13083c81defed30b9ddaff1a1636e2f1fe09b25c8ff99d96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:21Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.557329 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://473d5cdd405367504396c404f60bbdecda93dbe6edbc8ddbfc18bbcc2fdc4d2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f423a0c357e0279ca41d112d6091586ed91d3826f4ae250ec2aa4973554968\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:21Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.580025 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://602fe6537fb9b526977c2d6ae30458382313c2b5f1506e6b7ffe684ffe5be2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d54834b1ba5650aa397affa83e56e871bbbea4b8c383627ae00bea648595864f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aca416ed3fc89df9948a2f7b02101dfb3bae8d8b22b7c4d270559a4cb6895dc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aca8d824b0935ba844137095ac849a38a15dc5c0dc808fae28afbff543f6429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e826406584133efe2a695c4c7ff16dc6919a1ba0cda077273f6e1eafbf0f5c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81fc1872da168050a2483e46a2ce55f2f745eca15c901a77783a830609f6c0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18be63a068d62a2b2d02653ce552ec9e0b8db3cf5b0be40238b373087e88b61e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18be63a068d62a2b2d02653ce552ec9e0b8db3cf5b0be40238b373087e88b61e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T08:17:19Z\\\",\\\"message\\\":\\\"lt network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default 
node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:19Z is after 2025-08-24T17:21:41Z]\\\\nI1007 08:17:19.781116 6730 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/control-plane-machine-set-operator\\\\\\\"}\\\\nI1007 08:17:19.781864 6730 services_controller.go:434] Service openshift-console/downloads retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{downloads openshift-console bbc81ad7-5d87-40bf-82c5-a4db2311cff9 12322 0 2025-02-23 05:39:22 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[] map[operator.openshift.io/spec-hash:41d6e4f36bf41ab5be57dec2289f1f8807bbed4b0f642342f213a53bb3ff4d6d] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePor\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:17:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dwm22_openshift-ovn-kubernetes(8b6b9c75-ecfe-4815-b279-bb56f57a82a8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26e633e0a41a5f44c577b8e1a9494d58bb84b49015a7612684fc81e5872fd20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9
e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dwm22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:21Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.596331 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:21Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.603778 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.603830 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.603847 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 
08:17:21.603872 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.603889 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:21Z","lastTransitionTime":"2025-10-07T08:17:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.615480 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l2k8t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab742e4ca12e3a7400e7592d95be4f82046c78d1056d6e1f06d261704286f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcab
b9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f66d58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f66d58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fff0f5b081b5a61c30757d6065a06224f7edcd0adfdab4ca6abfc6ea207a660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fff0f5b081b5a61c30757d6065a0
6224f7edcd0adfdab4ca6abfc6ea207a660\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:17:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:17:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l2k8t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:21Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.632993 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90a24131-77fe-4045-9cea-eebd7e122243\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efea2132bc68971348829ebb4222793ae3ccf57d5eeee316aa81292cbb051594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1c01fb63c054d4e9e2b12ab9d335d5c6a79c3c62c96bcf3bca6a5f9a4b750\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ce5e9618132944289f6a22c7aeaee88704f6352fe3461c55de21681f406662\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://167a11cb83ad658e06fbfebaa8e1675cec48026b094037f552cb4b86f3fe364e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b579d694e2e51fb1d49b8d690d572518d7fcd3476d7e6941323b70987b8edfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T08:16:53Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 08:16:47.574279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 08:16:47.577103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2852886688/tls.crt::/tmp/serving-cert-2852886688/tls.key\\\\\\\"\\\\nI1007 08:16:53.183291 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 08:16:53.189611 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 08:16:53.189643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 08:16:53.189670 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 08:16:53.189676 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 08:16:53.204072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 08:16:53.204130 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204143 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204157 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 08:16:53.204166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 08:16:53.204173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 08:16:53.204182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 08:16:53.204715 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1007 08:16:53.207373 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52afc5d765ed4544c55e6e522b190cb1c6788fc270ef55c8242f42b63c4d2702\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297
e58dd224b8cdb5927e39370c816fd40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:21Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.706057 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.706095 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.706104 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.706118 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.706127 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:21Z","lastTransitionTime":"2025-10-07T08:17:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.808896 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.808940 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.808957 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.808979 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.809029 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:21Z","lastTransitionTime":"2025-10-07T08:17:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.856653 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.869280 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.880971 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jf557" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea42823f-6b19-439f-a280-80e5b1a816c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e6ec573b8a9fb1fd6ab7a370488b1b68024a033b0cfc728cf7e8288aacb4c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nod
e-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jf557\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:21Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.901992 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w8khj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee70c47a-d783-4a9f-8e94-5fc68eda69fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1540ed9540ce18c4ad3c5c7d82a3916528d100cc9d7ec1ec5758511f03e6f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ckns7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2f25f9baf4b018c73a5655ca3e595d123cd
0ffaae9060ea0a34148711b1be0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ckns7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:17:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-w8khj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:21Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.912208 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.912264 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.912280 5025 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.912305 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.912322 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:21Z","lastTransitionTime":"2025-10-07T08:17:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.914525 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.914582 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:17:21 crc kubenswrapper[5025]: E1007 08:17:21.914701 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.914761 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:17:21 crc kubenswrapper[5025]: E1007 08:17:21.914870 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 08:17:21 crc kubenswrapper[5025]: E1007 08:17:21.914971 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.929061 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e04a2eadbde2c6c8df1913f15a1efbebef80b3e34c256f7ff9249479f19eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:21Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.949064 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61571dd78435a0514b8674028c7c467678e6ee4d1c6b334a98c6d1ddbf4871d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:21Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.964618 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hc88w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2744919e-82fb-4c3c-8776-c2c9c44af6e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afa525d43c3e5aafdbdf20fdac7eb16579649323ad50ae564f0266fb2e69ffd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-no
de-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swvcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hc88w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:21Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.982443 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4849c41-22e1-400e-8e11-096da49ef1b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc23369548789187b7057cd9780703dbc892dcf8aaaac969f6049a15ab0656e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54b2a654066f14b0433198403ba42b83c61922de
1055c94ba1d38aad9e0c2821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2dj2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:21Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:21 crc kubenswrapper[5025]: I1007 08:17:21.997932 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-f4ls7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93fdeab4-b5d2-42d8-97ca-d5d61032e19f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:17:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-f4ls7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:21Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:22 crc 
kubenswrapper[5025]: I1007 08:17:22.012983 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2383c50-e82f-4445-8d28-ec90e8d95c47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f31fa57b4eaaccec309a01396ec164dc6bda3e12a1086eda7a2d0eb3233eb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6e7353bb02996d8525befd61dad46a459fca7713e536cec9b1fb9f47d99d08\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ceb048af5928a4b50cec5e93738471dd356f20cf57363fe085df0bb4216100f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://848397bbf66b4edb13083c81defed30b9ddaff1a1636e2f1fe09b25c8ff99d96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:22Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.015328 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.015379 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.015401 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.015435 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.015458 5025 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:22Z","lastTransitionTime":"2025-10-07T08:17:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.034210 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:22Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.054991 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:22Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.077203 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmhw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34b07a69-1bbf-4019-b824-7b5be0f9404d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aee164f46abc9772ed12e1ba5e768ee44e25cc7444a58000075819f45cb4ba11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx7qx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:22Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.098300 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://473d5cdd405367504396c404f60bbdecda93dbe6edbc8ddbfc18bbcc2fdc4d2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f423a0c357e0279ca41d112d6091586ed91d3826f4ae250ec2aa4973554968\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:22Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.117955 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.118017 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.118042 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.118074 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.118100 5025 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:22Z","lastTransitionTime":"2025-10-07T08:17:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.125565 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://602fe6537fb9b526977c2d6ae30458382313c2b5f1506e6b7ffe684ffe5be2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d54834b1ba5650aa397affa83e56e871bbbea4b8c383627ae00bea648595864f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aca416ed3fc89df9948a2f7b02101dfb3bae8d8b22b7c4d270559a4cb6895dc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aca8d824b0935ba844137095ac849a38a15dc5c0dc808fae28afbff543f6429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e826406584133efe2a695c4c7ff16dc6919a1ba0cda077273f6e1eafbf0f5c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81fc1872da168050a2483e46a2ce55f2f745eca15c901a77783a830609f6c0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18be63a068d62a2b2d02653ce552ec9e0b8db3cf5b0be40238b373087e88b61e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18be63a068d62a2b2d02653ce552ec9e0b8db3cf5b0be40238b373087e88b61e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T08:17:19Z\\\",\\\"message\\\":\\\"lt network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default 
node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:19Z is after 2025-08-24T17:21:41Z]\\\\nI1007 08:17:19.781116 6730 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/control-plane-machine-set-operator\\\\\\\"}\\\\nI1007 08:17:19.781864 6730 services_controller.go:434] Service openshift-console/downloads retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{downloads openshift-console bbc81ad7-5d87-40bf-82c5-a4db2311cff9 12322 0 2025-02-23 05:39:22 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[] map[operator.openshift.io/spec-hash:41d6e4f36bf41ab5be57dec2289f1f8807bbed4b0f642342f213a53bb3ff4d6d] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePor\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:17:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dwm22_openshift-ovn-kubernetes(8b6b9c75-ecfe-4815-b279-bb56f57a82a8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26e633e0a41a5f44c577b8e1a9494d58bb84b49015a7612684fc81e5872fd20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9
e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dwm22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:22Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.147481 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90a24131-77fe-4045-9cea-eebd7e122243\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efea2132bc68971348829ebb4222793ae3ccf57d5eeee316aa81292cbb051594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1c01fb63c054d4e9e2b12ab9d335d5c6a79c3c62c96bcf3bca6a5f9a4b750\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ce5e9618132944289f6a22c7aeaee88704f6352fe3461c55de21681f406662\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://167a11cb83ad658e06fbfebaa8e1675cec48026b094037f552cb4b86f3fe364e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b579d694e2e51fb1d49b8d690d572518d7fcd3476d7e6941323b70987b8edfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T08:16:53Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 08:16:47.574279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 08:16:47.577103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2852886688/tls.crt::/tmp/serving-cert-2852886688/tls.key\\\\\\\"\\\\nI1007 08:16:53.183291 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 08:16:53.189611 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 08:16:53.189643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 08:16:53.189670 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 08:16:53.189676 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 08:16:53.204072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 08:16:53.204130 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204143 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204157 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 08:16:53.204166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 08:16:53.204173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 08:16:53.204182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 08:16:53.204715 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1007 08:16:53.207373 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52afc5d765ed4544c55e6e522b190cb1c6788fc270ef55c8242f42b63c4d2702\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297
e58dd224b8cdb5927e39370c816fd40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:22Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.163267 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:22Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.179572 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l2k8t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab742e4ca12e3a7400e7592d95be4f82046c78d1056d6e1f06d261704286f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f66d
58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f66d58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fff0f5b081b5a61c30757d6065a06224f7edcd0adfdab4ca6abfc6ea207a660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fff0f5b081b5a61c30757d6065a06224f7edcd0adfdab4ca6abfc6ea207a660\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:17:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:17:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l2k8t\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:22Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.221119 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.221155 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.221166 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.221183 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.221194 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:22Z","lastTransitionTime":"2025-10-07T08:17:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.323845 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.323906 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.323926 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.323950 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.323968 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:22Z","lastTransitionTime":"2025-10-07T08:17:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.427029 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.427092 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.427108 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.427133 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.427148 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:22Z","lastTransitionTime":"2025-10-07T08:17:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.530975 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.531033 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.531143 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.531170 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.531191 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:22Z","lastTransitionTime":"2025-10-07T08:17:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.633706 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.633746 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.633755 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.633770 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.633781 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:22Z","lastTransitionTime":"2025-10-07T08:17:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.736829 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.736875 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.736886 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.736903 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.736913 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:22Z","lastTransitionTime":"2025-10-07T08:17:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.839434 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.839480 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.839491 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.839507 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.839517 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:22Z","lastTransitionTime":"2025-10-07T08:17:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.914408 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4ls7" Oct 07 08:17:22 crc kubenswrapper[5025]: E1007 08:17:22.914864 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f4ls7" podUID="93fdeab4-b5d2-42d8-97ca-d5d61032e19f" Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.941957 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.942004 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.942015 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.942034 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:22 crc kubenswrapper[5025]: I1007 08:17:22.942048 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:22Z","lastTransitionTime":"2025-10-07T08:17:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.045603 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.045669 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.045688 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.045719 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.045738 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:23Z","lastTransitionTime":"2025-10-07T08:17:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.148628 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.148676 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.148690 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.148708 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.148721 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:23Z","lastTransitionTime":"2025-10-07T08:17:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.251610 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.251706 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.251718 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.251736 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.251748 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:23Z","lastTransitionTime":"2025-10-07T08:17:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.354881 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.354935 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.354945 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.354962 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.354974 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:23Z","lastTransitionTime":"2025-10-07T08:17:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.458124 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.458177 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.458189 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.458209 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.458223 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:23Z","lastTransitionTime":"2025-10-07T08:17:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.562471 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.562534 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.562589 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.562617 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.562637 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:23Z","lastTransitionTime":"2025-10-07T08:17:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.666018 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.666073 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.666088 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.666109 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.666126 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:23Z","lastTransitionTime":"2025-10-07T08:17:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.769151 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.769220 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.769245 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.769276 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.769297 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:23Z","lastTransitionTime":"2025-10-07T08:17:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.872302 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.872375 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.872401 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.872434 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.872457 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:23Z","lastTransitionTime":"2025-10-07T08:17:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.913988 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.914155 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:17:23 crc kubenswrapper[5025]: E1007 08:17:23.914329 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.914376 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 08:17:23 crc kubenswrapper[5025]: E1007 08:17:23.914601 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 08:17:23 crc kubenswrapper[5025]: E1007 08:17:23.914957 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.939419 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e04a2eadbde2c6c8df1913f15a1efbebef80b3e34c256f7ff9249479f19eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\"
:true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:23Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.962040 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61571dd78435a0514b8674028c7c467678e6ee4d1c6b334a98c6d1ddbf4871d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\
\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:23Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.975806 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.975858 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.975876 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.975899 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.975915 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:23Z","lastTransitionTime":"2025-10-07T08:17:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.979817 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hc88w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2744919e-82fb-4c3c-8776-c2c9c44af6e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afa525d43c3e5aafdbdf20fdac7eb16579649323ad50ae564f0266fb2e69ffd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swvcf\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hc88w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:23Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:23 crc kubenswrapper[5025]: I1007 08:17:23.994741 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4849c41-22e1-400e-8e11-096da49ef1b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc23369548789187b7057cd9780703dbc892dcf8aaaac969f6049a15ab0656e\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54b2a654066f14b0433198403ba42b83c61922de1055c94ba1d38aad9e0c2821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-2dj2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:23Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.006538 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jf557" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea42823f-6b19-439f-a280-80e5b1a816c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e6ec573b8a9fb1fd6ab7a370488b1b68024a033b0cfc728cf7e8288aacb4c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jf557\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:24Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.021836 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w8khj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee70c47a-d783-4a9f-8e94-5fc68eda69fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1540ed9540ce18c4ad3c5c7d82a3916528d100cc9d7ec1ec5758511f03e6f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ckns7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2f25f9baf4b018c73a5655ca3e595d123cd
0ffaae9060ea0a34148711b1be0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ckns7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:17:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-w8khj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:24Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.035906 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffe327a6-74bb-4da3-b864-93cc954bde0e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbd9a4d40a19f7571919228601897e666db895791868d65b27e90588add3daee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://919574d16a08760217aa219e673b02a059d96379aed81ad6248fb0623f672d06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd4a8d9d14761162acc1940fccd21269820427fce37c520cd9e5d6c0b7279e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d45b57ffbb81dd048d051a94a5c7371321a120b93c8b2b47643ff33acb73a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9d45b57ffbb81dd048d051a94a5c7371321a120b93c8b2b47643ff33acb73a79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:24Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.049137 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-f4ls7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93fdeab4-b5d2-42d8-97ca-d5d61032e19f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:17:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-f4ls7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:24Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.061483 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:24Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.077860 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.077931 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.077953 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.077991 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.078012 5025 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:24Z","lastTransitionTime":"2025-10-07T08:17:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.080382 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmhw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34b07a69-1bbf-4019-b824-7b5be0f9404d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aee164f46abc9772ed12e1ba5e768ee44e25cc7444a58000075819f45cb4ba11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx7qx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:
16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:24Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.094418 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2383c50-e82f-4445-8d28-ec90e8d95c47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f31fa57b4eaaccec309a01396ec164dc6bda3e12a1086eda7a2d0eb3233eb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6e7353bb02996d8525befd61dad46a459fca7713e536cec9b1fb9f47d99d08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ceb048af5928a4b50cec5e93738471dd356f20cf57363fe085df0bb4216100f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"
cri-o://848397bbf66b4edb13083c81defed30b9ddaff1a1636e2f1fe09b25c8ff99d96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:24Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.111442 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:24Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.141269 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://602fe6537fb9b526977c2d6ae30458382313c2b5f1506e6b7ffe684ffe5be2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d54834b1ba5650aa397affa83e56e871bbbea4b8c383627ae00bea648595864f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aca416ed3fc89df9948a2f7b02101dfb3bae8d8b22b7c4d270559a4cb6895dc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aca8d824b0935ba844137095ac849a38a15dc5c0dc808fae28afbff543f6429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e826406584133efe2a695c4c7ff16dc6919a1ba0cda077273f6e1eafbf0f5c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81fc1872da168050a2483e46a2ce55f2f745eca15c901a77783a830609f6c0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18be63a068d62a2b2d02653ce552ec9e0b8db3cf5b0be40238b373087e88b61e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18be63a068d62a2b2d02653ce552ec9e0b8db3cf5b0be40238b373087e88b61e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T08:17:19Z\\\",\\\"message\\\":\\\"lt network controller: unable to create admin network policy controller, err: could not 
add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:19Z is after 2025-08-24T17:21:41Z]\\\\nI1007 08:17:19.781116 6730 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/control-plane-machine-set-operator\\\\\\\"}\\\\nI1007 08:17:19.781864 6730 services_controller.go:434] Service openshift-console/downloads retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{downloads openshift-console bbc81ad7-5d87-40bf-82c5-a4db2311cff9 12322 0 2025-02-23 05:39:22 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[] map[operator.openshift.io/spec-hash:41d6e4f36bf41ab5be57dec2289f1f8807bbed4b0f642342f213a53bb3ff4d6d] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePor\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:17:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dwm22_openshift-ovn-kubernetes(8b6b9c75-ecfe-4815-b279-bb56f57a82a8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26e633e0a41a5f44c577b8e1a9494d58bb84b49015a7612684fc81e5872fd20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9
e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dwm22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:24Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.157669 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://473d5cdd405367504396c404f60bbdecda93dbe6edbc8ddbfc18bbcc2fdc4d2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f423a0c357e0279ca41d112d6091586ed91d3826f4ae250ec2aa4973554968\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:24Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.176499 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l2k8t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab742e4ca12e3a7400e7592d95be4f82046c78d1056d6e1f06d261704286f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f66d
58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f66d58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fff0f5b081b5a61c30757d6065a06224f7edcd0adfdab4ca6abfc6ea207a660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fff0f5b081b5a61c30757d6065a06224f7edcd0adfdab4ca6abfc6ea207a660\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:17:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:17:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l2k8t\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:24Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.179989 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.180029 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.180044 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.180062 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.180077 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:24Z","lastTransitionTime":"2025-10-07T08:17:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.195867 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90a24131-77fe-4045-9cea-eebd7e122243\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efea2132bc68971348829ebb4222793ae3ccf57d5eeee316aa81292cbb051594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1c01fb63c054d4e9e2b12ab9d335d5c6a79c3c62c96bcf3bca6a5f9a4b750\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ce5e9618132944289f6a22c7aeaee88704f6352fe3461c55de21681f406662\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://167a11cb83ad658e06fbfebaa8e1675cec48026b094037f552cb4b86f3fe364e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b579d694e2e51fb1d49b8d690d572518d7fcd3476d7e6941323b70987b8edfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 08:16:47.574279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 08:16:47.577103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2852886688/tls.crt::/tmp/serving-cert-2852886688/tls.key\\\\\\\"\\\\nI1007 08:16:53.183291 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 08:16:53.189611 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 08:16:53.189643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 08:16:53.189670 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 08:16:53.189676 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 08:16:53.204072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 08:16:53.204130 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204143 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204157 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 08:16:53.204166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 08:16:53.204173 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 08:16:53.204182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 08:16:53.204715 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 08:16:53.207373 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52afc5d765ed4544c55e6e522b190cb1c6788fc270ef55c8242f42b63c4d2702\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:24Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.208756 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:24Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.283624 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.283724 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:24 crc 
kubenswrapper[5025]: I1007 08:17:24.283745 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.283808 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.283829 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:24Z","lastTransitionTime":"2025-10-07T08:17:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.386907 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.386980 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.387002 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.387028 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.387046 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:24Z","lastTransitionTime":"2025-10-07T08:17:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.490430 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.490520 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.490559 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.490583 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.490600 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:24Z","lastTransitionTime":"2025-10-07T08:17:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.593722 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.593760 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.593771 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.593786 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.593798 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:24Z","lastTransitionTime":"2025-10-07T08:17:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.697811 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.697878 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.697902 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.697930 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.697953 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:24Z","lastTransitionTime":"2025-10-07T08:17:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.800559 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.800594 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.800603 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.800625 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.800635 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:24Z","lastTransitionTime":"2025-10-07T08:17:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.904159 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.904239 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.904260 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.904290 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.904311 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:24Z","lastTransitionTime":"2025-10-07T08:17:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.913659 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4ls7" Oct 07 08:17:24 crc kubenswrapper[5025]: E1007 08:17:24.913931 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f4ls7" podUID="93fdeab4-b5d2-42d8-97ca-d5d61032e19f" Oct 07 08:17:24 crc kubenswrapper[5025]: I1007 08:17:24.952937 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93fdeab4-b5d2-42d8-97ca-d5d61032e19f-metrics-certs\") pod \"network-metrics-daemon-f4ls7\" (UID: \"93fdeab4-b5d2-42d8-97ca-d5d61032e19f\") " pod="openshift-multus/network-metrics-daemon-f4ls7" Oct 07 08:17:24 crc kubenswrapper[5025]: E1007 08:17:24.953173 5025 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 08:17:24 crc kubenswrapper[5025]: E1007 08:17:24.953316 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93fdeab4-b5d2-42d8-97ca-d5d61032e19f-metrics-certs podName:93fdeab4-b5d2-42d8-97ca-d5d61032e19f nodeName:}" failed. No retries permitted until 2025-10-07 08:17:40.953269193 +0000 UTC m=+67.762583347 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/93fdeab4-b5d2-42d8-97ca-d5d61032e19f-metrics-certs") pod "network-metrics-daemon-f4ls7" (UID: "93fdeab4-b5d2-42d8-97ca-d5d61032e19f") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.007015 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.007085 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.007107 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.007132 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.007146 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:25Z","lastTransitionTime":"2025-10-07T08:17:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.110061 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.110126 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.110138 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.110159 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.110172 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:25Z","lastTransitionTime":"2025-10-07T08:17:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.212274 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.212356 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.212369 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.212388 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.212405 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:25Z","lastTransitionTime":"2025-10-07T08:17:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.315673 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.315753 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.315773 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.315799 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.315819 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:25Z","lastTransitionTime":"2025-10-07T08:17:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.418593 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.418623 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.418632 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.418647 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.418656 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:25Z","lastTransitionTime":"2025-10-07T08:17:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.521564 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.521657 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.521668 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.521685 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.521694 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:25Z","lastTransitionTime":"2025-10-07T08:17:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.632978 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.633040 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.633053 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.633073 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.633088 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:25Z","lastTransitionTime":"2025-10-07T08:17:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.661055 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:17:25 crc kubenswrapper[5025]: E1007 08:17:25.661245 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-07 08:17:57.661202575 +0000 UTC m=+84.470516759 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.661328 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.661496 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:17:25 crc kubenswrapper[5025]: E1007 08:17:25.661610 5025 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 08:17:25 crc kubenswrapper[5025]: E1007 08:17:25.661701 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-07 08:17:57.661676751 +0000 UTC m=+84.470990935 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 08:17:25 crc kubenswrapper[5025]: E1007 08:17:25.661720 5025 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 08:17:25 crc kubenswrapper[5025]: E1007 08:17:25.661811 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 08:17:57.661789654 +0000 UTC m=+84.471103798 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.736359 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.736399 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.736407 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.736426 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.736438 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:25Z","lastTransitionTime":"2025-10-07T08:17:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.762881 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.762957 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:17:25 crc kubenswrapper[5025]: E1007 08:17:25.763230 5025 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 08:17:25 crc kubenswrapper[5025]: E1007 08:17:25.763271 5025 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 08:17:25 crc kubenswrapper[5025]: E1007 08:17:25.763290 5025 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 08:17:25 crc kubenswrapper[5025]: E1007 08:17:25.763327 5025 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 08:17:25 crc 
kubenswrapper[5025]: E1007 08:17:25.763397 5025 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 08:17:25 crc kubenswrapper[5025]: E1007 08:17:25.763426 5025 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 08:17:25 crc kubenswrapper[5025]: E1007 08:17:25.763370 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 08:17:57.763347555 +0000 UTC m=+84.572661729 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 08:17:25 crc kubenswrapper[5025]: E1007 08:17:25.763717 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 08:17:57.763674426 +0000 UTC m=+84.572988660 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.839591 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.839665 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.839675 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.839690 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.839701 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:25Z","lastTransitionTime":"2025-10-07T08:17:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.914006 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.914096 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.914186 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 08:17:25 crc kubenswrapper[5025]: E1007 08:17:25.914490 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 07 08:17:25 crc kubenswrapper[5025]: E1007 08:17:25.914635 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 07 08:17:25 crc kubenswrapper[5025]: E1007 08:17:25.914716 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.941984 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.942048 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.942066 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.942092 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:25 crc kubenswrapper[5025]: I1007 08:17:25.942112 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:25Z","lastTransitionTime":"2025-10-07T08:17:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 08:17:26 crc kubenswrapper[5025]: I1007 08:17:26.045419 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:26 crc kubenswrapper[5025]: I1007 08:17:26.045482 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:26 crc kubenswrapper[5025]: I1007 08:17:26.045504 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:26 crc kubenswrapper[5025]: I1007 08:17:26.045611 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:26 crc kubenswrapper[5025]: I1007 08:17:26.045659 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:26Z","lastTransitionTime":"2025-10-07T08:17:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 08:17:26 crc kubenswrapper[5025]: I1007 08:17:26.148673 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:26 crc kubenswrapper[5025]: I1007 08:17:26.148719 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:26 crc kubenswrapper[5025]: I1007 08:17:26.148731 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:26 crc kubenswrapper[5025]: I1007 08:17:26.148749 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:26 crc kubenswrapper[5025]: I1007 08:17:26.148764 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:26Z","lastTransitionTime":"2025-10-07T08:17:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 08:17:26 crc kubenswrapper[5025]: I1007 08:17:26.251787 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:26 crc kubenswrapper[5025]: I1007 08:17:26.251822 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:26 crc kubenswrapper[5025]: I1007 08:17:26.251832 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:26 crc kubenswrapper[5025]: I1007 08:17:26.251845 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:26 crc kubenswrapper[5025]: I1007 08:17:26.251855 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:26Z","lastTransitionTime":"2025-10-07T08:17:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 08:17:26 crc kubenswrapper[5025]: I1007 08:17:26.354967 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:26 crc kubenswrapper[5025]: I1007 08:17:26.355002 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:26 crc kubenswrapper[5025]: I1007 08:17:26.355010 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:26 crc kubenswrapper[5025]: I1007 08:17:26.355023 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:26 crc kubenswrapper[5025]: I1007 08:17:26.355033 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:26Z","lastTransitionTime":"2025-10-07T08:17:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 08:17:26 crc kubenswrapper[5025]: I1007 08:17:26.457527 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:26 crc kubenswrapper[5025]: I1007 08:17:26.457604 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:26 crc kubenswrapper[5025]: I1007 08:17:26.457613 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:26 crc kubenswrapper[5025]: I1007 08:17:26.457626 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:26 crc kubenswrapper[5025]: I1007 08:17:26.457636 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:26Z","lastTransitionTime":"2025-10-07T08:17:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 08:17:26 crc kubenswrapper[5025]: I1007 08:17:26.560687 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:26 crc kubenswrapper[5025]: I1007 08:17:26.560756 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:26 crc kubenswrapper[5025]: I1007 08:17:26.560773 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:26 crc kubenswrapper[5025]: I1007 08:17:26.560797 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:26 crc kubenswrapper[5025]: I1007 08:17:26.560818 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:26Z","lastTransitionTime":"2025-10-07T08:17:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 08:17:26 crc kubenswrapper[5025]: I1007 08:17:26.663675 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:26 crc kubenswrapper[5025]: I1007 08:17:26.663723 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:26 crc kubenswrapper[5025]: I1007 08:17:26.663738 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:26 crc kubenswrapper[5025]: I1007 08:17:26.663760 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:26 crc kubenswrapper[5025]: I1007 08:17:26.663775 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:26Z","lastTransitionTime":"2025-10-07T08:17:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 08:17:26 crc kubenswrapper[5025]: I1007 08:17:26.767236 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:26 crc kubenswrapper[5025]: I1007 08:17:26.767305 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:26 crc kubenswrapper[5025]: I1007 08:17:26.767326 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:26 crc kubenswrapper[5025]: I1007 08:17:26.767356 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:26 crc kubenswrapper[5025]: I1007 08:17:26.767379 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:26Z","lastTransitionTime":"2025-10-07T08:17:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 08:17:26 crc kubenswrapper[5025]: I1007 08:17:26.870983 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:26 crc kubenswrapper[5025]: I1007 08:17:26.871158 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:26 crc kubenswrapper[5025]: I1007 08:17:26.871184 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:26 crc kubenswrapper[5025]: I1007 08:17:26.871212 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:26 crc kubenswrapper[5025]: I1007 08:17:26.871235 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:26Z","lastTransitionTime":"2025-10-07T08:17:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 08:17:26 crc kubenswrapper[5025]: I1007 08:17:26.914114 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4ls7"
Oct 07 08:17:26 crc kubenswrapper[5025]: E1007 08:17:26.914323 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f4ls7" podUID="93fdeab4-b5d2-42d8-97ca-d5d61032e19f"
Oct 07 08:17:26 crc kubenswrapper[5025]: I1007 08:17:26.974223 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:26 crc kubenswrapper[5025]: I1007 08:17:26.974297 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:26 crc kubenswrapper[5025]: I1007 08:17:26.974320 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:26 crc kubenswrapper[5025]: I1007 08:17:26.974348 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:26 crc kubenswrapper[5025]: I1007 08:17:26.974368 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:26Z","lastTransitionTime":"2025-10-07T08:17:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 08:17:27 crc kubenswrapper[5025]: I1007 08:17:27.078015 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:27 crc kubenswrapper[5025]: I1007 08:17:27.078102 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:27 crc kubenswrapper[5025]: I1007 08:17:27.078123 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:27 crc kubenswrapper[5025]: I1007 08:17:27.078151 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:27 crc kubenswrapper[5025]: I1007 08:17:27.078167 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:27Z","lastTransitionTime":"2025-10-07T08:17:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 08:17:27 crc kubenswrapper[5025]: I1007 08:17:27.182534 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:27 crc kubenswrapper[5025]: I1007 08:17:27.182649 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:27 crc kubenswrapper[5025]: I1007 08:17:27.182671 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:27 crc kubenswrapper[5025]: I1007 08:17:27.182705 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:27 crc kubenswrapper[5025]: I1007 08:17:27.182733 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:27Z","lastTransitionTime":"2025-10-07T08:17:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 08:17:27 crc kubenswrapper[5025]: I1007 08:17:27.285747 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:27 crc kubenswrapper[5025]: I1007 08:17:27.285837 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:27 crc kubenswrapper[5025]: I1007 08:17:27.285850 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:27 crc kubenswrapper[5025]: I1007 08:17:27.285868 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:27 crc kubenswrapper[5025]: I1007 08:17:27.285886 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:27Z","lastTransitionTime":"2025-10-07T08:17:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 08:17:27 crc kubenswrapper[5025]: I1007 08:17:27.388955 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:27 crc kubenswrapper[5025]: I1007 08:17:27.388996 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:27 crc kubenswrapper[5025]: I1007 08:17:27.389005 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:27 crc kubenswrapper[5025]: I1007 08:17:27.389019 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:27 crc kubenswrapper[5025]: I1007 08:17:27.389030 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:27Z","lastTransitionTime":"2025-10-07T08:17:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 08:17:27 crc kubenswrapper[5025]: I1007 08:17:27.492345 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:27 crc kubenswrapper[5025]: I1007 08:17:27.492409 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:27 crc kubenswrapper[5025]: I1007 08:17:27.492421 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:27 crc kubenswrapper[5025]: I1007 08:17:27.492439 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:27 crc kubenswrapper[5025]: I1007 08:17:27.492449 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:27Z","lastTransitionTime":"2025-10-07T08:17:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 08:17:27 crc kubenswrapper[5025]: I1007 08:17:27.595739 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:27 crc kubenswrapper[5025]: I1007 08:17:27.595807 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:27 crc kubenswrapper[5025]: I1007 08:17:27.595848 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:27 crc kubenswrapper[5025]: I1007 08:17:27.595883 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:27 crc kubenswrapper[5025]: I1007 08:17:27.595906 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:27Z","lastTransitionTime":"2025-10-07T08:17:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 08:17:27 crc kubenswrapper[5025]: I1007 08:17:27.699088 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:27 crc kubenswrapper[5025]: I1007 08:17:27.699159 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:27 crc kubenswrapper[5025]: I1007 08:17:27.699178 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:27 crc kubenswrapper[5025]: I1007 08:17:27.699207 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:27 crc kubenswrapper[5025]: I1007 08:17:27.699226 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:27Z","lastTransitionTime":"2025-10-07T08:17:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 08:17:27 crc kubenswrapper[5025]: I1007 08:17:27.803285 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:27 crc kubenswrapper[5025]: I1007 08:17:27.803371 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:27 crc kubenswrapper[5025]: I1007 08:17:27.803393 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:27 crc kubenswrapper[5025]: I1007 08:17:27.803489 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:27 crc kubenswrapper[5025]: I1007 08:17:27.803535 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:27Z","lastTransitionTime":"2025-10-07T08:17:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 08:17:27 crc kubenswrapper[5025]: I1007 08:17:27.907116 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:27 crc kubenswrapper[5025]: I1007 08:17:27.907175 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:27 crc kubenswrapper[5025]: I1007 08:17:27.907186 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:27 crc kubenswrapper[5025]: I1007 08:17:27.907206 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:27 crc kubenswrapper[5025]: I1007 08:17:27.907220 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:27Z","lastTransitionTime":"2025-10-07T08:17:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 08:17:27 crc kubenswrapper[5025]: I1007 08:17:27.918687 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 08:17:27 crc kubenswrapper[5025]: I1007 08:17:27.918716 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 08:17:27 crc kubenswrapper[5025]: I1007 08:17:27.918750 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 08:17:27 crc kubenswrapper[5025]: E1007 08:17:27.918823 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 07 08:17:27 crc kubenswrapper[5025]: E1007 08:17:27.919020 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 07 08:17:27 crc kubenswrapper[5025]: E1007 08:17:27.919155 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 07 08:17:28 crc kubenswrapper[5025]: I1007 08:17:28.010265 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:28 crc kubenswrapper[5025]: I1007 08:17:28.010317 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:28 crc kubenswrapper[5025]: I1007 08:17:28.010326 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:28 crc kubenswrapper[5025]: I1007 08:17:28.010348 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:28 crc kubenswrapper[5025]: I1007 08:17:28.010359 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:28Z","lastTransitionTime":"2025-10-07T08:17:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 08:17:28 crc kubenswrapper[5025]: I1007 08:17:28.113473 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:28 crc kubenswrapper[5025]: I1007 08:17:28.113578 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:28 crc kubenswrapper[5025]: I1007 08:17:28.113598 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:28 crc kubenswrapper[5025]: I1007 08:17:28.113625 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:28 crc kubenswrapper[5025]: I1007 08:17:28.113645 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:28Z","lastTransitionTime":"2025-10-07T08:17:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 08:17:28 crc kubenswrapper[5025]: I1007 08:17:28.216408 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:28 crc kubenswrapper[5025]: I1007 08:17:28.216490 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:28 crc kubenswrapper[5025]: I1007 08:17:28.216509 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:28 crc kubenswrapper[5025]: I1007 08:17:28.216533 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:28 crc kubenswrapper[5025]: I1007 08:17:28.216648 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:28Z","lastTransitionTime":"2025-10-07T08:17:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 08:17:28 crc kubenswrapper[5025]: I1007 08:17:28.319299 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:28 crc kubenswrapper[5025]: I1007 08:17:28.319382 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:28 crc kubenswrapper[5025]: I1007 08:17:28.319399 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:28 crc kubenswrapper[5025]: I1007 08:17:28.319422 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:28 crc kubenswrapper[5025]: I1007 08:17:28.319438 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:28Z","lastTransitionTime":"2025-10-07T08:17:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 08:17:28 crc kubenswrapper[5025]: I1007 08:17:28.422474 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:28 crc kubenswrapper[5025]: I1007 08:17:28.422521 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:28 crc kubenswrapper[5025]: I1007 08:17:28.422573 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:28 crc kubenswrapper[5025]: I1007 08:17:28.422596 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:28 crc kubenswrapper[5025]: I1007 08:17:28.422609 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:28Z","lastTransitionTime":"2025-10-07T08:17:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 08:17:28 crc kubenswrapper[5025]: I1007 08:17:28.525582 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:28 crc kubenswrapper[5025]: I1007 08:17:28.525663 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:28 crc kubenswrapper[5025]: I1007 08:17:28.525692 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:28 crc kubenswrapper[5025]: I1007 08:17:28.525729 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:28 crc kubenswrapper[5025]: I1007 08:17:28.525752 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:28Z","lastTransitionTime":"2025-10-07T08:17:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 08:17:28 crc kubenswrapper[5025]: I1007 08:17:28.628779 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:28 crc kubenswrapper[5025]: I1007 08:17:28.628854 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:28 crc kubenswrapper[5025]: I1007 08:17:28.628872 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:28 crc kubenswrapper[5025]: I1007 08:17:28.628901 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:28 crc kubenswrapper[5025]: I1007 08:17:28.628922 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:28Z","lastTransitionTime":"2025-10-07T08:17:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Oct 07 08:17:28 crc kubenswrapper[5025]: I1007 08:17:28.732583 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:28 crc kubenswrapper[5025]: I1007 08:17:28.732632 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:28 crc kubenswrapper[5025]: I1007 08:17:28.732644 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:28 crc kubenswrapper[5025]: I1007 08:17:28.732665 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:28 crc kubenswrapper[5025]: I1007 08:17:28.732679 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:28Z","lastTransitionTime":"2025-10-07T08:17:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:28 crc kubenswrapper[5025]: I1007 08:17:28.835895 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:28 crc kubenswrapper[5025]: I1007 08:17:28.835932 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:28 crc kubenswrapper[5025]: I1007 08:17:28.835944 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:28 crc kubenswrapper[5025]: I1007 08:17:28.835963 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:28 crc kubenswrapper[5025]: I1007 08:17:28.835977 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:28Z","lastTransitionTime":"2025-10-07T08:17:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:28 crc kubenswrapper[5025]: I1007 08:17:28.914089 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4ls7" Oct 07 08:17:28 crc kubenswrapper[5025]: E1007 08:17:28.914231 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f4ls7" podUID="93fdeab4-b5d2-42d8-97ca-d5d61032e19f" Oct 07 08:17:28 crc kubenswrapper[5025]: I1007 08:17:28.938334 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:28 crc kubenswrapper[5025]: I1007 08:17:28.938382 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:28 crc kubenswrapper[5025]: I1007 08:17:28.938394 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:28 crc kubenswrapper[5025]: I1007 08:17:28.938412 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:28 crc kubenswrapper[5025]: I1007 08:17:28.938423 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:28Z","lastTransitionTime":"2025-10-07T08:17:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:29 crc kubenswrapper[5025]: I1007 08:17:29.042850 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:29 crc kubenswrapper[5025]: I1007 08:17:29.042894 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:29 crc kubenswrapper[5025]: I1007 08:17:29.042905 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:29 crc kubenswrapper[5025]: I1007 08:17:29.042920 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:29 crc kubenswrapper[5025]: I1007 08:17:29.042931 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:29Z","lastTransitionTime":"2025-10-07T08:17:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:29 crc kubenswrapper[5025]: I1007 08:17:29.146054 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:29 crc kubenswrapper[5025]: I1007 08:17:29.146110 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:29 crc kubenswrapper[5025]: I1007 08:17:29.146128 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:29 crc kubenswrapper[5025]: I1007 08:17:29.146152 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:29 crc kubenswrapper[5025]: I1007 08:17:29.146168 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:29Z","lastTransitionTime":"2025-10-07T08:17:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:29 crc kubenswrapper[5025]: I1007 08:17:29.249509 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:29 crc kubenswrapper[5025]: I1007 08:17:29.249576 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:29 crc kubenswrapper[5025]: I1007 08:17:29.249589 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:29 crc kubenswrapper[5025]: I1007 08:17:29.249610 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:29 crc kubenswrapper[5025]: I1007 08:17:29.249622 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:29Z","lastTransitionTime":"2025-10-07T08:17:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:29 crc kubenswrapper[5025]: I1007 08:17:29.352425 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:29 crc kubenswrapper[5025]: I1007 08:17:29.352486 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:29 crc kubenswrapper[5025]: I1007 08:17:29.352503 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:29 crc kubenswrapper[5025]: I1007 08:17:29.352529 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:29 crc kubenswrapper[5025]: I1007 08:17:29.352576 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:29Z","lastTransitionTime":"2025-10-07T08:17:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:29 crc kubenswrapper[5025]: I1007 08:17:29.455406 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:29 crc kubenswrapper[5025]: I1007 08:17:29.455453 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:29 crc kubenswrapper[5025]: I1007 08:17:29.455465 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:29 crc kubenswrapper[5025]: I1007 08:17:29.455482 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:29 crc kubenswrapper[5025]: I1007 08:17:29.455496 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:29Z","lastTransitionTime":"2025-10-07T08:17:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:29 crc kubenswrapper[5025]: I1007 08:17:29.558719 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:29 crc kubenswrapper[5025]: I1007 08:17:29.558777 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:29 crc kubenswrapper[5025]: I1007 08:17:29.558793 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:29 crc kubenswrapper[5025]: I1007 08:17:29.558817 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:29 crc kubenswrapper[5025]: I1007 08:17:29.558836 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:29Z","lastTransitionTime":"2025-10-07T08:17:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:29 crc kubenswrapper[5025]: I1007 08:17:29.661812 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:29 crc kubenswrapper[5025]: I1007 08:17:29.661856 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:29 crc kubenswrapper[5025]: I1007 08:17:29.661867 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:29 crc kubenswrapper[5025]: I1007 08:17:29.661884 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:29 crc kubenswrapper[5025]: I1007 08:17:29.661896 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:29Z","lastTransitionTime":"2025-10-07T08:17:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:29 crc kubenswrapper[5025]: I1007 08:17:29.765075 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:29 crc kubenswrapper[5025]: I1007 08:17:29.765126 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:29 crc kubenswrapper[5025]: I1007 08:17:29.765142 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:29 crc kubenswrapper[5025]: I1007 08:17:29.765166 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:29 crc kubenswrapper[5025]: I1007 08:17:29.765182 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:29Z","lastTransitionTime":"2025-10-07T08:17:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:29 crc kubenswrapper[5025]: I1007 08:17:29.868672 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:29 crc kubenswrapper[5025]: I1007 08:17:29.868719 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:29 crc kubenswrapper[5025]: I1007 08:17:29.868734 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:29 crc kubenswrapper[5025]: I1007 08:17:29.868758 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:29 crc kubenswrapper[5025]: I1007 08:17:29.868774 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:29Z","lastTransitionTime":"2025-10-07T08:17:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:29 crc kubenswrapper[5025]: I1007 08:17:29.913528 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 08:17:29 crc kubenswrapper[5025]: I1007 08:17:29.913674 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:17:29 crc kubenswrapper[5025]: I1007 08:17:29.913816 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:17:29 crc kubenswrapper[5025]: E1007 08:17:29.913799 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 08:17:29 crc kubenswrapper[5025]: E1007 08:17:29.914045 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 08:17:29 crc kubenswrapper[5025]: E1007 08:17:29.914160 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 08:17:29 crc kubenswrapper[5025]: I1007 08:17:29.972331 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:29 crc kubenswrapper[5025]: I1007 08:17:29.972397 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:29 crc kubenswrapper[5025]: I1007 08:17:29.972415 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:29 crc kubenswrapper[5025]: I1007 08:17:29.972444 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:29 crc kubenswrapper[5025]: I1007 08:17:29.972465 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:29Z","lastTransitionTime":"2025-10-07T08:17:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.074977 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.075039 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.075057 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.075081 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.075101 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:30Z","lastTransitionTime":"2025-10-07T08:17:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.178125 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.178151 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.178159 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.178172 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.178181 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:30Z","lastTransitionTime":"2025-10-07T08:17:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.281573 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.281607 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.281615 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.281630 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.281640 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:30Z","lastTransitionTime":"2025-10-07T08:17:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.386883 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.386942 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.386962 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.386988 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.387006 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:30Z","lastTransitionTime":"2025-10-07T08:17:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.489670 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.489706 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.489713 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.489727 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.489735 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:30Z","lastTransitionTime":"2025-10-07T08:17:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.518219 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.518280 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.518297 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.518324 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.518346 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:30Z","lastTransitionTime":"2025-10-07T08:17:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:30 crc kubenswrapper[5025]: E1007 08:17:30.532474 5025 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"15feb688-c1d9-4e7b-b633-9a128b7afc98\\\",\\\"systemUUID\\\":\\\"18315730-39a6-4b53-82b9-587e1e3a7adc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:30Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.537150 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.537231 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.537258 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.537304 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.537327 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:30Z","lastTransitionTime":"2025-10-07T08:17:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:30 crc kubenswrapper[5025]: E1007 08:17:30.551962 5025 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"15feb688-c1d9-4e7b-b633-9a128b7afc98\\\",\\\"systemUUID\\\":\\\"18315730-39a6-4b53-82b9-587e1e3a7adc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:30Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.556805 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.556868 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.556885 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.556915 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.556933 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:30Z","lastTransitionTime":"2025-10-07T08:17:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:30 crc kubenswrapper[5025]: E1007 08:17:30.576012 5025 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"15feb688-c1d9-4e7b-b633-9a128b7afc98\\\",\\\"systemUUID\\\":\\\"18315730-39a6-4b53-82b9-587e1e3a7adc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:30Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.580269 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.580324 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.580343 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.580373 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.580394 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:30Z","lastTransitionTime":"2025-10-07T08:17:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:30 crc kubenswrapper[5025]: E1007 08:17:30.596301 5025 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"15feb688-c1d9-4e7b-b633-9a128b7afc98\\\",\\\"systemUUID\\\":\\\"18315730-39a6-4b53-82b9-587e1e3a7adc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:30Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.600579 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.600633 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.600649 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.600673 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.600689 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:30Z","lastTransitionTime":"2025-10-07T08:17:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:30 crc kubenswrapper[5025]: E1007 08:17:30.615320 5025 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"15feb688-c1d9-4e7b-b633-9a128b7afc98\\\",\\\"systemUUID\\\":\\\"18315730-39a6-4b53-82b9-587e1e3a7adc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:30Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:30 crc kubenswrapper[5025]: E1007 08:17:30.615495 5025 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.617885 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.617918 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.617928 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.617944 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.617953 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:30Z","lastTransitionTime":"2025-10-07T08:17:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.720888 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.720938 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.720949 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.720965 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.720974 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:30Z","lastTransitionTime":"2025-10-07T08:17:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.823492 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.823575 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.823589 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.823618 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.823637 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:30Z","lastTransitionTime":"2025-10-07T08:17:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.914071 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4ls7"
Oct 07 08:17:30 crc kubenswrapper[5025]: E1007 08:17:30.914352 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f4ls7" podUID="93fdeab4-b5d2-42d8-97ca-d5d61032e19f"
Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.926607 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.926647 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.926676 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.926697 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:30 crc kubenswrapper[5025]: I1007 08:17:30.926713 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:30Z","lastTransitionTime":"2025-10-07T08:17:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 08:17:31 crc kubenswrapper[5025]: I1007 08:17:31.029868 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:31 crc kubenswrapper[5025]: I1007 08:17:31.029957 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:31 crc kubenswrapper[5025]: I1007 08:17:31.029979 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:31 crc kubenswrapper[5025]: I1007 08:17:31.030014 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:31 crc kubenswrapper[5025]: I1007 08:17:31.030035 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:31Z","lastTransitionTime":"2025-10-07T08:17:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 08:17:31 crc kubenswrapper[5025]: I1007 08:17:31.133123 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:31 crc kubenswrapper[5025]: I1007 08:17:31.133178 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:31 crc kubenswrapper[5025]: I1007 08:17:31.133193 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:31 crc kubenswrapper[5025]: I1007 08:17:31.133215 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:31 crc kubenswrapper[5025]: I1007 08:17:31.133233 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:31Z","lastTransitionTime":"2025-10-07T08:17:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 08:17:31 crc kubenswrapper[5025]: I1007 08:17:31.240779 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:31 crc kubenswrapper[5025]: I1007 08:17:31.241434 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:31 crc kubenswrapper[5025]: I1007 08:17:31.241459 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:31 crc kubenswrapper[5025]: I1007 08:17:31.241481 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:31 crc kubenswrapper[5025]: I1007 08:17:31.241494 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:31Z","lastTransitionTime":"2025-10-07T08:17:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 08:17:31 crc kubenswrapper[5025]: I1007 08:17:31.344250 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:31 crc kubenswrapper[5025]: I1007 08:17:31.344291 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:31 crc kubenswrapper[5025]: I1007 08:17:31.344300 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:31 crc kubenswrapper[5025]: I1007 08:17:31.344313 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:31 crc kubenswrapper[5025]: I1007 08:17:31.344323 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:31Z","lastTransitionTime":"2025-10-07T08:17:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 08:17:31 crc kubenswrapper[5025]: I1007 08:17:31.447661 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:31 crc kubenswrapper[5025]: I1007 08:17:31.447863 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:31 crc kubenswrapper[5025]: I1007 08:17:31.447880 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:31 crc kubenswrapper[5025]: I1007 08:17:31.447899 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:31 crc kubenswrapper[5025]: I1007 08:17:31.447914 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:31Z","lastTransitionTime":"2025-10-07T08:17:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 08:17:31 crc kubenswrapper[5025]: I1007 08:17:31.550639 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:31 crc kubenswrapper[5025]: I1007 08:17:31.550686 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:31 crc kubenswrapper[5025]: I1007 08:17:31.550697 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:31 crc kubenswrapper[5025]: I1007 08:17:31.550714 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:31 crc kubenswrapper[5025]: I1007 08:17:31.550726 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:31Z","lastTransitionTime":"2025-10-07T08:17:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 08:17:31 crc kubenswrapper[5025]: I1007 08:17:31.653108 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:31 crc kubenswrapper[5025]: I1007 08:17:31.653152 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:31 crc kubenswrapper[5025]: I1007 08:17:31.653172 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:31 crc kubenswrapper[5025]: I1007 08:17:31.653203 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:31 crc kubenswrapper[5025]: I1007 08:17:31.653227 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:31Z","lastTransitionTime":"2025-10-07T08:17:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 08:17:31 crc kubenswrapper[5025]: I1007 08:17:31.756660 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:31 crc kubenswrapper[5025]: I1007 08:17:31.756713 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:31 crc kubenswrapper[5025]: I1007 08:17:31.756722 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:31 crc kubenswrapper[5025]: I1007 08:17:31.756736 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:31 crc kubenswrapper[5025]: I1007 08:17:31.756964 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:31Z","lastTransitionTime":"2025-10-07T08:17:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 08:17:31 crc kubenswrapper[5025]: I1007 08:17:31.860863 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:31 crc kubenswrapper[5025]: I1007 08:17:31.860920 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:31 crc kubenswrapper[5025]: I1007 08:17:31.860939 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:31 crc kubenswrapper[5025]: I1007 08:17:31.860959 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:31 crc kubenswrapper[5025]: I1007 08:17:31.860974 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:31Z","lastTransitionTime":"2025-10-07T08:17:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 08:17:31 crc kubenswrapper[5025]: I1007 08:17:31.914052 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 08:17:31 crc kubenswrapper[5025]: E1007 08:17:31.914216 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 07 08:17:31 crc kubenswrapper[5025]: I1007 08:17:31.914051 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 08:17:31 crc kubenswrapper[5025]: I1007 08:17:31.914278 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 08:17:31 crc kubenswrapper[5025]: E1007 08:17:31.914424 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 07 08:17:31 crc kubenswrapper[5025]: E1007 08:17:31.914578 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 07 08:17:31 crc kubenswrapper[5025]: I1007 08:17:31.963513 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:31 crc kubenswrapper[5025]: I1007 08:17:31.963600 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:31 crc kubenswrapper[5025]: I1007 08:17:31.963612 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:31 crc kubenswrapper[5025]: I1007 08:17:31.963629 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:31 crc kubenswrapper[5025]: I1007 08:17:31.963643 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:31Z","lastTransitionTime":"2025-10-07T08:17:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 08:17:32 crc kubenswrapper[5025]: I1007 08:17:32.066150 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:32 crc kubenswrapper[5025]: I1007 08:17:32.066209 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:32 crc kubenswrapper[5025]: I1007 08:17:32.066226 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:32 crc kubenswrapper[5025]: I1007 08:17:32.066245 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:32 crc kubenswrapper[5025]: I1007 08:17:32.066261 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:32Z","lastTransitionTime":"2025-10-07T08:17:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 08:17:32 crc kubenswrapper[5025]: I1007 08:17:32.169067 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:32 crc kubenswrapper[5025]: I1007 08:17:32.169121 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:32 crc kubenswrapper[5025]: I1007 08:17:32.169133 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:32 crc kubenswrapper[5025]: I1007 08:17:32.169149 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:32 crc kubenswrapper[5025]: I1007 08:17:32.169161 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:32Z","lastTransitionTime":"2025-10-07T08:17:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 08:17:32 crc kubenswrapper[5025]: I1007 08:17:32.271784 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:32 crc kubenswrapper[5025]: I1007 08:17:32.271850 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:32 crc kubenswrapper[5025]: I1007 08:17:32.271869 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:32 crc kubenswrapper[5025]: I1007 08:17:32.271891 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:32 crc kubenswrapper[5025]: I1007 08:17:32.271906 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:32Z","lastTransitionTime":"2025-10-07T08:17:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 08:17:32 crc kubenswrapper[5025]: I1007 08:17:32.374331 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:32 crc kubenswrapper[5025]: I1007 08:17:32.374398 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:32 crc kubenswrapper[5025]: I1007 08:17:32.374416 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:32 crc kubenswrapper[5025]: I1007 08:17:32.374439 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:32 crc kubenswrapper[5025]: I1007 08:17:32.374459 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:32Z","lastTransitionTime":"2025-10-07T08:17:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 08:17:32 crc kubenswrapper[5025]: I1007 08:17:32.477092 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:32 crc kubenswrapper[5025]: I1007 08:17:32.477160 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:32 crc kubenswrapper[5025]: I1007 08:17:32.477178 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:32 crc kubenswrapper[5025]: I1007 08:17:32.477205 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:32 crc kubenswrapper[5025]: I1007 08:17:32.477226 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:32Z","lastTransitionTime":"2025-10-07T08:17:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 08:17:32 crc kubenswrapper[5025]: I1007 08:17:32.580950 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:32 crc kubenswrapper[5025]: I1007 08:17:32.580997 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:32 crc kubenswrapper[5025]: I1007 08:17:32.581008 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:32 crc kubenswrapper[5025]: I1007 08:17:32.581026 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:32 crc kubenswrapper[5025]: I1007 08:17:32.581037 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:32Z","lastTransitionTime":"2025-10-07T08:17:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 08:17:32 crc kubenswrapper[5025]: I1007 08:17:32.683949 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:32 crc kubenswrapper[5025]: I1007 08:17:32.683997 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:32 crc kubenswrapper[5025]: I1007 08:17:32.684007 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:32 crc kubenswrapper[5025]: I1007 08:17:32.684022 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:32 crc kubenswrapper[5025]: I1007 08:17:32.684033 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:32Z","lastTransitionTime":"2025-10-07T08:17:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 08:17:32 crc kubenswrapper[5025]: I1007 08:17:32.786611 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:32 crc kubenswrapper[5025]: I1007 08:17:32.786685 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:32 crc kubenswrapper[5025]: I1007 08:17:32.786705 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:32 crc kubenswrapper[5025]: I1007 08:17:32.786756 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:32 crc kubenswrapper[5025]: I1007 08:17:32.786780 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:32Z","lastTransitionTime":"2025-10-07T08:17:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 08:17:32 crc kubenswrapper[5025]: I1007 08:17:32.889379 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:32 crc kubenswrapper[5025]: I1007 08:17:32.889483 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:32 crc kubenswrapper[5025]: I1007 08:17:32.889505 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:32 crc kubenswrapper[5025]: I1007 08:17:32.889526 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:32 crc kubenswrapper[5025]: I1007 08:17:32.889569 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:32Z","lastTransitionTime":"2025-10-07T08:17:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 08:17:32 crc kubenswrapper[5025]: I1007 08:17:32.914211 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4ls7"
Oct 07 08:17:32 crc kubenswrapper[5025]: E1007 08:17:32.914365 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f4ls7" podUID="93fdeab4-b5d2-42d8-97ca-d5d61032e19f"
Oct 07 08:17:32 crc kubenswrapper[5025]: I1007 08:17:32.992714 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:32 crc kubenswrapper[5025]: I1007 08:17:32.992752 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:32 crc kubenswrapper[5025]: I1007 08:17:32.992761 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:32 crc kubenswrapper[5025]: I1007 08:17:32.992776 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:32 crc kubenswrapper[5025]: I1007 08:17:32.992786 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:32Z","lastTransitionTime":"2025-10-07T08:17:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 08:17:33 crc kubenswrapper[5025]: I1007 08:17:33.095966 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:33 crc kubenswrapper[5025]: I1007 08:17:33.096014 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:33 crc kubenswrapper[5025]: I1007 08:17:33.096031 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:33 crc kubenswrapper[5025]: I1007 08:17:33.096052 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:33 crc kubenswrapper[5025]: I1007 08:17:33.096070 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:33Z","lastTransitionTime":"2025-10-07T08:17:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 08:17:33 crc kubenswrapper[5025]: I1007 08:17:33.198869 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:33 crc kubenswrapper[5025]: I1007 08:17:33.198912 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:33 crc kubenswrapper[5025]: I1007 08:17:33.198924 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:33 crc kubenswrapper[5025]: I1007 08:17:33.198944 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:33 crc kubenswrapper[5025]: I1007 08:17:33.198955 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:33Z","lastTransitionTime":"2025-10-07T08:17:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 08:17:33 crc kubenswrapper[5025]: I1007 08:17:33.302041 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:33 crc kubenswrapper[5025]: I1007 08:17:33.302112 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:33 crc kubenswrapper[5025]: I1007 08:17:33.302130 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:33 crc kubenswrapper[5025]: I1007 08:17:33.302154 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:33 crc kubenswrapper[5025]: I1007 08:17:33.302172 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:33Z","lastTransitionTime":"2025-10-07T08:17:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 08:17:33 crc kubenswrapper[5025]: I1007 08:17:33.404197 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:33 crc kubenswrapper[5025]: I1007 08:17:33.404262 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:33 crc kubenswrapper[5025]: I1007 08:17:33.404279 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:33 crc kubenswrapper[5025]: I1007 08:17:33.404304 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:33 crc kubenswrapper[5025]: I1007 08:17:33.404322 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:33Z","lastTransitionTime":"2025-10-07T08:17:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 08:17:33 crc kubenswrapper[5025]: I1007 08:17:33.507652 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:33 crc kubenswrapper[5025]: I1007 08:17:33.507724 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:33 crc kubenswrapper[5025]: I1007 08:17:33.507752 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:33 crc kubenswrapper[5025]: I1007 08:17:33.507834 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:33 crc kubenswrapper[5025]: I1007 08:17:33.507856 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:33Z","lastTransitionTime":"2025-10-07T08:17:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 08:17:33 crc kubenswrapper[5025]: I1007 08:17:33.610271 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:33 crc kubenswrapper[5025]: I1007 08:17:33.610306 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:33 crc kubenswrapper[5025]: I1007 08:17:33.610315 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:33 crc kubenswrapper[5025]: I1007 08:17:33.610507 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:33 crc kubenswrapper[5025]: I1007 08:17:33.610517 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:33Z","lastTransitionTime":"2025-10-07T08:17:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 08:17:33 crc kubenswrapper[5025]: I1007 08:17:33.713421 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 08:17:33 crc kubenswrapper[5025]: I1007 08:17:33.713486 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 08:17:33 crc kubenswrapper[5025]: I1007 08:17:33.713504 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 08:17:33 crc kubenswrapper[5025]: I1007 08:17:33.713529 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 08:17:33 crc kubenswrapper[5025]: I1007 08:17:33.713571 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:33Z","lastTransitionTime":"2025-10-07T08:17:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Oct 07 08:17:33 crc kubenswrapper[5025]: I1007 08:17:33.816187 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:33 crc kubenswrapper[5025]: I1007 08:17:33.816213 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:33 crc kubenswrapper[5025]: I1007 08:17:33.816221 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:33 crc kubenswrapper[5025]: I1007 08:17:33.816253 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:33 crc kubenswrapper[5025]: I1007 08:17:33.816264 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:33Z","lastTransitionTime":"2025-10-07T08:17:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:33 crc kubenswrapper[5025]: I1007 08:17:33.914210 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:17:33 crc kubenswrapper[5025]: I1007 08:17:33.914336 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 08:17:33 crc kubenswrapper[5025]: I1007 08:17:33.914785 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:17:33 crc kubenswrapper[5025]: E1007 08:17:33.915027 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 08:17:33 crc kubenswrapper[5025]: I1007 08:17:33.915108 5025 scope.go:117] "RemoveContainer" containerID="18be63a068d62a2b2d02653ce552ec9e0b8db3cf5b0be40238b373087e88b61e" Oct 07 08:17:33 crc kubenswrapper[5025]: E1007 08:17:33.915218 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 08:17:33 crc kubenswrapper[5025]: E1007 08:17:33.915427 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 08:17:33 crc kubenswrapper[5025]: E1007 08:17:33.915466 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-dwm22_openshift-ovn-kubernetes(8b6b9c75-ecfe-4815-b279-bb56f57a82a8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" Oct 07 08:17:33 crc kubenswrapper[5025]: I1007 08:17:33.918978 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:33 crc kubenswrapper[5025]: I1007 08:17:33.919033 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:33 crc kubenswrapper[5025]: I1007 08:17:33.919056 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:33 crc kubenswrapper[5025]: I1007 08:17:33.919084 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:33 crc kubenswrapper[5025]: I1007 08:17:33.919106 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:33Z","lastTransitionTime":"2025-10-07T08:17:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:33 crc kubenswrapper[5025]: I1007 08:17:33.935097 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://473d5cdd405367504396c404f60bbdecda93dbe6edbc8ddbfc18bbcc2fdc4d2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f423a0c357e0279ca41d112d6091586ed91d3826f4ae250ec2aa4973554968\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:33Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:33 crc kubenswrapper[5025]: I1007 08:17:33.963490 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://602fe6537fb9b526977c2d6ae30458382313c2b5f1506e6b7ffe684ffe5be2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d54834b1ba5650aa397affa83e56e871bbbea4b8c383627ae00bea648595864f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aca416ed3fc89df9948a2f7b02101dfb3bae8d8b22b7c4d270559a4cb6895dc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aca8d824b0935ba844137095ac849a38a15dc5c0dc808fae28afbff543f6429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e826406584133efe2a695c4c7ff16dc6919a1ba0cda077273f6e1eafbf0f5c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81fc1872da168050a2483e46a2ce55f2f745eca15c901a77783a830609f6c0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18be63a068d62a2b2d02653ce552ec9e0b8db3cf5b0be40238b373087e88b61e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18be63a068d62a2b2d02653ce552ec9e0b8db3cf5b0be40238b373087e88b61e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T08:17:19Z\\\",\\\"message\\\":\\\"lt network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default 
node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:19Z is after 2025-08-24T17:21:41Z]\\\\nI1007 08:17:19.781116 6730 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/control-plane-machine-set-operator\\\\\\\"}\\\\nI1007 08:17:19.781864 6730 services_controller.go:434] Service openshift-console/downloads retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{downloads openshift-console bbc81ad7-5d87-40bf-82c5-a4db2311cff9 12322 0 2025-02-23 05:39:22 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[] map[operator.openshift.io/spec-hash:41d6e4f36bf41ab5be57dec2289f1f8807bbed4b0f642342f213a53bb3ff4d6d] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePor\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:17:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dwm22_openshift-ovn-kubernetes(8b6b9c75-ecfe-4815-b279-bb56f57a82a8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26e633e0a41a5f44c577b8e1a9494d58bb84b49015a7612684fc81e5872fd20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9
e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dwm22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:33Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:33 crc kubenswrapper[5025]: I1007 08:17:33.983893 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:33Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.009663 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l2k8t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab742e4ca12e3a7400e7592d95be4f82046c78d1056d6e1f06d261704286f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f66d
58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f66d58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fff0f5b081b5a61c30757d6065a06224f7edcd0adfdab4ca6abfc6ea207a660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fff0f5b081b5a61c30757d6065a06224f7edcd0adfdab4ca6abfc6ea207a660\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:17:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:17:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l2k8t\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:34Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.020824 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.020944 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.020974 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.021074 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.021156 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:34Z","lastTransitionTime":"2025-10-07T08:17:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.037672 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90a24131-77fe-4045-9cea-eebd7e122243\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efea2132bc68971348829ebb4222793ae3ccf57d5eeee316aa81292cbb051594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1c01fb63c054d4e9e2b12ab9d335d5c6a79c3c62c96bcf3bca6a5f9a4b750\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ce5e9618132944289f6a22c7aeaee88704f6352fe3461c55de21681f406662\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://167a11cb83ad658e06fbfebaa8e1675cec48026b094037f552cb4b86f3fe364e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b579d694e2e51fb1d49b8d690d572518d7fcd3476d7e6941323b70987b8edfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 08:16:47.574279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 08:16:47.577103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2852886688/tls.crt::/tmp/serving-cert-2852886688/tls.key\\\\\\\"\\\\nI1007 08:16:53.183291 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 08:16:53.189611 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 08:16:53.189643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 08:16:53.189670 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 08:16:53.189676 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 08:16:53.204072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 08:16:53.204130 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204143 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204157 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 08:16:53.204166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 08:16:53.204173 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 08:16:53.204182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 08:16:53.204715 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 08:16:53.207373 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52afc5d765ed4544c55e6e522b190cb1c6788fc270ef55c8242f42b63c4d2702\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:34Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.052868 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffe327a6-74bb-4da3-b864-93cc954bde0e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbd9a4d40a19f7571919228601897e666db895791868d65b27e90588add3daee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://919574d16a08760217aa219e673b02a059d96379aed81ad6248fb0623f672d06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd4a8d9d14761162acc1940fccd21269820427fce37c520cd9e5d6c0b7279e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d45b57ffbb81dd048d051a94a5c7371321a120b93c8b2b47643ff33acb73a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9d45b57ffbb81dd048d051a94a5c7371321a120b93c8b2b47643ff33acb73a79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:34Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.067843 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e04a2eadbde2c6c8df1913f15a1efbebef80b3e34c256f7ff9249479f19eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:34Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.086192 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61571dd78435a0514b8674028c7c467678e6ee4d1c6b334a98c6d1ddbf4871d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:34Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.099757 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hc88w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2744919e-82fb-4c3c-8776-c2c9c44af6e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afa525d43c3e5aafdbdf20fdac7eb16579649323ad50ae564f0266fb2e69ffd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-no
de-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swvcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hc88w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:34Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.114463 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4849c41-22e1-400e-8e11-096da49ef1b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc23369548789187b7057cd9780703dbc892dcf8aaaac969f6049a15ab0656e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54b2a654066f14b0433198403ba42b83c61922de
1055c94ba1d38aad9e0c2821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2dj2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:34Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.125946 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.125992 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.126007 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:34 crc 
kubenswrapper[5025]: I1007 08:17:34.126028 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.126044 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:34Z","lastTransitionTime":"2025-10-07T08:17:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.129286 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jf557" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea42823f-6b19-439f-a280-80e5b1a816c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e6ec573b8a9fb1fd6ab7a370488b1b68024a033b0cfc728cf7e8288aacb4c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1
e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jf557\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:34Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.145240 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w8khj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee70c47a-d783-4a9f-8e94-5fc68eda69fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1540ed9540ce18c4ad3c5c7d82a3916528d100cc9d7ec1ec5758511f03e6f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ckns7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2f25f9baf4b018c73a5655ca3e595d123cd
0ffaae9060ea0a34148711b1be0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ckns7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:17:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-w8khj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:34Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.159068 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-f4ls7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93fdeab4-b5d2-42d8-97ca-d5d61032e19f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:17:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-f4ls7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:34Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:34 crc 
kubenswrapper[5025]: I1007 08:17:34.177296 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:34Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.195476 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:34Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.211653 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmhw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34b07a69-1bbf-4019-b824-7b5be0f9404d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aee164f46abc9772ed12e1ba5e768ee44e25cc7444a58000075819f45cb4ba11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx7qx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:34Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.228481 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:34 crc 
kubenswrapper[5025]: I1007 08:17:34.228524 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.228579 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.228605 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.228623 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:34Z","lastTransitionTime":"2025-10-07T08:17:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.232324 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2383c50-e82f-4445-8d28-ec90e8d95c47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f31fa57b4eaaccec309a01396ec164dc6bda3e12a1086eda7a2d0eb3233eb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6e7353bb02996d8525befd61dad46a459fca7713e536cec9b1fb9f47d99d08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ceb048af5928a4b50cec5e93738471dd356f20cf57363fe085df0bb4216100f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://848397bbf66b4edb13083c81defed30b9ddaff1a1636e2f1fe09b25c8ff99d96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:34Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.332185 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.332250 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.332263 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.332279 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.332293 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:34Z","lastTransitionTime":"2025-10-07T08:17:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.435257 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.435590 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.435612 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.435631 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.435676 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:34Z","lastTransitionTime":"2025-10-07T08:17:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.538474 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.538730 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.538805 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.538898 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.538967 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:34Z","lastTransitionTime":"2025-10-07T08:17:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.641068 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.641101 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.641109 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.641123 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.641131 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:34Z","lastTransitionTime":"2025-10-07T08:17:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.744714 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.744788 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.744814 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.744843 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.744865 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:34Z","lastTransitionTime":"2025-10-07T08:17:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.848710 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.848771 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.848780 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.848796 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.848805 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:34Z","lastTransitionTime":"2025-10-07T08:17:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.913677 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4ls7" Oct 07 08:17:34 crc kubenswrapper[5025]: E1007 08:17:34.913891 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f4ls7" podUID="93fdeab4-b5d2-42d8-97ca-d5d61032e19f" Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.952116 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.952189 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.952209 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.952239 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:34 crc kubenswrapper[5025]: I1007 08:17:34.952258 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:34Z","lastTransitionTime":"2025-10-07T08:17:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:35 crc kubenswrapper[5025]: I1007 08:17:35.055041 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:35 crc kubenswrapper[5025]: I1007 08:17:35.055093 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:35 crc kubenswrapper[5025]: I1007 08:17:35.055109 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:35 crc kubenswrapper[5025]: I1007 08:17:35.055130 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:35 crc kubenswrapper[5025]: I1007 08:17:35.055144 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:35Z","lastTransitionTime":"2025-10-07T08:17:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:35 crc kubenswrapper[5025]: I1007 08:17:35.157097 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:35 crc kubenswrapper[5025]: I1007 08:17:35.157131 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:35 crc kubenswrapper[5025]: I1007 08:17:35.157142 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:35 crc kubenswrapper[5025]: I1007 08:17:35.157160 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:35 crc kubenswrapper[5025]: I1007 08:17:35.157172 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:35Z","lastTransitionTime":"2025-10-07T08:17:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:35 crc kubenswrapper[5025]: I1007 08:17:35.261242 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:35 crc kubenswrapper[5025]: I1007 08:17:35.261364 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:35 crc kubenswrapper[5025]: I1007 08:17:35.261384 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:35 crc kubenswrapper[5025]: I1007 08:17:35.261412 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:35 crc kubenswrapper[5025]: I1007 08:17:35.261439 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:35Z","lastTransitionTime":"2025-10-07T08:17:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:35 crc kubenswrapper[5025]: I1007 08:17:35.364089 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:35 crc kubenswrapper[5025]: I1007 08:17:35.364262 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:35 crc kubenswrapper[5025]: I1007 08:17:35.364279 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:35 crc kubenswrapper[5025]: I1007 08:17:35.364300 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:35 crc kubenswrapper[5025]: I1007 08:17:35.364316 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:35Z","lastTransitionTime":"2025-10-07T08:17:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:35 crc kubenswrapper[5025]: I1007 08:17:35.467326 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:35 crc kubenswrapper[5025]: I1007 08:17:35.467382 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:35 crc kubenswrapper[5025]: I1007 08:17:35.467403 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:35 crc kubenswrapper[5025]: I1007 08:17:35.467429 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:35 crc kubenswrapper[5025]: I1007 08:17:35.467447 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:35Z","lastTransitionTime":"2025-10-07T08:17:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:35 crc kubenswrapper[5025]: I1007 08:17:35.571043 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:35 crc kubenswrapper[5025]: I1007 08:17:35.571114 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:35 crc kubenswrapper[5025]: I1007 08:17:35.571146 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:35 crc kubenswrapper[5025]: I1007 08:17:35.571177 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:35 crc kubenswrapper[5025]: I1007 08:17:35.571201 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:35Z","lastTransitionTime":"2025-10-07T08:17:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:35 crc kubenswrapper[5025]: I1007 08:17:35.675139 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:35 crc kubenswrapper[5025]: I1007 08:17:35.675221 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:35 crc kubenswrapper[5025]: I1007 08:17:35.675245 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:35 crc kubenswrapper[5025]: I1007 08:17:35.675277 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:35 crc kubenswrapper[5025]: I1007 08:17:35.675297 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:35Z","lastTransitionTime":"2025-10-07T08:17:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:35 crc kubenswrapper[5025]: I1007 08:17:35.780009 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:35 crc kubenswrapper[5025]: I1007 08:17:35.780209 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:35 crc kubenswrapper[5025]: I1007 08:17:35.780237 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:35 crc kubenswrapper[5025]: I1007 08:17:35.780272 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:35 crc kubenswrapper[5025]: I1007 08:17:35.780299 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:35Z","lastTransitionTime":"2025-10-07T08:17:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:35 crc kubenswrapper[5025]: I1007 08:17:35.884010 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:35 crc kubenswrapper[5025]: I1007 08:17:35.884051 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:35 crc kubenswrapper[5025]: I1007 08:17:35.884063 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:35 crc kubenswrapper[5025]: I1007 08:17:35.884084 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:35 crc kubenswrapper[5025]: I1007 08:17:35.884098 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:35Z","lastTransitionTime":"2025-10-07T08:17:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:35 crc kubenswrapper[5025]: I1007 08:17:35.914821 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:17:35 crc kubenswrapper[5025]: I1007 08:17:35.914842 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 08:17:35 crc kubenswrapper[5025]: E1007 08:17:35.914975 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 08:17:35 crc kubenswrapper[5025]: I1007 08:17:35.915024 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:17:35 crc kubenswrapper[5025]: E1007 08:17:35.915129 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 08:17:35 crc kubenswrapper[5025]: E1007 08:17:35.915226 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 08:17:35 crc kubenswrapper[5025]: I1007 08:17:35.986379 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:35 crc kubenswrapper[5025]: I1007 08:17:35.986442 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:35 crc kubenswrapper[5025]: I1007 08:17:35.986463 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:35 crc kubenswrapper[5025]: I1007 08:17:35.986484 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:35 crc kubenswrapper[5025]: I1007 08:17:35.986497 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:35Z","lastTransitionTime":"2025-10-07T08:17:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:36 crc kubenswrapper[5025]: I1007 08:17:36.088800 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:36 crc kubenswrapper[5025]: I1007 08:17:36.088867 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:36 crc kubenswrapper[5025]: I1007 08:17:36.088890 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:36 crc kubenswrapper[5025]: I1007 08:17:36.088918 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:36 crc kubenswrapper[5025]: I1007 08:17:36.088939 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:36Z","lastTransitionTime":"2025-10-07T08:17:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:36 crc kubenswrapper[5025]: I1007 08:17:36.192215 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:36 crc kubenswrapper[5025]: I1007 08:17:36.192275 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:36 crc kubenswrapper[5025]: I1007 08:17:36.192293 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:36 crc kubenswrapper[5025]: I1007 08:17:36.192317 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:36 crc kubenswrapper[5025]: I1007 08:17:36.192335 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:36Z","lastTransitionTime":"2025-10-07T08:17:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:36 crc kubenswrapper[5025]: I1007 08:17:36.295026 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:36 crc kubenswrapper[5025]: I1007 08:17:36.295067 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:36 crc kubenswrapper[5025]: I1007 08:17:36.295076 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:36 crc kubenswrapper[5025]: I1007 08:17:36.295092 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:36 crc kubenswrapper[5025]: I1007 08:17:36.295100 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:36Z","lastTransitionTime":"2025-10-07T08:17:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:36 crc kubenswrapper[5025]: I1007 08:17:36.397437 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:36 crc kubenswrapper[5025]: I1007 08:17:36.397473 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:36 crc kubenswrapper[5025]: I1007 08:17:36.397482 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:36 crc kubenswrapper[5025]: I1007 08:17:36.397495 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:36 crc kubenswrapper[5025]: I1007 08:17:36.397504 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:36Z","lastTransitionTime":"2025-10-07T08:17:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:36 crc kubenswrapper[5025]: I1007 08:17:36.499951 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:36 crc kubenswrapper[5025]: I1007 08:17:36.500008 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:36 crc kubenswrapper[5025]: I1007 08:17:36.500033 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:36 crc kubenswrapper[5025]: I1007 08:17:36.500063 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:36 crc kubenswrapper[5025]: I1007 08:17:36.500084 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:36Z","lastTransitionTime":"2025-10-07T08:17:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:36 crc kubenswrapper[5025]: I1007 08:17:36.603063 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:36 crc kubenswrapper[5025]: I1007 08:17:36.603115 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:36 crc kubenswrapper[5025]: I1007 08:17:36.603132 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:36 crc kubenswrapper[5025]: I1007 08:17:36.603154 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:36 crc kubenswrapper[5025]: I1007 08:17:36.603172 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:36Z","lastTransitionTime":"2025-10-07T08:17:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:36 crc kubenswrapper[5025]: I1007 08:17:36.718982 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:36 crc kubenswrapper[5025]: I1007 08:17:36.719031 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:36 crc kubenswrapper[5025]: I1007 08:17:36.719043 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:36 crc kubenswrapper[5025]: I1007 08:17:36.719059 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:36 crc kubenswrapper[5025]: I1007 08:17:36.719070 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:36Z","lastTransitionTime":"2025-10-07T08:17:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:36 crc kubenswrapper[5025]: I1007 08:17:36.821778 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:36 crc kubenswrapper[5025]: I1007 08:17:36.821855 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:36 crc kubenswrapper[5025]: I1007 08:17:36.821884 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:36 crc kubenswrapper[5025]: I1007 08:17:36.821915 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:36 crc kubenswrapper[5025]: I1007 08:17:36.821937 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:36Z","lastTransitionTime":"2025-10-07T08:17:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:36 crc kubenswrapper[5025]: I1007 08:17:36.914225 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4ls7" Oct 07 08:17:36 crc kubenswrapper[5025]: E1007 08:17:36.914482 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f4ls7" podUID="93fdeab4-b5d2-42d8-97ca-d5d61032e19f" Oct 07 08:17:36 crc kubenswrapper[5025]: I1007 08:17:36.924485 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:36 crc kubenswrapper[5025]: I1007 08:17:36.924538 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:36 crc kubenswrapper[5025]: I1007 08:17:36.924588 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:36 crc kubenswrapper[5025]: I1007 08:17:36.924612 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:36 crc kubenswrapper[5025]: I1007 08:17:36.924628 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:36Z","lastTransitionTime":"2025-10-07T08:17:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:37 crc kubenswrapper[5025]: I1007 08:17:37.027311 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:37 crc kubenswrapper[5025]: I1007 08:17:37.027368 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:37 crc kubenswrapper[5025]: I1007 08:17:37.027386 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:37 crc kubenswrapper[5025]: I1007 08:17:37.027410 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:37 crc kubenswrapper[5025]: I1007 08:17:37.027427 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:37Z","lastTransitionTime":"2025-10-07T08:17:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:37 crc kubenswrapper[5025]: I1007 08:17:37.129492 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:37 crc kubenswrapper[5025]: I1007 08:17:37.129586 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:37 crc kubenswrapper[5025]: I1007 08:17:37.129610 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:37 crc kubenswrapper[5025]: I1007 08:17:37.129642 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:37 crc kubenswrapper[5025]: I1007 08:17:37.129664 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:37Z","lastTransitionTime":"2025-10-07T08:17:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:37 crc kubenswrapper[5025]: I1007 08:17:37.232237 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:37 crc kubenswrapper[5025]: I1007 08:17:37.232277 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:37 crc kubenswrapper[5025]: I1007 08:17:37.232286 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:37 crc kubenswrapper[5025]: I1007 08:17:37.232303 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:37 crc kubenswrapper[5025]: I1007 08:17:37.232312 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:37Z","lastTransitionTime":"2025-10-07T08:17:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:37 crc kubenswrapper[5025]: I1007 08:17:37.334963 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:37 crc kubenswrapper[5025]: I1007 08:17:37.335020 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:37 crc kubenswrapper[5025]: I1007 08:17:37.335044 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:37 crc kubenswrapper[5025]: I1007 08:17:37.335074 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:37 crc kubenswrapper[5025]: I1007 08:17:37.335094 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:37Z","lastTransitionTime":"2025-10-07T08:17:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:37 crc kubenswrapper[5025]: I1007 08:17:37.437593 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:37 crc kubenswrapper[5025]: I1007 08:17:37.437630 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:37 crc kubenswrapper[5025]: I1007 08:17:37.437640 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:37 crc kubenswrapper[5025]: I1007 08:17:37.437654 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:37 crc kubenswrapper[5025]: I1007 08:17:37.437666 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:37Z","lastTransitionTime":"2025-10-07T08:17:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:37 crc kubenswrapper[5025]: I1007 08:17:37.540927 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:37 crc kubenswrapper[5025]: I1007 08:17:37.540986 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:37 crc kubenswrapper[5025]: I1007 08:17:37.541007 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:37 crc kubenswrapper[5025]: I1007 08:17:37.541034 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:37 crc kubenswrapper[5025]: I1007 08:17:37.541055 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:37Z","lastTransitionTime":"2025-10-07T08:17:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:37 crc kubenswrapper[5025]: I1007 08:17:37.643993 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:37 crc kubenswrapper[5025]: I1007 08:17:37.644026 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:37 crc kubenswrapper[5025]: I1007 08:17:37.644033 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:37 crc kubenswrapper[5025]: I1007 08:17:37.644048 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:37 crc kubenswrapper[5025]: I1007 08:17:37.644060 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:37Z","lastTransitionTime":"2025-10-07T08:17:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:37 crc kubenswrapper[5025]: I1007 08:17:37.748164 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:37 crc kubenswrapper[5025]: I1007 08:17:37.748218 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:37 crc kubenswrapper[5025]: I1007 08:17:37.748255 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:37 crc kubenswrapper[5025]: I1007 08:17:37.748285 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:37 crc kubenswrapper[5025]: I1007 08:17:37.748318 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:37Z","lastTransitionTime":"2025-10-07T08:17:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:37 crc kubenswrapper[5025]: I1007 08:17:37.851043 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:37 crc kubenswrapper[5025]: I1007 08:17:37.851084 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:37 crc kubenswrapper[5025]: I1007 08:17:37.851094 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:37 crc kubenswrapper[5025]: I1007 08:17:37.851110 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:37 crc kubenswrapper[5025]: I1007 08:17:37.851120 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:37Z","lastTransitionTime":"2025-10-07T08:17:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:37 crc kubenswrapper[5025]: I1007 08:17:37.913893 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:17:37 crc kubenswrapper[5025]: E1007 08:17:37.914085 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 08:17:37 crc kubenswrapper[5025]: I1007 08:17:37.913922 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 08:17:37 crc kubenswrapper[5025]: I1007 08:17:37.913908 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:17:37 crc kubenswrapper[5025]: E1007 08:17:37.914208 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 08:17:37 crc kubenswrapper[5025]: E1007 08:17:37.914329 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 08:17:37 crc kubenswrapper[5025]: I1007 08:17:37.953402 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:37 crc kubenswrapper[5025]: I1007 08:17:37.953475 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:37 crc kubenswrapper[5025]: I1007 08:17:37.953500 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:37 crc kubenswrapper[5025]: I1007 08:17:37.953529 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:37 crc kubenswrapper[5025]: I1007 08:17:37.953589 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:37Z","lastTransitionTime":"2025-10-07T08:17:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:38 crc kubenswrapper[5025]: I1007 08:17:38.056056 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:38 crc kubenswrapper[5025]: I1007 08:17:38.056135 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:38 crc kubenswrapper[5025]: I1007 08:17:38.056161 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:38 crc kubenswrapper[5025]: I1007 08:17:38.056189 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:38 crc kubenswrapper[5025]: I1007 08:17:38.056210 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:38Z","lastTransitionTime":"2025-10-07T08:17:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:38 crc kubenswrapper[5025]: I1007 08:17:38.158453 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:38 crc kubenswrapper[5025]: I1007 08:17:38.158492 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:38 crc kubenswrapper[5025]: I1007 08:17:38.158502 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:38 crc kubenswrapper[5025]: I1007 08:17:38.158516 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:38 crc kubenswrapper[5025]: I1007 08:17:38.158527 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:38Z","lastTransitionTime":"2025-10-07T08:17:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:38 crc kubenswrapper[5025]: I1007 08:17:38.260404 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:38 crc kubenswrapper[5025]: I1007 08:17:38.260443 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:38 crc kubenswrapper[5025]: I1007 08:17:38.260454 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:38 crc kubenswrapper[5025]: I1007 08:17:38.260491 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:38 crc kubenswrapper[5025]: I1007 08:17:38.260502 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:38Z","lastTransitionTime":"2025-10-07T08:17:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:38 crc kubenswrapper[5025]: I1007 08:17:38.363208 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:38 crc kubenswrapper[5025]: I1007 08:17:38.363250 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:38 crc kubenswrapper[5025]: I1007 08:17:38.363258 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:38 crc kubenswrapper[5025]: I1007 08:17:38.363293 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:38 crc kubenswrapper[5025]: I1007 08:17:38.363303 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:38Z","lastTransitionTime":"2025-10-07T08:17:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:38 crc kubenswrapper[5025]: I1007 08:17:38.465075 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:38 crc kubenswrapper[5025]: I1007 08:17:38.465119 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:38 crc kubenswrapper[5025]: I1007 08:17:38.465134 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:38 crc kubenswrapper[5025]: I1007 08:17:38.465154 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:38 crc kubenswrapper[5025]: I1007 08:17:38.465168 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:38Z","lastTransitionTime":"2025-10-07T08:17:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:38 crc kubenswrapper[5025]: I1007 08:17:38.567943 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:38 crc kubenswrapper[5025]: I1007 08:17:38.567977 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:38 crc kubenswrapper[5025]: I1007 08:17:38.567986 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:38 crc kubenswrapper[5025]: I1007 08:17:38.567998 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:38 crc kubenswrapper[5025]: I1007 08:17:38.568008 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:38Z","lastTransitionTime":"2025-10-07T08:17:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:38 crc kubenswrapper[5025]: I1007 08:17:38.670416 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:38 crc kubenswrapper[5025]: I1007 08:17:38.670454 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:38 crc kubenswrapper[5025]: I1007 08:17:38.670464 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:38 crc kubenswrapper[5025]: I1007 08:17:38.670478 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:38 crc kubenswrapper[5025]: I1007 08:17:38.670489 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:38Z","lastTransitionTime":"2025-10-07T08:17:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:38 crc kubenswrapper[5025]: I1007 08:17:38.772921 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:38 crc kubenswrapper[5025]: I1007 08:17:38.772948 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:38 crc kubenswrapper[5025]: I1007 08:17:38.772956 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:38 crc kubenswrapper[5025]: I1007 08:17:38.772969 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:38 crc kubenswrapper[5025]: I1007 08:17:38.772979 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:38Z","lastTransitionTime":"2025-10-07T08:17:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:38 crc kubenswrapper[5025]: I1007 08:17:38.894022 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:38 crc kubenswrapper[5025]: I1007 08:17:38.894077 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:38 crc kubenswrapper[5025]: I1007 08:17:38.894094 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:38 crc kubenswrapper[5025]: I1007 08:17:38.894116 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:38 crc kubenswrapper[5025]: I1007 08:17:38.894134 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:38Z","lastTransitionTime":"2025-10-07T08:17:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:38 crc kubenswrapper[5025]: I1007 08:17:38.914633 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4ls7" Oct 07 08:17:38 crc kubenswrapper[5025]: E1007 08:17:38.914887 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f4ls7" podUID="93fdeab4-b5d2-42d8-97ca-d5d61032e19f" Oct 07 08:17:38 crc kubenswrapper[5025]: I1007 08:17:38.996976 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:38 crc kubenswrapper[5025]: I1007 08:17:38.997008 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:38 crc kubenswrapper[5025]: I1007 08:17:38.997019 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:38 crc kubenswrapper[5025]: I1007 08:17:38.997036 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:38 crc kubenswrapper[5025]: I1007 08:17:38.997049 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:38Z","lastTransitionTime":"2025-10-07T08:17:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:39 crc kubenswrapper[5025]: I1007 08:17:39.099995 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:39 crc kubenswrapper[5025]: I1007 08:17:39.100055 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:39 crc kubenswrapper[5025]: I1007 08:17:39.100075 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:39 crc kubenswrapper[5025]: I1007 08:17:39.100100 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:39 crc kubenswrapper[5025]: I1007 08:17:39.100117 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:39Z","lastTransitionTime":"2025-10-07T08:17:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:39 crc kubenswrapper[5025]: I1007 08:17:39.203451 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:39 crc kubenswrapper[5025]: I1007 08:17:39.203532 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:39 crc kubenswrapper[5025]: I1007 08:17:39.203620 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:39 crc kubenswrapper[5025]: I1007 08:17:39.203654 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:39 crc kubenswrapper[5025]: I1007 08:17:39.203679 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:39Z","lastTransitionTime":"2025-10-07T08:17:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:39 crc kubenswrapper[5025]: I1007 08:17:39.305974 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:39 crc kubenswrapper[5025]: I1007 08:17:39.306020 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:39 crc kubenswrapper[5025]: I1007 08:17:39.306032 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:39 crc kubenswrapper[5025]: I1007 08:17:39.306050 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:39 crc kubenswrapper[5025]: I1007 08:17:39.306059 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:39Z","lastTransitionTime":"2025-10-07T08:17:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:39 crc kubenswrapper[5025]: I1007 08:17:39.408371 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:39 crc kubenswrapper[5025]: I1007 08:17:39.408415 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:39 crc kubenswrapper[5025]: I1007 08:17:39.408427 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:39 crc kubenswrapper[5025]: I1007 08:17:39.408452 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:39 crc kubenswrapper[5025]: I1007 08:17:39.408464 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:39Z","lastTransitionTime":"2025-10-07T08:17:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:39 crc kubenswrapper[5025]: I1007 08:17:39.511818 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:39 crc kubenswrapper[5025]: I1007 08:17:39.511881 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:39 crc kubenswrapper[5025]: I1007 08:17:39.511900 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:39 crc kubenswrapper[5025]: I1007 08:17:39.511924 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:39 crc kubenswrapper[5025]: I1007 08:17:39.511942 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:39Z","lastTransitionTime":"2025-10-07T08:17:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:39 crc kubenswrapper[5025]: I1007 08:17:39.615458 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:39 crc kubenswrapper[5025]: I1007 08:17:39.615507 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:39 crc kubenswrapper[5025]: I1007 08:17:39.615522 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:39 crc kubenswrapper[5025]: I1007 08:17:39.615560 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:39 crc kubenswrapper[5025]: I1007 08:17:39.615574 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:39Z","lastTransitionTime":"2025-10-07T08:17:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:39 crc kubenswrapper[5025]: I1007 08:17:39.718690 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:39 crc kubenswrapper[5025]: I1007 08:17:39.718734 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:39 crc kubenswrapper[5025]: I1007 08:17:39.718747 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:39 crc kubenswrapper[5025]: I1007 08:17:39.718767 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:39 crc kubenswrapper[5025]: I1007 08:17:39.718778 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:39Z","lastTransitionTime":"2025-10-07T08:17:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:39 crc kubenswrapper[5025]: I1007 08:17:39.821139 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:39 crc kubenswrapper[5025]: I1007 08:17:39.821210 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:39 crc kubenswrapper[5025]: I1007 08:17:39.821224 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:39 crc kubenswrapper[5025]: I1007 08:17:39.821256 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:39 crc kubenswrapper[5025]: I1007 08:17:39.821276 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:39Z","lastTransitionTime":"2025-10-07T08:17:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:39 crc kubenswrapper[5025]: I1007 08:17:39.914299 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:17:39 crc kubenswrapper[5025]: I1007 08:17:39.914404 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 08:17:39 crc kubenswrapper[5025]: I1007 08:17:39.914299 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:17:39 crc kubenswrapper[5025]: E1007 08:17:39.914526 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 08:17:39 crc kubenswrapper[5025]: E1007 08:17:39.914660 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 08:17:39 crc kubenswrapper[5025]: E1007 08:17:39.914822 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 08:17:39 crc kubenswrapper[5025]: I1007 08:17:39.924324 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:39 crc kubenswrapper[5025]: I1007 08:17:39.924366 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:39 crc kubenswrapper[5025]: I1007 08:17:39.924378 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:39 crc kubenswrapper[5025]: I1007 08:17:39.924395 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:39 crc kubenswrapper[5025]: I1007 08:17:39.924407 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:39Z","lastTransitionTime":"2025-10-07T08:17:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.027365 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.027398 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.027407 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.027419 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.027429 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:40Z","lastTransitionTime":"2025-10-07T08:17:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.130042 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.130086 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.130104 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.130124 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.130141 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:40Z","lastTransitionTime":"2025-10-07T08:17:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.232835 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.232879 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.232891 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.232907 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.232919 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:40Z","lastTransitionTime":"2025-10-07T08:17:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.335746 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.335805 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.335825 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.335849 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.335866 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:40Z","lastTransitionTime":"2025-10-07T08:17:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.438003 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.438052 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.438067 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.438086 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.438101 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:40Z","lastTransitionTime":"2025-10-07T08:17:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.540670 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.540708 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.540719 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.540737 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.540775 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:40Z","lastTransitionTime":"2025-10-07T08:17:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.643882 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.643984 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.644012 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.644046 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.644065 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:40Z","lastTransitionTime":"2025-10-07T08:17:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.746361 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.746396 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.746408 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.746422 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.746433 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:40Z","lastTransitionTime":"2025-10-07T08:17:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.849717 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.849781 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.849797 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.849824 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.849837 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:40Z","lastTransitionTime":"2025-10-07T08:17:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.914977 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4ls7" Oct 07 08:17:40 crc kubenswrapper[5025]: E1007 08:17:40.915212 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f4ls7" podUID="93fdeab4-b5d2-42d8-97ca-d5d61032e19f" Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.924787 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.953363 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.953411 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.953424 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.953444 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.953458 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:40Z","lastTransitionTime":"2025-10-07T08:17:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.964499 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.964582 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.964609 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.964631 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.964656 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:40Z","lastTransitionTime":"2025-10-07T08:17:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.965984 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93fdeab4-b5d2-42d8-97ca-d5d61032e19f-metrics-certs\") pod \"network-metrics-daemon-f4ls7\" (UID: \"93fdeab4-b5d2-42d8-97ca-d5d61032e19f\") " pod="openshift-multus/network-metrics-daemon-f4ls7" Oct 07 08:17:40 crc kubenswrapper[5025]: E1007 08:17:40.966237 5025 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 08:17:40 crc kubenswrapper[5025]: E1007 08:17:40.966371 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93fdeab4-b5d2-42d8-97ca-d5d61032e19f-metrics-certs podName:93fdeab4-b5d2-42d8-97ca-d5d61032e19f nodeName:}" failed. No retries permitted until 2025-10-07 08:18:12.966332568 +0000 UTC m=+99.775646752 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/93fdeab4-b5d2-42d8-97ca-d5d61032e19f-metrics-certs") pod "network-metrics-daemon-f4ls7" (UID: "93fdeab4-b5d2-42d8-97ca-d5d61032e19f") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 08:17:40 crc kubenswrapper[5025]: E1007 08:17:40.983795 5025 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb4
9c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\"
:[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d4
6c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\
\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"15feb688-c1d9-4e7b-b633-9
a128b7afc98\\\",\\\"systemUUID\\\":\\\"18315730-39a6-4b53-82b9-587e1e3a7adc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:40Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.988785 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.988823 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.988837 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.988852 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:40 crc kubenswrapper[5025]: I1007 08:17:40.988863 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:40Z","lastTransitionTime":"2025-10-07T08:17:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:41 crc kubenswrapper[5025]: E1007 08:17:41.002279 5025 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"15feb688-c1d9-4e7b-b633-9a128b7afc98\\\",\\\"systemUUID\\\":\\\"18315730-39a6-4b53-82b9-587e1e3a7adc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:40Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.005412 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.005444 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.005456 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.005472 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.005484 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:41Z","lastTransitionTime":"2025-10-07T08:17:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:41 crc kubenswrapper[5025]: E1007 08:17:41.033092 5025 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"15feb688-c1d9-4e7b-b633-9a128b7afc98\\\",\\\"systemUUID\\\":\\\"18315730-39a6-4b53-82b9-587e1e3a7adc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:41Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.036278 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.036309 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.036323 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.036339 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.036353 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:41Z","lastTransitionTime":"2025-10-07T08:17:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:41 crc kubenswrapper[5025]: E1007 08:17:41.055584 5025 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"15feb688-c1d9-4e7b-b633-9a128b7afc98\\\",\\\"systemUUID\\\":\\\"18315730-39a6-4b53-82b9-587e1e3a7adc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:41Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:41 crc kubenswrapper[5025]: E1007 08:17:41.055820 5025 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.057359 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.057388 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.057401 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.057419 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.057431 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:41Z","lastTransitionTime":"2025-10-07T08:17:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.160163 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.160211 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.160231 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.160253 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.160268 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:41Z","lastTransitionTime":"2025-10-07T08:17:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.262524 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.262575 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.262588 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.262607 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.262625 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:41Z","lastTransitionTime":"2025-10-07T08:17:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.366360 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.366394 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.366403 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.366417 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.366428 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:41Z","lastTransitionTime":"2025-10-07T08:17:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.469487 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.469515 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.469523 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.469536 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.469566 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:41Z","lastTransitionTime":"2025-10-07T08:17:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.572769 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.572812 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.572824 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.572839 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.572850 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:41Z","lastTransitionTime":"2025-10-07T08:17:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.675340 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.675382 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.675396 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.675413 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.675426 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:41Z","lastTransitionTime":"2025-10-07T08:17:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.778112 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.778170 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.778185 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.778200 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.778211 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:41Z","lastTransitionTime":"2025-10-07T08:17:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.880574 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.880617 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.880629 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.880647 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.880660 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:41Z","lastTransitionTime":"2025-10-07T08:17:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.914306 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.914313 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 08:17:41 crc kubenswrapper[5025]: E1007 08:17:41.914423 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.914459 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:17:41 crc kubenswrapper[5025]: E1007 08:17:41.914533 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 08:17:41 crc kubenswrapper[5025]: E1007 08:17:41.914700 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.982742 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.982784 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.982797 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.982816 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:41 crc kubenswrapper[5025]: I1007 08:17:41.982828 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:41Z","lastTransitionTime":"2025-10-07T08:17:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.085308 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.085400 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.085432 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.085477 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.085501 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:42Z","lastTransitionTime":"2025-10-07T08:17:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.189311 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.189398 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.189422 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.189454 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.189481 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:42Z","lastTransitionTime":"2025-10-07T08:17:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.292691 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.292745 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.292763 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.292786 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.292802 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:42Z","lastTransitionTime":"2025-10-07T08:17:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.396397 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.396451 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.396467 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.396494 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.396512 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:42Z","lastTransitionTime":"2025-10-07T08:17:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.436007 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xmhw6_34b07a69-1bbf-4019-b824-7b5be0f9404d/kube-multus/0.log" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.436113 5025 generic.go:334] "Generic (PLEG): container finished" podID="34b07a69-1bbf-4019-b824-7b5be0f9404d" containerID="aee164f46abc9772ed12e1ba5e768ee44e25cc7444a58000075819f45cb4ba11" exitCode=1 Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.436180 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xmhw6" event={"ID":"34b07a69-1bbf-4019-b824-7b5be0f9404d","Type":"ContainerDied","Data":"aee164f46abc9772ed12e1ba5e768ee44e25cc7444a58000075819f45cb4ba11"} Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.437285 5025 scope.go:117] "RemoveContainer" containerID="aee164f46abc9772ed12e1ba5e768ee44e25cc7444a58000075819f45cb4ba11" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.460983 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://602fe6537fb9b526977c2d6ae30458382313c2b5f1506e6b7ffe684ffe5be2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d54834b1ba5650aa397affa83e56e871bbbea4b8c383627ae00bea648595864f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aca416ed3fc89df9948a2f7b02101dfb3bae8d8b22b7c4d270559a4cb6895dc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aca8d824b0935ba844137095ac849a38a15dc5c0dc808fae28afbff543f6429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e826406584133efe2a695c4c7ff16dc6919a1ba0cda077273f6e1eafbf0f5c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81fc1872da168050a2483e46a2ce55f2f745eca15c901a77783a830609f6c0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18be63a068d62a2b2d02653ce552ec9e0b8db3cf5b0be40238b373087e88b61e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18be63a068d62a2b2d02653ce552ec9e0b8db3cf5b0be40238b373087e88b61e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T08:17:19Z\\\",\\\"message\\\":\\\"lt network controller: unable to create admin network policy controller, err: could not 
add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:19Z is after 2025-08-24T17:21:41Z]\\\\nI1007 08:17:19.781116 6730 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/control-plane-machine-set-operator\\\\\\\"}\\\\nI1007 08:17:19.781864 6730 services_controller.go:434] Service openshift-console/downloads retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{downloads openshift-console bbc81ad7-5d87-40bf-82c5-a4db2311cff9 12322 0 2025-02-23 05:39:22 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[] map[operator.openshift.io/spec-hash:41d6e4f36bf41ab5be57dec2289f1f8807bbed4b0f642342f213a53bb3ff4d6d] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePor\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:17:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dwm22_openshift-ovn-kubernetes(8b6b9c75-ecfe-4815-b279-bb56f57a82a8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26e633e0a41a5f44c577b8e1a9494d58bb84b49015a7612684fc81e5872fd20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9
e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dwm22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:42Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.476054 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42b679f5-3b2e-42bf-9dde-970af25b4297\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a35a698f6fc982aef52becc7aaf63bb0cca25bf9dab6f23bcbd51712b3dbc89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e520023c7c07f8525b1ac012f6ac7e385413ecd13b4db61a52862dfe77349140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e520023c7c07f8525b1ac012f6ac7e385413ecd13b4db61a52862dfe77349140\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:42Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.490397 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://473d5cdd405367504396c404f60bbdecda93dbe6edbc8ddbfc18bbcc2fdc4d2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f423a0c357e0279ca41d112d6091586ed91d3826f4ae250ec2aa4973554968\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:42Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.499682 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.499715 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.499730 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.499747 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.499759 5025 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:42Z","lastTransitionTime":"2025-10-07T08:17:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.508413 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l2k8t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab742e4ca12e3a7400e7592d95be4f82046c78d1056d6e1f06d261704286f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f66d58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f66d58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fff0f5b081b5a61c30757d6065a06224f7edcd0adfdab4ca6abfc6ea207a660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fff0f5b081b5a61c30757d6065a06224f7edcd0adfdab4ca6abfc6ea207a660\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:17:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:17:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l2k8t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:42Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.523076 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90a24131-77fe-4045-9cea-eebd7e122243\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efea2132bc68971348829ebb4222793ae3ccf57d5eeee316aa81292cbb051594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1c01fb63c054d4e9e2b12ab9d335d5c6a79c3c62c96bcf3bca6a5f9a4b750\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ce5e9618132944289f6a22c7aeaee88704f6352fe3461c55de21681f406662\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://167a11cb83ad658e06fbfebaa8e1675cec48026b094037f552cb4b86f3fe364e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b579d694e2e51fb1d49b8d690d572518d7fcd3476d7e6941323b70987b8edfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T08:16:53Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 08:16:47.574279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 08:16:47.577103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2852886688/tls.crt::/tmp/serving-cert-2852886688/tls.key\\\\\\\"\\\\nI1007 08:16:53.183291 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 08:16:53.189611 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 08:16:53.189643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 08:16:53.189670 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 08:16:53.189676 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 08:16:53.204072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 08:16:53.204130 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204143 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204157 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 08:16:53.204166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 08:16:53.204173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 08:16:53.204182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 08:16:53.204715 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1007 08:16:53.207373 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52afc5d765ed4544c55e6e522b190cb1c6788fc270ef55c8242f42b63c4d2702\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297
e58dd224b8cdb5927e39370c816fd40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:42Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.537076 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:42Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.552883 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e04a2eadbde2c6c8df1913f15a1efbebef80b3e34c256f7ff9249479f19eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:42Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.568666 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61571dd78435a0514b8674028c7c467678e6ee4d1c6b334a98c6d1ddbf4871d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:42Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.582834 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hc88w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2744919e-82fb-4c3c-8776-c2c9c44af6e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afa525d43c3e5aafdbdf20fdac7eb16579649323ad50ae564f0266fb2e69ffd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-no
de-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swvcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hc88w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:42Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.602222 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.602280 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.602328 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.602350 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.602450 5025 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:42Z","lastTransitionTime":"2025-10-07T08:17:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.605199 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4849c41-22e1-400e-8e11-096da49ef1b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc23369548789187b7057cd9780703dbc892dcf8aaaac969f6049a15ab0656e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54b2a654066f14b0433198403ba42b83c61922de1055c94ba1d38aad9e0c2821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2dj2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-07T08:17:42Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.621324 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jf557" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea42823f-6b19-439f-a280-80e5b1a816c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e6ec573b8a9fb1fd6ab7a370488b1b68024a033b0cfc728cf7e8288aacb4c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jf557\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:42Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.640926 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w8khj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee70c47a-d783-4a9f-8e94-5fc68eda69fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1540ed9540ce18c4ad3c5c7d82a3916528d100cc9d7ec1ec5758511f03e6f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ckns7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2f25f9baf4b018c73a5655ca3e595d123cd
0ffaae9060ea0a34148711b1be0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ckns7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:17:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-w8khj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:42Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.658357 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffe327a6-74bb-4da3-b864-93cc954bde0e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbd9a4d40a19f7571919228601897e666db895791868d65b27e90588add3daee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://919574d16a08760217aa219e673b02a059d96379aed81ad6248fb0623f672d06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd4a8d9d14761162acc1940fccd21269820427fce37c520cd9e5d6c0b7279e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d45b57ffbb81dd048d051a94a5c7371321a120b93c8b2b47643ff33acb73a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9d45b57ffbb81dd048d051a94a5c7371321a120b93c8b2b47643ff33acb73a79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:42Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.674935 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-f4ls7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93fdeab4-b5d2-42d8-97ca-d5d61032e19f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:17:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-f4ls7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:42Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.691591 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:42Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.704489 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.704664 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.704725 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.704795 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.704862 5025 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:42Z","lastTransitionTime":"2025-10-07T08:17:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.708421 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmhw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34b07a69-1bbf-4019-b824-7b5be0f9404d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aee164f46abc9772ed12e1ba5e768ee44e25cc7444a58000075819f45cb4ba11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aee164f46abc9772ed12e1ba5e768ee44e25cc7444a58000075819f45cb4ba11\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T08:17:42Z\\\",\\\"message\\\":\\\"2025-10-07T08:16:57+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7e58277b-eefe-4835-9028-55848507956b\\\\n2025-10-07T08:16:57+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7e58277b-eefe-4835-9028-55848507956b to /host/opt/cni/bin/\\\\n2025-10-07T08:16:57Z [verbose] multus-daemon started\\\\n2025-10-07T08:16:57Z [verbose] Readiness Indicator file check\\\\n2025-10-07T08:17:42Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx7qx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:42Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.728191 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2383c50-e82f-4445-8d28-ec90e8d95c47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f31fa57b4eaaccec309a01396ec164dc6bda3e12a1086eda7a2d0eb3233eb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6e7353bb02996d8525befd61dad46a459fca7713e536cec9b1fb9f47d99d08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ceb048af5928a4b50cec5e93738471dd356f20cf57363fe085df0bb4216100f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\
"containerID\\\":\\\"cri-o://848397bbf66b4edb13083c81defed30b9ddaff1a1636e2f1fe09b25c8ff99d96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:42Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.745986 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:42Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.808115 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.808218 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.808237 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.808297 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.808323 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:42Z","lastTransitionTime":"2025-10-07T08:17:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.910976 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.911177 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.911277 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.911348 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.911413 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:42Z","lastTransitionTime":"2025-10-07T08:17:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:42 crc kubenswrapper[5025]: I1007 08:17:42.914584 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4ls7" Oct 07 08:17:42 crc kubenswrapper[5025]: E1007 08:17:42.914773 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f4ls7" podUID="93fdeab4-b5d2-42d8-97ca-d5d61032e19f" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.014780 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.014833 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.014851 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.014876 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.014890 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:43Z","lastTransitionTime":"2025-10-07T08:17:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.118081 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.118134 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.118146 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.118168 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.118180 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:43Z","lastTransitionTime":"2025-10-07T08:17:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.220162 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.220247 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.220261 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.220286 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.220314 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:43Z","lastTransitionTime":"2025-10-07T08:17:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.324172 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.324233 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.324256 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.324286 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.324308 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:43Z","lastTransitionTime":"2025-10-07T08:17:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.426588 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.426630 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.426643 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.426661 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.426673 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:43Z","lastTransitionTime":"2025-10-07T08:17:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.441436 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xmhw6_34b07a69-1bbf-4019-b824-7b5be0f9404d/kube-multus/0.log" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.441519 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xmhw6" event={"ID":"34b07a69-1bbf-4019-b824-7b5be0f9404d","Type":"ContainerStarted","Data":"79252af7f0e70293c6c57e8731796bdf2108b3074f9cc3a5627c41eb246032c6"} Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.454387 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffe327a6-74bb-4da3-b864-93cc954bde0e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbd9a4d40a19f7571919228601897e666db895791868d65b27e90588add3daee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243
b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://919574d16a08760217aa219e673b02a059d96379aed81ad6248fb0623f672d06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd4a8d9d14761162acc1940fccd21269820427fce37c520cd9e5d6c0b7279e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\
\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d45b57ffbb81dd048d051a94a5c7371321a120b93c8b2b47643ff33acb73a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d45b57ffbb81dd048d051a94a5c7371321a120b93c8b2b47643ff33acb73a79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:43Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.475104 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e04a2eadbde2c6c8df1913f15a1efbebef80b3e34c256f7ff9249479f19eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:43Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.487141 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61571dd78435a0514b8674028c7c467678e6ee4d1c6b334a98c6d1ddbf4871d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:43Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.505581 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hc88w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2744919e-82fb-4c3c-8776-c2c9c44af6e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afa525d43c3e5aafdbdf20fdac7eb16579649323ad50ae564f0266fb2e69ffd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-no
de-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swvcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hc88w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:43Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.515242 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4849c41-22e1-400e-8e11-096da49ef1b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc23369548789187b7057cd9780703dbc892dcf8aaaac969f6049a15ab0656e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54b2a654066f14b0433198403ba42b83c61922de
1055c94ba1d38aad9e0c2821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2dj2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:43Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.524508 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jf557" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea42823f-6b19-439f-a280-80e5b1a816c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e6ec573b8a9fb1fd6ab7a370488b1b68024a033b0cfc728cf7e8288aacb4c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jf557\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:43Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.528367 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.528406 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.528416 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.528432 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.528442 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:43Z","lastTransitionTime":"2025-10-07T08:17:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.535449 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w8khj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee70c47a-d783-4a9f-8e94-5fc68eda69fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1540ed9540ce18c4ad3c5c7d82a3916528d100cc9d7ec1ec5758511f03e6f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ckns7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2f25f9baf4b018c73a5655ca3e595d123cd0ffaae9060ea0a34148711b1be0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ckns7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:17:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-w8khj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:43Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.546484 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-f4ls7" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93fdeab4-b5d2-42d8-97ca-d5d61032e19f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:17:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-f4ls7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:43Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:43 crc 
kubenswrapper[5025]: I1007 08:17:43.561395 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:43Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.575083 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:43Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.593300 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmhw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34b07a69-1bbf-4019-b824-7b5be0f9404d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79252af7f0e70293c6c57e8731796bdf2108b3074f9cc3a5627c41eb246032c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aee164f46abc9772ed12e1ba5e768ee44e25cc7444a58000075819f45cb4ba11\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T08:17:42Z\\\",\\\"message\\\":\\\"2025-10-07T08:16:57+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7e58277b-eefe-4835-9028-55848507956b\\\\n2025-10-07T08:16:57+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7e58277b-eefe-4835-9028-55848507956b to /host/opt/cni/bin/\\\\n2025-10-07T08:16:57Z [verbose] multus-daemon started\\\\n2025-10-07T08:16:57Z [verbose] 
Readiness Indicator file check\\\\n2025-10-07T08:17:42Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx7qx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:43Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.610877 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2383c50-e82f-4445-8d28-ec90e8d95c47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f31fa57b4eaaccec309a01396ec164dc6bda3e12a1086eda7a2d0eb3233eb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6e7353bb02996d8525befd61dad46a459fca7713e536cec9b1fb9f47d99d08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ceb048af5928a4b50cec5e93738471dd356f20cf57363fe085df0bb4216100f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://848397bbf66b4edb13083c81defed30b9ddaff1a1636e2f1fe09b25c8ff99d96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:43Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.631616 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.631650 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.631661 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.631682 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.631694 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:43Z","lastTransitionTime":"2025-10-07T08:17:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.631664 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://473d5cdd405367504396c404f60bbdecda93dbe6edbc8ddbfc18bbcc2fdc4d2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f423a0
c357e0279ca41d112d6091586ed91d3826f4ae250ec2aa4973554968\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:43Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.666258 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://602fe6537fb9b526977c2d6ae30458382313c2b5f1506e6b7ffe684ffe5be2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d54834b1ba5650aa397affa83e56e871bbbea4b8c383627ae00bea648595864f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aca416ed3fc89df9948a2f7b02101dfb3bae8d8b22b7c4d270559a4cb6895dc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aca8d824b0935ba844137095ac849a38a15dc5c0dc808fae28afbff543f6429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e826406584133efe2a695c4c7ff16dc6919a1ba0cda077273f6e1eafbf0f5c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81fc1872da168050a2483e46a2ce55f2f745eca15c901a77783a830609f6c0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18be63a068d62a2b2d02653ce552ec9e0b8db3cf5b0be40238b373087e88b61e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18be63a068d62a2b2d02653ce552ec9e0b8db3cf5b0be40238b373087e88b61e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T08:17:19Z\\\",\\\"message\\\":\\\"lt network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default 
node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:19Z is after 2025-08-24T17:21:41Z]\\\\nI1007 08:17:19.781116 6730 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/control-plane-machine-set-operator\\\\\\\"}\\\\nI1007 08:17:19.781864 6730 services_controller.go:434] Service openshift-console/downloads retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{downloads openshift-console bbc81ad7-5d87-40bf-82c5-a4db2311cff9 12322 0 2025-02-23 05:39:22 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[] map[operator.openshift.io/spec-hash:41d6e4f36bf41ab5be57dec2289f1f8807bbed4b0f642342f213a53bb3ff4d6d] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePor\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:17:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dwm22_openshift-ovn-kubernetes(8b6b9c75-ecfe-4815-b279-bb56f57a82a8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26e633e0a41a5f44c577b8e1a9494d58bb84b49015a7612684fc81e5872fd20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9
e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dwm22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:43Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.686933 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42b679f5-3b2e-42bf-9dde-970af25b4297\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a35a698f6fc982aef52becc7aaf63bb0cca25bf9dab6f23bcbd51712b3dbc89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e520023c7c07f8525b1ac012f6ac7e385413ecd13b4db61a52862dfe77349140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e520023c7c07f8525b1ac012f6ac7e385413ecd13b4db61a52862dfe77349140\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:43Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.707437 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:43Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.727704 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l2k8t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab742e4ca12e3a7400e7592d95be4f82046c78d1056d6e1f06d261704286f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f66d
58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f66d58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fff0f5b081b5a61c30757d6065a06224f7edcd0adfdab4ca6abfc6ea207a660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fff0f5b081b5a61c30757d6065a06224f7edcd0adfdab4ca6abfc6ea207a660\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:17:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:17:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l2k8t\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:43Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.734206 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.734266 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.734285 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.734310 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.734329 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:43Z","lastTransitionTime":"2025-10-07T08:17:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.748188 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90a24131-77fe-4045-9cea-eebd7e122243\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efea2132bc68971348829ebb4222793ae3ccf57d5eeee316aa81292cbb051594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1c01fb63c054d4e9e2b12ab9d335d5c6a79c3c62c96bcf3bca6a5f9a4b750\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ce5e9618132944289f6a22c7aeaee88704f6352fe3461c55de21681f406662\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://167a11cb83ad658e06fbfebaa8e1675cec48026b094037f552cb4b86f3fe364e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b579d694e2e51fb1d49b8d690d572518d7fcd3476d7e6941323b70987b8edfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 08:16:47.574279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 08:16:47.577103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2852886688/tls.crt::/tmp/serving-cert-2852886688/tls.key\\\\\\\"\\\\nI1007 08:16:53.183291 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 08:16:53.189611 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 08:16:53.189643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 08:16:53.189670 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 08:16:53.189676 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 08:16:53.204072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 08:16:53.204130 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204143 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204157 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 08:16:53.204166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 08:16:53.204173 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 08:16:53.204182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 08:16:53.204715 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 08:16:53.207373 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52afc5d765ed4544c55e6e522b190cb1c6788fc270ef55c8242f42b63c4d2702\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:43Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.837657 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.837757 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.837766 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.837781 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.837793 5025 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:43Z","lastTransitionTime":"2025-10-07T08:17:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.913622 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.913670 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:17:43 crc kubenswrapper[5025]: E1007 08:17:43.913785 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 08:17:43 crc kubenswrapper[5025]: E1007 08:17:43.913942 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.914059 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 08:17:43 crc kubenswrapper[5025]: E1007 08:17:43.914109 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.937287 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90a24131-77fe-4045-9cea-eebd7e122243\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efea2132bc68971348829ebb4222793ae3ccf57d5eeee316aa81292cbb051594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771a
ee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1c01fb63c054d4e9e2b12ab9d335d5c6a79c3c62c96bcf3bca6a5f9a4b750\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ce5e9618132944289f6a22c7aeaee88704f6352fe3461c55de21681f406662\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://167a11cb83ad658e06fbfebaa8e1675cec48026b094037f552cb4b86f3fe364e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b579d694e2e51fb1d49b8d690d572518d7fcd3476d7e6941323b70987b8edfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 08:16:47.574279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 08:16:47.577103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2852886688/tls.crt::/tmp/serving-cert-2852886688/tls.key\\\\\\\"\\\\nI1007 08:16:53.183291 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 08:16:53.189611 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 08:16:53.189643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 08:16:53.189670 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 08:16:53.189676 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 08:16:53.204072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 
08:16:53.204130 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204143 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204157 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 08:16:53.204166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 08:16:53.204173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 08:16:53.204182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 08:16:53.204715 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 08:16:53.207373 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52afc5d765ed4544c55e6e522b190cb1c6788fc270ef55c8242f42b63c4d2702\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:43Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.940349 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.940377 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:43 crc 
kubenswrapper[5025]: I1007 08:17:43.940387 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.940402 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.940416 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:43Z","lastTransitionTime":"2025-10-07T08:17:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.956036 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:43Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.972782 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l2k8t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab742e4ca12e3a7400e7592d95be4f82046c78d1056d6e1f06d261704286f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f66d
58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f66d58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fff0f5b081b5a61c30757d6065a06224f7edcd0adfdab4ca6abfc6ea207a660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fff0f5b081b5a61c30757d6065a06224f7edcd0adfdab4ca6abfc6ea207a660\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:17:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:17:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l2k8t\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:43Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:43 crc kubenswrapper[5025]: I1007 08:17:43.988520 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4849c41-22e1-400e-8e11-096da49ef1b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc23369548789187b7057cd9780703dbc892dcf8aaaac969f6049a15ab0656e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54b2a654066f14b0433198403ba42b83c61922de1055c94ba1d38aad9e0c2821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2dj2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:43Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:44 crc kubenswrapper[5025]: 
I1007 08:17:44.001052 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jf557" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea42823f-6b19-439f-a280-80e5b1a816c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e6ec573b8a9fb1fd6ab7a370488b1b68024a033b0cfc728cf7e8288aacb4c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrdl\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jf557\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:43Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.020294 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w8khj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee70c47a-d783-4a9f-8e94-5fc68eda69fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1540ed9540ce18c4ad3c5c7d82a3916528d100cc9d7ec1ec5758511f03e6f10\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ckns7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2f25f9baf4b018c73a5655ca3e595d123cd0ffaae9060ea0a34148711b1be0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ckns7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.
126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:17:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-w8khj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:44Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.035775 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffe327a6-74bb-4da3-b864-93cc954bde0e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbd9a4d40a19f7571919228601897e666db895791868d65b27e90588add3daee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://919574d16a08760217aa219e673b02a059d96379aed81ad6248fb0623f672d06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd4a8d9d14761162acc1940fccd21269820427fce37c520cd9e5d6c0b7279e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"
hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d45b57ffbb81dd048d051a94a5c7371321a120b93c8b2b47643ff33acb73a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d45b57ffbb81dd048d051a94a5c7371321a120b93c8b2b47643ff33acb73a79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:44Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.043478 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.043572 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.043593 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 
08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.043619 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.043638 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:44Z","lastTransitionTime":"2025-10-07T08:17:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.055525 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e04a2eadbde2c6c8df1913f15a1efbebef80b3e34c256f7ff9249479f19eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operat
or\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:44Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.073650 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61571dd78435a0514b8674028c7c467678e6ee4d1c6b334a98c6d1ddbf4871d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T08:17:44Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.090080 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hc88w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2744919e-82fb-4c3c-8776-c2c9c44af6e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afa525d43c3e5aafdbdf20fdac7eb16579649323ad50ae564f0266fb2e69ffd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-swvcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hc88w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:44Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.107227 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-f4ls7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93fdeab4-b5d2-42d8-97ca-d5d61032e19f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:17:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-f4ls7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:44Z is after 
2025-08-24T17:21:41Z" Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.129455 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2383c50-e82f-4445-8d28-ec90e8d95c47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f31fa57b4eaaccec309a01396ec164dc6bda3e12a1086eda7a2d0eb3233eb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6e7353bb02996d8525befd61
dad46a459fca7713e536cec9b1fb9f47d99d08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ceb048af5928a4b50cec5e93738471dd356f20cf57363fe085df0bb4216100f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://848397bbf66b4edb13083c81defed30b9ddaff1a1636e2f1fe09b25c8ff99d96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac
117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:44Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.146645 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:44Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.147326 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.147379 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.147392 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.147412 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.147424 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:44Z","lastTransitionTime":"2025-10-07T08:17:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.162490 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:44Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.181800 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmhw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34b07a69-1bbf-4019-b824-7b5be0f9404d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79252af7f0e70293c6c57e8731796bdf2108b3074f9cc3a5627c41eb246032c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aee164f46abc9772ed12e1ba5e768ee44e25cc7444a58000075819f45cb4ba11\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T08:17:42Z\\\",\\\"message\\\":\\\"2025-10-07T08:16:57+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7e58277b-eefe-4835-9028-55848507956b\\\\n2025-10-07T08:16:57+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7e58277b-eefe-4835-9028-55848507956b to /host/opt/cni/bin/\\\\n2025-10-07T08:16:57Z [verbose] multus-daemon started\\\\n2025-10-07T08:16:57Z [verbose] 
Readiness Indicator file check\\\\n2025-10-07T08:17:42Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx7qx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:44Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.194317 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42b679f5-3b2e-42bf-9dde-970af25b4297\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a35a698f6fc982aef52becc7aaf63bb0cca25bf9dab6f23bcbd51712b3dbc89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e520023c7c07f8525b1ac012f6ac7e385413ecd13b4db61a52862dfe77349140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e520023c7c07f8525b1ac012f6ac7e385413ecd13b4db61a52862dfe77349140\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:44Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.210862 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://473d5cdd405367504396c404f60bbdecda93dbe6edbc8ddbfc18bbcc2fdc4d2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f423a0c357e0279ca41d112d6091586ed91d3826f4ae250ec2aa4973554968\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:44Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.237241 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://602fe6537fb9b526977c2d6ae30458382313c2b5f1506e6b7ffe684ffe5be2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d54834b1ba5650aa397affa83e56e871bbbea4b8c383627ae00bea648595864f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aca416ed3fc89df9948a2f7b02101dfb3bae8d8b22b7c4d270559a4cb6895dc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aca8d824b0935ba844137095ac849a38a15dc5c0dc808fae28afbff543f6429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e826406584133efe2a695c4c7ff16dc6919a1ba0cda077273f6e1eafbf0f5c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81fc1872da168050a2483e46a2ce55f2f745eca15c901a77783a830609f6c0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18be63a068d62a2b2d02653ce552ec9e0b8db3cf5b0be40238b373087e88b61e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18be63a068d62a2b2d02653ce552ec9e0b8db3cf5b0be40238b373087e88b61e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T08:17:19Z\\\",\\\"message\\\":\\\"lt network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default 
node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:19Z is after 2025-08-24T17:21:41Z]\\\\nI1007 08:17:19.781116 6730 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/control-plane-machine-set-operator\\\\\\\"}\\\\nI1007 08:17:19.781864 6730 services_controller.go:434] Service openshift-console/downloads retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{downloads openshift-console bbc81ad7-5d87-40bf-82c5-a4db2311cff9 12322 0 2025-02-23 05:39:22 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[] map[operator.openshift.io/spec-hash:41d6e4f36bf41ab5be57dec2289f1f8807bbed4b0f642342f213a53bb3ff4d6d] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePor\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:17:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dwm22_openshift-ovn-kubernetes(8b6b9c75-ecfe-4815-b279-bb56f57a82a8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26e633e0a41a5f44c577b8e1a9494d58bb84b49015a7612684fc81e5872fd20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9
e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dwm22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:44Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.250478 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.250683 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.250754 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.250792 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.250815 5025 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:44Z","lastTransitionTime":"2025-10-07T08:17:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.355356 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.355397 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.355406 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.355422 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.355432 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:44Z","lastTransitionTime":"2025-10-07T08:17:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.458306 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.458347 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.458356 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.458370 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.458379 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:44Z","lastTransitionTime":"2025-10-07T08:17:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.561201 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.561264 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.561284 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.561308 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.561324 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:44Z","lastTransitionTime":"2025-10-07T08:17:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.664502 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.664712 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.664739 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.664773 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.664794 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:44Z","lastTransitionTime":"2025-10-07T08:17:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.766700 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.766732 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.766761 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.766775 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.766784 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:44Z","lastTransitionTime":"2025-10-07T08:17:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.869170 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.869210 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.869218 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.869234 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.869243 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:44Z","lastTransitionTime":"2025-10-07T08:17:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.913828 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4ls7" Oct 07 08:17:44 crc kubenswrapper[5025]: E1007 08:17:44.913985 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f4ls7" podUID="93fdeab4-b5d2-42d8-97ca-d5d61032e19f" Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.971953 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.971985 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.971995 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.972009 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:44 crc kubenswrapper[5025]: I1007 08:17:44.972018 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:44Z","lastTransitionTime":"2025-10-07T08:17:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:45 crc kubenswrapper[5025]: I1007 08:17:45.074387 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:45 crc kubenswrapper[5025]: I1007 08:17:45.074427 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:45 crc kubenswrapper[5025]: I1007 08:17:45.074438 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:45 crc kubenswrapper[5025]: I1007 08:17:45.074456 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:45 crc kubenswrapper[5025]: I1007 08:17:45.074467 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:45Z","lastTransitionTime":"2025-10-07T08:17:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:45 crc kubenswrapper[5025]: I1007 08:17:45.177370 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:45 crc kubenswrapper[5025]: I1007 08:17:45.177427 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:45 crc kubenswrapper[5025]: I1007 08:17:45.177436 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:45 crc kubenswrapper[5025]: I1007 08:17:45.177452 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:45 crc kubenswrapper[5025]: I1007 08:17:45.177464 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:45Z","lastTransitionTime":"2025-10-07T08:17:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:45 crc kubenswrapper[5025]: I1007 08:17:45.280220 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:45 crc kubenswrapper[5025]: I1007 08:17:45.280281 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:45 crc kubenswrapper[5025]: I1007 08:17:45.280311 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:45 crc kubenswrapper[5025]: I1007 08:17:45.280356 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:45 crc kubenswrapper[5025]: I1007 08:17:45.280379 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:45Z","lastTransitionTime":"2025-10-07T08:17:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:45 crc kubenswrapper[5025]: I1007 08:17:45.383768 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:45 crc kubenswrapper[5025]: I1007 08:17:45.384675 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:45 crc kubenswrapper[5025]: I1007 08:17:45.384714 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:45 crc kubenswrapper[5025]: I1007 08:17:45.384737 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:45 crc kubenswrapper[5025]: I1007 08:17:45.384750 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:45Z","lastTransitionTime":"2025-10-07T08:17:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:45 crc kubenswrapper[5025]: I1007 08:17:45.487394 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:45 crc kubenswrapper[5025]: I1007 08:17:45.487837 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:45 crc kubenswrapper[5025]: I1007 08:17:45.488001 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:45 crc kubenswrapper[5025]: I1007 08:17:45.488138 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:45 crc kubenswrapper[5025]: I1007 08:17:45.488278 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:45Z","lastTransitionTime":"2025-10-07T08:17:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:45 crc kubenswrapper[5025]: I1007 08:17:45.591873 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:45 crc kubenswrapper[5025]: I1007 08:17:45.591924 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:45 crc kubenswrapper[5025]: I1007 08:17:45.591934 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:45 crc kubenswrapper[5025]: I1007 08:17:45.591949 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:45 crc kubenswrapper[5025]: I1007 08:17:45.591960 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:45Z","lastTransitionTime":"2025-10-07T08:17:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:45 crc kubenswrapper[5025]: I1007 08:17:45.694633 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:45 crc kubenswrapper[5025]: I1007 08:17:45.694681 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:45 crc kubenswrapper[5025]: I1007 08:17:45.694698 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:45 crc kubenswrapper[5025]: I1007 08:17:45.694722 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:45 crc kubenswrapper[5025]: I1007 08:17:45.694739 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:45Z","lastTransitionTime":"2025-10-07T08:17:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:45 crc kubenswrapper[5025]: I1007 08:17:45.797961 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:45 crc kubenswrapper[5025]: I1007 08:17:45.798029 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:45 crc kubenswrapper[5025]: I1007 08:17:45.798054 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:45 crc kubenswrapper[5025]: I1007 08:17:45.798084 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:45 crc kubenswrapper[5025]: I1007 08:17:45.798117 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:45Z","lastTransitionTime":"2025-10-07T08:17:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:45 crc kubenswrapper[5025]: I1007 08:17:45.900722 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:45 crc kubenswrapper[5025]: I1007 08:17:45.900755 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:45 crc kubenswrapper[5025]: I1007 08:17:45.900764 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:45 crc kubenswrapper[5025]: I1007 08:17:45.900778 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:45 crc kubenswrapper[5025]: I1007 08:17:45.900788 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:45Z","lastTransitionTime":"2025-10-07T08:17:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:45 crc kubenswrapper[5025]: I1007 08:17:45.914028 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 08:17:45 crc kubenswrapper[5025]: I1007 08:17:45.914051 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:17:45 crc kubenswrapper[5025]: I1007 08:17:45.914078 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:17:45 crc kubenswrapper[5025]: E1007 08:17:45.914135 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 08:17:45 crc kubenswrapper[5025]: E1007 08:17:45.914249 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 08:17:45 crc kubenswrapper[5025]: E1007 08:17:45.914355 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 08:17:46 crc kubenswrapper[5025]: I1007 08:17:46.002726 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:46 crc kubenswrapper[5025]: I1007 08:17:46.002766 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:46 crc kubenswrapper[5025]: I1007 08:17:46.002776 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:46 crc kubenswrapper[5025]: I1007 08:17:46.002791 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:46 crc kubenswrapper[5025]: I1007 08:17:46.002800 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:46Z","lastTransitionTime":"2025-10-07T08:17:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:46 crc kubenswrapper[5025]: I1007 08:17:46.105377 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:46 crc kubenswrapper[5025]: I1007 08:17:46.105432 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:46 crc kubenswrapper[5025]: I1007 08:17:46.105449 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:46 crc kubenswrapper[5025]: I1007 08:17:46.105471 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:46 crc kubenswrapper[5025]: I1007 08:17:46.105487 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:46Z","lastTransitionTime":"2025-10-07T08:17:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:46 crc kubenswrapper[5025]: I1007 08:17:46.208002 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:46 crc kubenswrapper[5025]: I1007 08:17:46.208054 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:46 crc kubenswrapper[5025]: I1007 08:17:46.208071 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:46 crc kubenswrapper[5025]: I1007 08:17:46.208093 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:46 crc kubenswrapper[5025]: I1007 08:17:46.208110 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:46Z","lastTransitionTime":"2025-10-07T08:17:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:46 crc kubenswrapper[5025]: I1007 08:17:46.310972 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:46 crc kubenswrapper[5025]: I1007 08:17:46.311032 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:46 crc kubenswrapper[5025]: I1007 08:17:46.311054 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:46 crc kubenswrapper[5025]: I1007 08:17:46.311083 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:46 crc kubenswrapper[5025]: I1007 08:17:46.311102 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:46Z","lastTransitionTime":"2025-10-07T08:17:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:46 crc kubenswrapper[5025]: I1007 08:17:46.413418 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:46 crc kubenswrapper[5025]: I1007 08:17:46.413459 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:46 crc kubenswrapper[5025]: I1007 08:17:46.413471 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:46 crc kubenswrapper[5025]: I1007 08:17:46.413494 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:46 crc kubenswrapper[5025]: I1007 08:17:46.413505 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:46Z","lastTransitionTime":"2025-10-07T08:17:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:46 crc kubenswrapper[5025]: I1007 08:17:46.515628 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:46 crc kubenswrapper[5025]: I1007 08:17:46.515670 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:46 crc kubenswrapper[5025]: I1007 08:17:46.515679 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:46 crc kubenswrapper[5025]: I1007 08:17:46.515693 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:46 crc kubenswrapper[5025]: I1007 08:17:46.515703 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:46Z","lastTransitionTime":"2025-10-07T08:17:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:46 crc kubenswrapper[5025]: I1007 08:17:46.619171 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:46 crc kubenswrapper[5025]: I1007 08:17:46.619233 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:46 crc kubenswrapper[5025]: I1007 08:17:46.619250 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:46 crc kubenswrapper[5025]: I1007 08:17:46.619277 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:46 crc kubenswrapper[5025]: I1007 08:17:46.619295 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:46Z","lastTransitionTime":"2025-10-07T08:17:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:46 crc kubenswrapper[5025]: I1007 08:17:46.722368 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:46 crc kubenswrapper[5025]: I1007 08:17:46.722431 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:46 crc kubenswrapper[5025]: I1007 08:17:46.722449 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:46 crc kubenswrapper[5025]: I1007 08:17:46.722473 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:46 crc kubenswrapper[5025]: I1007 08:17:46.722489 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:46Z","lastTransitionTime":"2025-10-07T08:17:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:46 crc kubenswrapper[5025]: I1007 08:17:46.825136 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:46 crc kubenswrapper[5025]: I1007 08:17:46.825193 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:46 crc kubenswrapper[5025]: I1007 08:17:46.825211 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:46 crc kubenswrapper[5025]: I1007 08:17:46.825235 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:46 crc kubenswrapper[5025]: I1007 08:17:46.825255 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:46Z","lastTransitionTime":"2025-10-07T08:17:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:46 crc kubenswrapper[5025]: I1007 08:17:46.914060 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4ls7" Oct 07 08:17:46 crc kubenswrapper[5025]: E1007 08:17:46.914186 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f4ls7" podUID="93fdeab4-b5d2-42d8-97ca-d5d61032e19f" Oct 07 08:17:46 crc kubenswrapper[5025]: I1007 08:17:46.929497 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:46 crc kubenswrapper[5025]: I1007 08:17:46.930048 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:46 crc kubenswrapper[5025]: I1007 08:17:46.930252 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:46 crc kubenswrapper[5025]: I1007 08:17:46.930292 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:46 crc kubenswrapper[5025]: I1007 08:17:46.930312 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:46Z","lastTransitionTime":"2025-10-07T08:17:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:47 crc kubenswrapper[5025]: I1007 08:17:47.032960 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:47 crc kubenswrapper[5025]: I1007 08:17:47.032991 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:47 crc kubenswrapper[5025]: I1007 08:17:47.032999 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:47 crc kubenswrapper[5025]: I1007 08:17:47.033011 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:47 crc kubenswrapper[5025]: I1007 08:17:47.033020 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:47Z","lastTransitionTime":"2025-10-07T08:17:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:47 crc kubenswrapper[5025]: I1007 08:17:47.135162 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:47 crc kubenswrapper[5025]: I1007 08:17:47.135233 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:47 crc kubenswrapper[5025]: I1007 08:17:47.135257 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:47 crc kubenswrapper[5025]: I1007 08:17:47.135289 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:47 crc kubenswrapper[5025]: I1007 08:17:47.135311 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:47Z","lastTransitionTime":"2025-10-07T08:17:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:47 crc kubenswrapper[5025]: I1007 08:17:47.237375 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:47 crc kubenswrapper[5025]: I1007 08:17:47.237427 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:47 crc kubenswrapper[5025]: I1007 08:17:47.237436 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:47 crc kubenswrapper[5025]: I1007 08:17:47.237453 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:47 crc kubenswrapper[5025]: I1007 08:17:47.237463 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:47Z","lastTransitionTime":"2025-10-07T08:17:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:47 crc kubenswrapper[5025]: I1007 08:17:47.341030 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:47 crc kubenswrapper[5025]: I1007 08:17:47.341077 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:47 crc kubenswrapper[5025]: I1007 08:17:47.341094 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:47 crc kubenswrapper[5025]: I1007 08:17:47.341116 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:47 crc kubenswrapper[5025]: I1007 08:17:47.341133 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:47Z","lastTransitionTime":"2025-10-07T08:17:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:47 crc kubenswrapper[5025]: I1007 08:17:47.444382 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:47 crc kubenswrapper[5025]: I1007 08:17:47.444436 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:47 crc kubenswrapper[5025]: I1007 08:17:47.444450 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:47 crc kubenswrapper[5025]: I1007 08:17:47.444468 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:47 crc kubenswrapper[5025]: I1007 08:17:47.444480 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:47Z","lastTransitionTime":"2025-10-07T08:17:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:47 crc kubenswrapper[5025]: I1007 08:17:47.546920 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:47 crc kubenswrapper[5025]: I1007 08:17:47.546951 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:47 crc kubenswrapper[5025]: I1007 08:17:47.546959 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:47 crc kubenswrapper[5025]: I1007 08:17:47.546974 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:47 crc kubenswrapper[5025]: I1007 08:17:47.546986 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:47Z","lastTransitionTime":"2025-10-07T08:17:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:47 crc kubenswrapper[5025]: I1007 08:17:47.648886 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:47 crc kubenswrapper[5025]: I1007 08:17:47.648918 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:47 crc kubenswrapper[5025]: I1007 08:17:47.648926 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:47 crc kubenswrapper[5025]: I1007 08:17:47.648938 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:47 crc kubenswrapper[5025]: I1007 08:17:47.648948 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:47Z","lastTransitionTime":"2025-10-07T08:17:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:47 crc kubenswrapper[5025]: I1007 08:17:47.751078 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:47 crc kubenswrapper[5025]: I1007 08:17:47.751147 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:47 crc kubenswrapper[5025]: I1007 08:17:47.751164 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:47 crc kubenswrapper[5025]: I1007 08:17:47.751200 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:47 crc kubenswrapper[5025]: I1007 08:17:47.751220 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:47Z","lastTransitionTime":"2025-10-07T08:17:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:47 crc kubenswrapper[5025]: I1007 08:17:47.853759 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:47 crc kubenswrapper[5025]: I1007 08:17:47.853789 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:47 crc kubenswrapper[5025]: I1007 08:17:47.853799 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:47 crc kubenswrapper[5025]: I1007 08:17:47.853812 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:47 crc kubenswrapper[5025]: I1007 08:17:47.853822 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:47Z","lastTransitionTime":"2025-10-07T08:17:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:47 crc kubenswrapper[5025]: I1007 08:17:47.913625 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:17:47 crc kubenswrapper[5025]: I1007 08:17:47.913688 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 08:17:47 crc kubenswrapper[5025]: I1007 08:17:47.913687 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:17:47 crc kubenswrapper[5025]: E1007 08:17:47.913838 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 08:17:47 crc kubenswrapper[5025]: E1007 08:17:47.913950 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 08:17:47 crc kubenswrapper[5025]: E1007 08:17:47.914034 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 08:17:47 crc kubenswrapper[5025]: I1007 08:17:47.956992 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:47 crc kubenswrapper[5025]: I1007 08:17:47.957055 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:47 crc kubenswrapper[5025]: I1007 08:17:47.957078 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:47 crc kubenswrapper[5025]: I1007 08:17:47.957108 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:47 crc kubenswrapper[5025]: I1007 08:17:47.957130 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:47Z","lastTransitionTime":"2025-10-07T08:17:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:48 crc kubenswrapper[5025]: I1007 08:17:48.059982 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:48 crc kubenswrapper[5025]: I1007 08:17:48.060019 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:48 crc kubenswrapper[5025]: I1007 08:17:48.060028 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:48 crc kubenswrapper[5025]: I1007 08:17:48.060045 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:48 crc kubenswrapper[5025]: I1007 08:17:48.060057 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:48Z","lastTransitionTime":"2025-10-07T08:17:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:48 crc kubenswrapper[5025]: I1007 08:17:48.162731 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:48 crc kubenswrapper[5025]: I1007 08:17:48.162792 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:48 crc kubenswrapper[5025]: I1007 08:17:48.162813 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:48 crc kubenswrapper[5025]: I1007 08:17:48.162839 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:48 crc kubenswrapper[5025]: I1007 08:17:48.162861 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:48Z","lastTransitionTime":"2025-10-07T08:17:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:48 crc kubenswrapper[5025]: I1007 08:17:48.265710 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:48 crc kubenswrapper[5025]: I1007 08:17:48.265759 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:48 crc kubenswrapper[5025]: I1007 08:17:48.265779 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:48 crc kubenswrapper[5025]: I1007 08:17:48.265803 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:48 crc kubenswrapper[5025]: I1007 08:17:48.265821 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:48Z","lastTransitionTime":"2025-10-07T08:17:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:48 crc kubenswrapper[5025]: I1007 08:17:48.367482 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:48 crc kubenswrapper[5025]: I1007 08:17:48.367507 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:48 crc kubenswrapper[5025]: I1007 08:17:48.367516 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:48 crc kubenswrapper[5025]: I1007 08:17:48.367528 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:48 crc kubenswrapper[5025]: I1007 08:17:48.367535 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:48Z","lastTransitionTime":"2025-10-07T08:17:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:48 crc kubenswrapper[5025]: I1007 08:17:48.470475 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:48 crc kubenswrapper[5025]: I1007 08:17:48.470532 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:48 crc kubenswrapper[5025]: I1007 08:17:48.470589 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:48 crc kubenswrapper[5025]: I1007 08:17:48.470614 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:48 crc kubenswrapper[5025]: I1007 08:17:48.470632 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:48Z","lastTransitionTime":"2025-10-07T08:17:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:48 crc kubenswrapper[5025]: I1007 08:17:48.573504 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:48 crc kubenswrapper[5025]: I1007 08:17:48.573598 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:48 crc kubenswrapper[5025]: I1007 08:17:48.573616 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:48 crc kubenswrapper[5025]: I1007 08:17:48.573641 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:48 crc kubenswrapper[5025]: I1007 08:17:48.573659 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:48Z","lastTransitionTime":"2025-10-07T08:17:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:48 crc kubenswrapper[5025]: I1007 08:17:48.676402 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:48 crc kubenswrapper[5025]: I1007 08:17:48.676444 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:48 crc kubenswrapper[5025]: I1007 08:17:48.676489 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:48 crc kubenswrapper[5025]: I1007 08:17:48.676503 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:48 crc kubenswrapper[5025]: I1007 08:17:48.676511 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:48Z","lastTransitionTime":"2025-10-07T08:17:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:48 crc kubenswrapper[5025]: I1007 08:17:48.778894 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:48 crc kubenswrapper[5025]: I1007 08:17:48.778961 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:48 crc kubenswrapper[5025]: I1007 08:17:48.778979 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:48 crc kubenswrapper[5025]: I1007 08:17:48.779012 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:48 crc kubenswrapper[5025]: I1007 08:17:48.779029 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:48Z","lastTransitionTime":"2025-10-07T08:17:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:48 crc kubenswrapper[5025]: I1007 08:17:48.881577 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:48 crc kubenswrapper[5025]: I1007 08:17:48.881610 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:48 crc kubenswrapper[5025]: I1007 08:17:48.881620 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:48 crc kubenswrapper[5025]: I1007 08:17:48.881634 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:48 crc kubenswrapper[5025]: I1007 08:17:48.881644 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:48Z","lastTransitionTime":"2025-10-07T08:17:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:48 crc kubenswrapper[5025]: I1007 08:17:48.913760 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4ls7" Oct 07 08:17:48 crc kubenswrapper[5025]: E1007 08:17:48.914367 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f4ls7" podUID="93fdeab4-b5d2-42d8-97ca-d5d61032e19f" Oct 07 08:17:48 crc kubenswrapper[5025]: I1007 08:17:48.914575 5025 scope.go:117] "RemoveContainer" containerID="18be63a068d62a2b2d02653ce552ec9e0b8db3cf5b0be40238b373087e88b61e" Oct 07 08:17:48 crc kubenswrapper[5025]: I1007 08:17:48.984205 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:48 crc kubenswrapper[5025]: I1007 08:17:48.984240 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:48 crc kubenswrapper[5025]: I1007 08:17:48.984249 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:48 crc kubenswrapper[5025]: I1007 08:17:48.984262 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:48 crc kubenswrapper[5025]: I1007 08:17:48.984271 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:48Z","lastTransitionTime":"2025-10-07T08:17:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.087141 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.087212 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.087233 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.087257 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.087274 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:49Z","lastTransitionTime":"2025-10-07T08:17:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.188946 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.188979 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.188990 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.189007 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.189019 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:49Z","lastTransitionTime":"2025-10-07T08:17:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.291355 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.291398 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.291409 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.291426 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.291437 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:49Z","lastTransitionTime":"2025-10-07T08:17:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.393750 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.393788 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.393798 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.393815 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.393825 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:49Z","lastTransitionTime":"2025-10-07T08:17:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.483822 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dwm22_8b6b9c75-ecfe-4815-b279-bb56f57a82a8/ovnkube-controller/2.log" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.486282 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" event={"ID":"8b6b9c75-ecfe-4815-b279-bb56f57a82a8","Type":"ContainerStarted","Data":"49dfbab16b681fd6273294b7ececcd31c7feb0d79b66129ceb05740c3a478abe"} Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.487067 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.496739 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.496771 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.496785 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.496802 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.496815 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:49Z","lastTransitionTime":"2025-10-07T08:17:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.511064 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:49Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.531300 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l2k8t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab742e4ca12e3a7400e7592d95be4f82046c78d1056d6e1f06d261704286f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f66d
58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f66d58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fff0f5b081b5a61c30757d6065a06224f7edcd0adfdab4ca6abfc6ea207a660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fff0f5b081b5a61c30757d6065a06224f7edcd0adfdab4ca6abfc6ea207a660\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:17:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:17:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l2k8t\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:49Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.553666 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90a24131-77fe-4045-9cea-eebd7e122243\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efea2132bc68971348829ebb4222793ae3ccf57d5eeee316aa81292cbb051594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1c01fb63c054d4e9e2b12ab9d335d5c6a79c3c62c96bcf3bca6a5f9a4b750\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ce5e9618132944289f6a22c7aeaee88704f6352fe3461c55de21681f406662\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://167a11cb83ad658e06fbfebaa8e1675cec48026b094037f552cb4b86f3fe364e\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b579d694e2e51fb1d49b8d690d572518d7fcd3476d7e6941323b70987b8edfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 08:16:47.574279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 08:16:47.577103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2852886688/tls.crt::/tmp/serving-cert-2852886688/tls.key\\\\\\\"\\\\nI1007 08:16:53.183291 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 08:16:53.189611 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 08:16:53.189643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 08:16:53.189670 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 08:16:53.189676 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 08:16:53.204072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 08:16:53.204130 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204143 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204157 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 08:16:53.204166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 08:16:53.204173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 08:16:53.204182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 08:16:53.204715 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 08:16:53.207373 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52afc5d765ed4544c55e6e522b190cb1c6788fc270ef55c8242f42b63c4d2702\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:49Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.571785 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffe327a6-74bb-4da3-b864-93cc954bde0e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbd9a4d40a19f7571919228601897e666db895791868d65b27e90588add3daee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://919574d16a08760217aa219e673b02a059d96379aed81ad6248fb0623f672d06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd4a8d9d14761162acc1940fccd21269820427fce37c520cd9e5d6c0b7279e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d45b57ffbb81dd048d051a94a5c7371321a120b93c8b2b47643ff33acb73a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9d45b57ffbb81dd048d051a94a5c7371321a120b93c8b2b47643ff33acb73a79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:49Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.592214 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e04a2eadbde2c6c8df1913f15a1efbebef80b3e34c256f7ff9249479f19eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:49Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.599900 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.599953 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.599963 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.599981 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.599991 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:49Z","lastTransitionTime":"2025-10-07T08:17:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.609306 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61571dd78435a0514b8674028c7c467678e6ee4d1c6b334a98c6d1ddbf4871d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:49Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.625902 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hc88w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2744919e-82fb-4c3c-8776-c2c9c44af6e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afa525d43c3e5aafdbdf20fdac7eb16579649323ad50ae564f0266fb2e69ffd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swvcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hc88w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:49Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.641341 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4849c41-22e1-400e-8e11-096da49ef1b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc23369548789187b7057cd9780703dbc892dcf8aaaac969f6049a15ab0656e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54b2a654066f14b0433198403ba42b83c61922de
1055c94ba1d38aad9e0c2821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2dj2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:49Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.658383 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jf557" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea42823f-6b19-439f-a280-80e5b1a816c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e6ec573b8a9fb1fd6ab7a370488b1b68024a033b0cfc728cf7e8288aacb4c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jf557\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:49Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.672948 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w8khj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee70c47a-d783-4a9f-8e94-5fc68eda69fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1540ed9540ce18c4ad3c5c7d82a3916528d100cc9d7ec1ec5758511f03e6f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ckns7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2f25f9baf4b018c73a5655ca3e595d123cd0ffaae9060ea0a34148711b1be0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ckns7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:17:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-w8khj\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:49Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.687752 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-f4ls7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93fdeab4-b5d2-42d8-97ca-d5d61032e19f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:17:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-f4ls7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:49Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:49 crc 
kubenswrapper[5025]: I1007 08:17:49.703371 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.703433 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.703444 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.703465 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.703478 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:49Z","lastTransitionTime":"2025-10-07T08:17:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.704289 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:49Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.718810 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:49Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.736898 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmhw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34b07a69-1bbf-4019-b824-7b5be0f9404d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79252af7f0e70293c6c57e8731796bdf2108b3074f9cc3a5627c41eb246032c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aee164f46abc9772ed12e1ba5e768ee44e25cc7444a58000075819f45cb4ba11\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T08:17:42Z\\\",\\\"message\\\":\\\"2025-10-07T08:16:57+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7e58277b-eefe-4835-9028-55848507956b\\\\n2025-10-07T08:16:57+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7e58277b-eefe-4835-9028-55848507956b to /host/opt/cni/bin/\\\\n2025-10-07T08:16:57Z [verbose] multus-daemon started\\\\n2025-10-07T08:16:57Z [verbose] 
Readiness Indicator file check\\\\n2025-10-07T08:17:42Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx7qx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:49Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.755035 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2383c50-e82f-4445-8d28-ec90e8d95c47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f31fa57b4eaaccec309a01396ec164dc6bda3e12a1086eda7a2d0eb3233eb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6e7353bb02996d8525befd61dad46a459fca7713e536cec9b1fb9f47d99d08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ceb048af5928a4b50cec5e93738471dd356f20cf57363fe085df0bb4216100f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://848397bbf66b4edb13083c81defed30b9ddaff1a1636e2f1fe09b25c8ff99d96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:49Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.769892 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://473d5cdd405367504396c404f60bbdecda93dbe6edbc8ddbfc18bbcc2fdc4d2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f423a0c357e0279ca41d112d6091586ed91d3826f4ae250ec2aa4973554968\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:49Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.836664 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.836758 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.836783 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.836821 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.836841 5025 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:49Z","lastTransitionTime":"2025-10-07T08:17:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.838683 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://602fe6537fb9b526977c2d6ae30458382313c2b5f1506e6b7ffe684ffe5be2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d54834b1ba5650aa397affa83e56e871bbbea4b8c383627ae00bea648595864f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aca416ed3fc89df9948a2f7b02101dfb3bae8d8b22b7c4d270559a4cb6895dc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aca8d824b0935ba844137095ac849a38a15dc5c0dc808fae28afbff543f6429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e826406584133efe2a695c4c7ff16dc6919a1ba0cda077273f6e1eafbf0f5c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81fc1872da168050a2483e46a2ce55f2f745eca15c901a77783a830609f6c0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49dfbab16b681fd6273294b7ececcd31c7feb0d79b66129ceb05740c3a478abe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18be63a068d62a2b2d02653ce552ec9e0b8db3cf5b0be40238b373087e88b61e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T08:17:19Z\\\",\\\"message\\\":\\\"lt network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default 
node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:19Z is after 2025-08-24T17:21:41Z]\\\\nI1007 08:17:19.781116 6730 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/control-plane-machine-set-operator\\\\\\\"}\\\\nI1007 08:17:19.781864 6730 services_controller.go:434] Service openshift-console/downloads retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{downloads openshift-console bbc81ad7-5d87-40bf-82c5-a4db2311cff9 12322 0 2025-02-23 05:39:22 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[] map[operator.openshift.io/spec-hash:41d6e4f36bf41ab5be57dec2289f1f8807bbed4b0f642342f213a53bb3ff4d6d] [] [] 
[]},Spec:ServiceSpec{Ports:[]ServicePort{ServicePor\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:17:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26e633e0a41a5f44c577b8e1a9494d58bb84b49015a7612684fc81e5872fd20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dwm22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:49Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.853855 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42b679f5-3b2e-42bf-9dde-970af25b4297\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a35a698f6fc982aef52becc7aaf63bb0cca25bf9dab6f23bcbd51712b3dbc89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e520023c7c07f8525b1ac012f6ac7e385413ecd13b4db61a52862dfe77349140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e520023c7c07f8525b1ac012f6ac7e385413ecd13b4db61a52862dfe77349140\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:49Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.914191 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.914226 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.914302 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:17:49 crc kubenswrapper[5025]: E1007 08:17:49.914399 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 08:17:49 crc kubenswrapper[5025]: E1007 08:17:49.914645 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 08:17:49 crc kubenswrapper[5025]: E1007 08:17:49.914959 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.939664 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.939726 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.939745 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.939774 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:49 crc kubenswrapper[5025]: I1007 08:17:49.939792 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:49Z","lastTransitionTime":"2025-10-07T08:17:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.042949 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.043039 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.043063 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.043096 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.043120 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:50Z","lastTransitionTime":"2025-10-07T08:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.146949 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.147062 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.147089 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.147120 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.147140 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:50Z","lastTransitionTime":"2025-10-07T08:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.250254 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.250316 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.250328 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.250351 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.250364 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:50Z","lastTransitionTime":"2025-10-07T08:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.353182 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.353230 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.353241 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.353256 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.353266 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:50Z","lastTransitionTime":"2025-10-07T08:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.456843 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.456914 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.456933 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.456963 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.456983 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:50Z","lastTransitionTime":"2025-10-07T08:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.499023 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dwm22_8b6b9c75-ecfe-4815-b279-bb56f57a82a8/ovnkube-controller/3.log" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.500180 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dwm22_8b6b9c75-ecfe-4815-b279-bb56f57a82a8/ovnkube-controller/2.log" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.504201 5025 generic.go:334] "Generic (PLEG): container finished" podID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerID="49dfbab16b681fd6273294b7ececcd31c7feb0d79b66129ceb05740c3a478abe" exitCode=1 Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.504271 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" event={"ID":"8b6b9c75-ecfe-4815-b279-bb56f57a82a8","Type":"ContainerDied","Data":"49dfbab16b681fd6273294b7ececcd31c7feb0d79b66129ceb05740c3a478abe"} Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.504364 5025 scope.go:117] "RemoveContainer" containerID="18be63a068d62a2b2d02653ce552ec9e0b8db3cf5b0be40238b373087e88b61e" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.505770 5025 scope.go:117] "RemoveContainer" containerID="49dfbab16b681fd6273294b7ececcd31c7feb0d79b66129ceb05740c3a478abe" Oct 07 08:17:50 crc kubenswrapper[5025]: E1007 08:17:50.506132 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-dwm22_openshift-ovn-kubernetes(8b6b9c75-ecfe-4815-b279-bb56f57a82a8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.528402 5025 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90a24131-77fe-4045-9cea-eebd7e122243\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efea2132bc68971348829ebb4222793ae3ccf57d5eeee316aa81292cbb051594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1c01fb63c054d4e9e2b12ab9d335d5c6a79c3c62c96bcf3bca6a5f9a4b750\\\",\\\"image\\\":\\\"quay.io/crcont/o
penshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ce5e9618132944289f6a22c7aeaee88704f6352fe3461c55de21681f406662\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://167a11cb83ad658e06fbfebaa8e1675cec48026b094037f552cb4b86f3fe364e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b579d694e2e51fb1d49b8d690d572518d7fcd3476d
7e6941323b70987b8edfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 08:16:47.574279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 08:16:47.577103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2852886688/tls.crt::/tmp/serving-cert-2852886688/tls.key\\\\\\\"\\\\nI1007 08:16:53.183291 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 08:16:53.189611 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 08:16:53.189643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 08:16:53.189670 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 08:16:53.189676 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 08:16:53.204072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 08:16:53.204130 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204143 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204157 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 08:16:53.204166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 08:16:53.204173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 08:16:53.204182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' 
detected.\\\\nI1007 08:16:53.204715 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 08:16:53.207373 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52afc5d765ed4544c55e6e522b190cb1c6788fc270ef55c8242f42b63c4d2702\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"s
tate\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:50Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.550184 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:50Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.560517 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.560682 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.560704 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 
08:17:50.560733 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.560755 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:50Z","lastTransitionTime":"2025-10-07T08:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.574466 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l2k8t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab742e4ca12e3a7400e7592d95be4f82046c78d1056d6e1f06d261704286f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcab
b9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f66d58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f66d58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fff0f5b081b5a61c30757d6065a06224f7edcd0adfdab4ca6abfc6ea207a660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fff0f5b081b5a61c30757d6065a0
6224f7edcd0adfdab4ca6abfc6ea207a660\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:17:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:17:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l2k8t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:50Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.593952 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61571dd78435a0514b8674028c7c467678e6ee4d1c6b334a98c6d1ddbf4871d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T08:17:50Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.608686 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hc88w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2744919e-82fb-4c3c-8776-c2c9c44af6e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afa525d43c3e5aafdbdf20fdac7eb16579649323ad50ae564f0266fb2e69ffd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-swvcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hc88w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:50Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.627708 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4849c41-22e1-400e-8e11-096da49ef1b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc23369548789187b7057cd9780703
dbc892dcf8aaaac969f6049a15ab0656e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54b2a654066f14b0433198403ba42b83c61922de1055c94ba1d38aad9e0c2821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
5-10-07T08:16:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2dj2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:50Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.646027 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jf557" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea42823f-6b19-439f-a280-80e5b1a816c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e6ec573b8a9fb1fd6ab7a370488b1b68024a033b0cfc728cf7e8288aacb4c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\
\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jf557\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:50Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.662245 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w8khj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee70c47a-d783-4a9f-8e94-5fc68eda69fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1540ed9540ce18c4ad3c5c7d82a3916528d100cc9d7ec1ec5758511f03e6f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ckns7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2f25f9baf4b018c73a5655ca3e595d123cd
0ffaae9060ea0a34148711b1be0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ckns7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:17:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-w8khj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:50Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.664112 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.664173 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.664193 5025 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.664222 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.664240 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:50Z","lastTransitionTime":"2025-10-07T08:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.681284 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffe327a6-74bb-4da3-b864-93cc954bde0e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbd9a4d40a19f7571919228601897e666db895791868d65b27e90588add3daee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://919574d16a08760217aa219e673b02a059d96379aed81ad6248fb0623f672d06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd4a8d9d14761162acc1940fccd21269820427fce37c520cd9e5d6c0b7279e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\"
:{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d45b57ffbb81dd048d051a94a5c7371321a120b93c8b2b47643ff33acb73a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d45b57ffbb81dd048d051a94a5c7371321a120b93c8b2b47643ff33acb73a79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:50Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.699177 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e04a2eadbde2c6c8df1913f15a1efbebef80b3e34c256f7ff9249479f19eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:50Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.717116 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-f4ls7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93fdeab4-b5d2-42d8-97ca-d5d61032e19f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:17:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-f4ls7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:50Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:50 crc 
kubenswrapper[5025]: I1007 08:17:50.737658 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmhw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34b07a69-1bbf-4019-b824-7b5be0f9404d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79252af7f0e70293c6c57e8731796bdf2108b3074f9cc3a5627c41eb246032c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aee164f46abc9772ed12e1ba5e768ee44e25cc7444a58000075819f45cb4ba11\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T08:17:42Z\\\",\\\"message\\\":\\\"2025-10-07T08:16:57+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7e58277b-eefe-4835-9028-55848507956b\\\\n2025-10-07T08:16:57+00:00 [cnibincopy] Successfully moved files in 
/host/opt/cni/bin/upgrade_7e58277b-eefe-4835-9028-55848507956b to /host/opt/cni/bin/\\\\n2025-10-07T08:16:57Z [verbose] multus-daemon started\\\\n2025-10-07T08:16:57Z [verbose] Readiness Indicator file check\\\\n2025-10-07T08:17:42Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\
\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx7qx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:50Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.758513 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2383c50-e82f-4445-8d28-ec90e8d95c47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f31fa57b4eaaccec309a01396ec164dc6bda3e12a1086eda7a2d0eb3233eb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6e7353bb02996d8525befd61dad46a459fca7713e536cec9b1fb9f47d99d08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ceb048af5928a4b50cec5e93738471dd356f20cf57363fe085df0bb4216100f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://848397bbf66b4edb13083c81defed30b9ddaff1a1636e2f1fe09b25c8ff99d96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:50Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.768398 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.768506 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.768528 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.768607 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.768631 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:50Z","lastTransitionTime":"2025-10-07T08:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.781210 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:50Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.801411 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:50Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.817529 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42b679f5-3b2e-42bf-9dde-970af25b4297\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a35a698f6fc982aef52becc7aaf63bb0cca25bf9dab6f23bcbd51712b3dbc89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e520023c7c07f8525b1ac012f6ac7e385413ecd13b4db61a52862dfe77349140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e520023c7c07f8525b1ac012f6ac7e385413ecd13b4db61a52862dfe77349140\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:50Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.842097 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://473d5cdd405367504396c404f60bbdecda93dbe6edbc8ddbfc18bbcc2fdc4d2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f423a0c357e0279ca41d112d6091586ed91d3826f4ae250ec2aa4973554968\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:50Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.868594 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://602fe6537fb9b526977c2d6ae30458382313c2b5f1506e6b7ffe684ffe5be2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d54834b1ba5650aa397affa83e56e871bbbea4b8c383627ae00bea648595864f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aca416ed3fc89df9948a2f7b02101dfb3bae8d8b22b7c4d270559a4cb6895dc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aca8d824b0935ba844137095ac849a38a15dc5c0dc808fae28afbff543f6429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e826406584133efe2a695c4c7ff16dc6919a1ba0cda077273f6e1eafbf0f5c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81fc1872da168050a2483e46a2ce55f2f745eca15c901a77783a830609f6c0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49dfbab16b681fd6273294b7ececcd31c7feb0d79b66129ceb05740c3a478abe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18be63a068d62a2b2d02653ce552ec9e0b8db3cf5b0be40238b373087e88b61e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T08:17:19Z\\\",\\\"message\\\":\\\"lt network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default 
node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:19Z is after 2025-08-24T17:21:41Z]\\\\nI1007 08:17:19.781116 6730 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/control-plane-machine-set-operator\\\\\\\"}\\\\nI1007 08:17:19.781864 6730 services_controller.go:434] Service openshift-console/downloads retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{downloads openshift-console bbc81ad7-5d87-40bf-82c5-a4db2311cff9 12322 0 2025-02-23 05:39:22 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[] map[operator.openshift.io/spec-hash:41d6e4f36bf41ab5be57dec2289f1f8807bbed4b0f642342f213a53bb3ff4d6d] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePor\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:17:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49dfbab16b681fd6273294b7ececcd31c7feb0d79b66129ceb05740c3a478abe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T08:17:49Z\\\",\\\"message\\\":\\\" handler 8\\\\nI1007 08:17:49.960697 7095 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1007 08:17:49.960729 7095 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1007 08:17:49.961099 7095 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1007 08:17:49.961167 7095 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1007 08:17:49.961243 7095 handler.go:190] Sending *v1.NetworkPolicy event handler 
4 for removal\\\\nI1007 08:17:49.961276 7095 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1007 08:17:49.961287 7095 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1007 08:17:49.961312 7095 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1007 08:17:49.961334 7095 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1007 08:17:49.961392 7095 factory.go:656] Stopping watch factory\\\\nI1007 08:17:49.961450 7095 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 08:17:49.961450 7095 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1007 08:17:49.961456 7095 ovnkube.go:599] Stopped ovnkube\\\\nI1007 08:17:49.961480 7095 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1007 08:17:49.961518 7095 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1007 08:17:49.961651 7095 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-
cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26e633e0a41a5f44c577b8e1a9494d58bb84b49015a7612684fc81e5872fd20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},
{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dwm22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:50Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.871452 5025 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.871639 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.871664 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.871689 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.871707 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:50Z","lastTransitionTime":"2025-10-07T08:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.914420 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4ls7" Oct 07 08:17:50 crc kubenswrapper[5025]: E1007 08:17:50.914666 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f4ls7" podUID="93fdeab4-b5d2-42d8-97ca-d5d61032e19f" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.974608 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.974664 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.974676 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.974695 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:50 crc kubenswrapper[5025]: I1007 08:17:50.974712 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:50Z","lastTransitionTime":"2025-10-07T08:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.077884 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.077949 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.077973 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.078001 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.078023 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:51Z","lastTransitionTime":"2025-10-07T08:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.181219 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.181299 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.181324 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.181353 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.181376 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:51Z","lastTransitionTime":"2025-10-07T08:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.284176 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.284252 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.284277 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.284309 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.284358 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:51Z","lastTransitionTime":"2025-10-07T08:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.344968 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.345045 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.345070 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.345101 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.345126 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:51Z","lastTransitionTime":"2025-10-07T08:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:51 crc kubenswrapper[5025]: E1007 08:17:51.368051 5025 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"15feb688-c1d9-4e7b-b633-9a128b7afc98\\\",\\\"systemUUID\\\":\\\"18315730-39a6-4b53-82b9-587e1e3a7adc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:51Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.373795 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.373874 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.373899 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.373930 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.373953 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:51Z","lastTransitionTime":"2025-10-07T08:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:51 crc kubenswrapper[5025]: E1007 08:17:51.393249 5025 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"15feb688-c1d9-4e7b-b633-9a128b7afc98\\\",\\\"systemUUID\\\":\\\"18315730-39a6-4b53-82b9-587e1e3a7adc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:51Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.398856 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.398916 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.398931 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.398953 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.398973 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:51Z","lastTransitionTime":"2025-10-07T08:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:51 crc kubenswrapper[5025]: E1007 08:17:51.418193 5025 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"15feb688-c1d9-4e7b-b633-9a128b7afc98\\\",\\\"systemUUID\\\":\\\"18315730-39a6-4b53-82b9-587e1e3a7adc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:51Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.423972 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.424175 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.424365 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.424529 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.424741 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:51Z","lastTransitionTime":"2025-10-07T08:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:51 crc kubenswrapper[5025]: E1007 08:17:51.448193 5025 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"15feb688-c1d9-4e7b-b633-9a128b7afc98\\\",\\\"systemUUID\\\":\\\"18315730-39a6-4b53-82b9-587e1e3a7adc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:51Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.453201 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.453260 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.453281 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.453310 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.453327 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:51Z","lastTransitionTime":"2025-10-07T08:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:51 crc kubenswrapper[5025]: E1007 08:17:51.480696 5025 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:17:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"15feb688-c1d9-4e7b-b633-9a128b7afc98\\\",\\\"systemUUID\\\":\\\"18315730-39a6-4b53-82b9-587e1e3a7adc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:51Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:51 crc kubenswrapper[5025]: E1007 08:17:51.480931 5025 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.484157 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.484354 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.484631 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.484804 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.484988 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:51Z","lastTransitionTime":"2025-10-07T08:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.510597 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dwm22_8b6b9c75-ecfe-4815-b279-bb56f57a82a8/ovnkube-controller/3.log" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.516904 5025 scope.go:117] "RemoveContainer" containerID="49dfbab16b681fd6273294b7ececcd31c7feb0d79b66129ceb05740c3a478abe" Oct 07 08:17:51 crc kubenswrapper[5025]: E1007 08:17:51.517410 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-dwm22_openshift-ovn-kubernetes(8b6b9c75-ecfe-4815-b279-bb56f57a82a8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.541964 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:51Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.565642 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmhw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34b07a69-1bbf-4019-b824-7b5be0f9404d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79252af7f0e70293c6c57e8731796bdf2108b3074f9cc3a5627c41eb246032c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aee164f46abc9772ed12e1ba5e768ee44e25cc7444a58000075819f45cb4ba11\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T08:17:42Z\\\",\\\"message\\\":\\\"2025-10-07T08:16:57+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7e58277b-eefe-4835-9028-55848507956b\\\\n2025-10-07T08:16:57+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7e58277b-eefe-4835-9028-55848507956b to /host/opt/cni/bin/\\\\n2025-10-07T08:16:57Z [verbose] multus-daemon started\\\\n2025-10-07T08:16:57Z [verbose] 
Readiness Indicator file check\\\\n2025-10-07T08:17:42Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx7qx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:51Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.589036 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.589095 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.589113 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.589139 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.589157 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:51Z","lastTransitionTime":"2025-10-07T08:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.589635 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2383c50-e82f-4445-8d28-ec90e8d95c47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f31fa57b4eaaccec309a01396ec164dc6bda3e12a1086eda7a2d0eb3233eb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6e7353bb0
2996d8525befd61dad46a459fca7713e536cec9b1fb9f47d99d08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ceb048af5928a4b50cec5e93738471dd356f20cf57363fe085df0bb4216100f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://848397bbf66b4edb13083c81defed30b9ddaff1a1636e2f1fe09b25c8ff99d96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:51Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.614715 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:51Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.647261 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://602fe6537fb9b526977c2d6ae30458382313c2b5f1506e6b7ffe684ffe5be2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d54834b1ba5650aa397affa83e56e871bbbea4b8c383627ae00bea648595864f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aca416ed3fc89df9948a2f7b02101dfb3bae8d8b22b7c4d270559a4cb6895dc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aca8d824b0935ba844137095ac849a38a15dc5c0dc808fae28afbff543f6429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e826406584133efe2a695c4c7ff16dc6919a1ba0cda077273f6e1eafbf0f5c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81fc1872da168050a2483e46a2ce55f2f745eca15c901a77783a830609f6c0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49dfbab16b681fd6273294b7ececcd31c7feb0d79b66129ceb05740c3a478abe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49dfbab16b681fd6273294b7ececcd31c7feb0d79b66129ceb05740c3a478abe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T08:17:49Z\\\",\\\"message\\\":\\\" handler 8\\\\nI1007 08:17:49.960697 7095 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1007 08:17:49.960729 7095 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1007 08:17:49.961099 7095 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1007 08:17:49.961167 7095 handler.go:190] Sending *v1.EgressFirewall event handler 9 
for removal\\\\nI1007 08:17:49.961243 7095 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1007 08:17:49.961276 7095 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1007 08:17:49.961287 7095 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1007 08:17:49.961312 7095 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1007 08:17:49.961334 7095 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1007 08:17:49.961392 7095 factory.go:656] Stopping watch factory\\\\nI1007 08:17:49.961450 7095 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 08:17:49.961450 7095 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1007 08:17:49.961456 7095 ovnkube.go:599] Stopped ovnkube\\\\nI1007 08:17:49.961480 7095 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1007 08:17:49.961518 7095 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1007 08:17:49.961651 7095 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:17:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dwm22_openshift-ovn-kubernetes(8b6b9c75-ecfe-4815-b279-bb56f57a82a8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26e633e0a41a5f44c577b8e1a9494d58bb84b49015a7612684fc81e5872fd20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9
e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dwm22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:51Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.666584 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42b679f5-3b2e-42bf-9dde-970af25b4297\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a35a698f6fc982aef52becc7aaf63bb0cca25bf9dab6f23bcbd51712b3dbc89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e520023c7c07f8525b1ac012f6ac7e385413ecd13b4db61a52862dfe77349140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e520023c7c07f8525b1ac012f6ac7e385413ecd13b4db61a52862dfe77349140\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:51Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.682133 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://473d5cdd405367504396c404f60bbdecda93dbe6edbc8ddbfc18bbcc2fdc4d2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f423a0c357e0279ca41d112d6091586ed91d3826f4ae250ec2aa4973554968\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:51Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.692061 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.692087 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.692098 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.692113 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.692124 5025 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:51Z","lastTransitionTime":"2025-10-07T08:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.704602 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l2k8t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab742e4ca12e3a7400e7592d95be4f82046c78d1056d6e1f06d261704286f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f66d58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f66d58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fff0f5b081b5a61c30757d6065a06224f7edcd0adfdab4ca6abfc6ea207a660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fff0f5b081b5a61c30757d6065a06224f7edcd0adfdab4ca6abfc6ea207a660\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:17:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:17:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l2k8t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:51Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.723662 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90a24131-77fe-4045-9cea-eebd7e122243\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efea2132bc68971348829ebb4222793ae3ccf57d5eeee316aa81292cbb051594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1c01fb63c054d4e9e2b12ab9d335d5c6a79c3c62c96bcf3bca6a5f9a4b750\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ce5e9618132944289f6a22c7aeaee88704f6352fe3461c55de21681f406662\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://167a11cb83ad658e06fbfebaa8e1675cec48026b094037f552cb4b86f3fe364e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b579d694e2e51fb1d49b8d690d572518d7fcd3476d7e6941323b70987b8edfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T08:16:53Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 08:16:47.574279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 08:16:47.577103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2852886688/tls.crt::/tmp/serving-cert-2852886688/tls.key\\\\\\\"\\\\nI1007 08:16:53.183291 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 08:16:53.189611 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 08:16:53.189643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 08:16:53.189670 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 08:16:53.189676 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 08:16:53.204072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 08:16:53.204130 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204143 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204157 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 08:16:53.204166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 08:16:53.204173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 08:16:53.204182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 08:16:53.204715 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1007 08:16:53.207373 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52afc5d765ed4544c55e6e522b190cb1c6788fc270ef55c8242f42b63c4d2702\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297
e58dd224b8cdb5927e39370c816fd40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:51Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.746029 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:51Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.770369 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e04a2eadbde2c6c8df1913f15a1efbebef80b3e34c256f7ff9249479f19eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:51Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.791311 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61571dd78435a0514b8674028c7c467678e6ee4d1c6b334a98c6d1ddbf4871d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:51Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.795783 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.795864 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.795875 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.795895 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.795909 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:51Z","lastTransitionTime":"2025-10-07T08:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.811295 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hc88w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2744919e-82fb-4c3c-8776-c2c9c44af6e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afa525d43c3e5aafdbdf20fdac7eb16579649323ad50ae564f0266fb2e69ffd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swvcf\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hc88w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:51Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.831110 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4849c41-22e1-400e-8e11-096da49ef1b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc23369548789187b7057cd9780703dbc892dcf8aaaac969f6049a15ab0656e\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54b2a654066f14b0433198403ba42b83c61922de1055c94ba1d38aad9e0c2821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-2dj2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:51Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.849257 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jf557" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea42823f-6b19-439f-a280-80e5b1a816c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e6ec573b8a9fb1fd6ab7a370488b1b68024a033b0cfc728cf7e8288aacb4c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jf557\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:51Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.865861 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w8khj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee70c47a-d783-4a9f-8e94-5fc68eda69fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1540ed9540ce18c4ad3c5c7d82a3916528d100cc9d7ec1ec5758511f03e6f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ckns7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2f25f9baf4b018c73a5655ca3e595d123cd
0ffaae9060ea0a34148711b1be0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ckns7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:17:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-w8khj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:51Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.882169 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffe327a6-74bb-4da3-b864-93cc954bde0e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbd9a4d40a19f7571919228601897e666db895791868d65b27e90588add3daee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://919574d16a08760217aa219e673b02a059d96379aed81ad6248fb0623f672d06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd4a8d9d14761162acc1940fccd21269820427fce37c520cd9e5d6c0b7279e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d45b57ffbb81dd048d051a94a5c7371321a120b93c8b2b47643ff33acb73a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9d45b57ffbb81dd048d051a94a5c7371321a120b93c8b2b47643ff33acb73a79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:51Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.896420 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-f4ls7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93fdeab4-b5d2-42d8-97ca-d5d61032e19f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:17:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-f4ls7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:51Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.898699 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.898736 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.898751 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.898773 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.898789 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:51Z","lastTransitionTime":"2025-10-07T08:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.914696 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.914764 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:17:51 crc kubenswrapper[5025]: I1007 08:17:51.914802 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 08:17:51 crc kubenswrapper[5025]: E1007 08:17:51.914997 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 08:17:51 crc kubenswrapper[5025]: E1007 08:17:51.915056 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 08:17:51 crc kubenswrapper[5025]: E1007 08:17:51.915325 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 08:17:52 crc kubenswrapper[5025]: I1007 08:17:52.002248 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:52 crc kubenswrapper[5025]: I1007 08:17:52.002289 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:52 crc kubenswrapper[5025]: I1007 08:17:52.002301 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:52 crc kubenswrapper[5025]: I1007 08:17:52.002321 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:52 crc kubenswrapper[5025]: I1007 08:17:52.002333 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:52Z","lastTransitionTime":"2025-10-07T08:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:52 crc kubenswrapper[5025]: I1007 08:17:52.106486 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:52 crc kubenswrapper[5025]: I1007 08:17:52.106571 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:52 crc kubenswrapper[5025]: I1007 08:17:52.106597 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:52 crc kubenswrapper[5025]: I1007 08:17:52.106629 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:52 crc kubenswrapper[5025]: I1007 08:17:52.106652 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:52Z","lastTransitionTime":"2025-10-07T08:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:52 crc kubenswrapper[5025]: I1007 08:17:52.210687 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:52 crc kubenswrapper[5025]: I1007 08:17:52.210750 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:52 crc kubenswrapper[5025]: I1007 08:17:52.210769 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:52 crc kubenswrapper[5025]: I1007 08:17:52.210800 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:52 crc kubenswrapper[5025]: I1007 08:17:52.210819 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:52Z","lastTransitionTime":"2025-10-07T08:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:52 crc kubenswrapper[5025]: I1007 08:17:52.314527 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:52 crc kubenswrapper[5025]: I1007 08:17:52.314688 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:52 crc kubenswrapper[5025]: I1007 08:17:52.314713 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:52 crc kubenswrapper[5025]: I1007 08:17:52.314781 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:52 crc kubenswrapper[5025]: I1007 08:17:52.314806 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:52Z","lastTransitionTime":"2025-10-07T08:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:52 crc kubenswrapper[5025]: I1007 08:17:52.417849 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:52 crc kubenswrapper[5025]: I1007 08:17:52.417935 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:52 crc kubenswrapper[5025]: I1007 08:17:52.417955 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:52 crc kubenswrapper[5025]: I1007 08:17:52.417987 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:52 crc kubenswrapper[5025]: I1007 08:17:52.418009 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:52Z","lastTransitionTime":"2025-10-07T08:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:52 crc kubenswrapper[5025]: I1007 08:17:52.520666 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:52 crc kubenswrapper[5025]: I1007 08:17:52.520719 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:52 crc kubenswrapper[5025]: I1007 08:17:52.520736 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:52 crc kubenswrapper[5025]: I1007 08:17:52.520760 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:52 crc kubenswrapper[5025]: I1007 08:17:52.520777 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:52Z","lastTransitionTime":"2025-10-07T08:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:52 crc kubenswrapper[5025]: I1007 08:17:52.623363 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:52 crc kubenswrapper[5025]: I1007 08:17:52.623412 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:52 crc kubenswrapper[5025]: I1007 08:17:52.623429 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:52 crc kubenswrapper[5025]: I1007 08:17:52.623454 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:52 crc kubenswrapper[5025]: I1007 08:17:52.623484 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:52Z","lastTransitionTime":"2025-10-07T08:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:52 crc kubenswrapper[5025]: I1007 08:17:52.726356 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:52 crc kubenswrapper[5025]: I1007 08:17:52.726407 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:52 crc kubenswrapper[5025]: I1007 08:17:52.726423 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:52 crc kubenswrapper[5025]: I1007 08:17:52.726446 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:52 crc kubenswrapper[5025]: I1007 08:17:52.726464 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:52Z","lastTransitionTime":"2025-10-07T08:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:52 crc kubenswrapper[5025]: I1007 08:17:52.829978 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:52 crc kubenswrapper[5025]: I1007 08:17:52.830041 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:52 crc kubenswrapper[5025]: I1007 08:17:52.830062 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:52 crc kubenswrapper[5025]: I1007 08:17:52.830092 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:52 crc kubenswrapper[5025]: I1007 08:17:52.830114 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:52Z","lastTransitionTime":"2025-10-07T08:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:52 crc kubenswrapper[5025]: I1007 08:17:52.914341 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4ls7" Oct 07 08:17:52 crc kubenswrapper[5025]: E1007 08:17:52.914473 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f4ls7" podUID="93fdeab4-b5d2-42d8-97ca-d5d61032e19f" Oct 07 08:17:52 crc kubenswrapper[5025]: I1007 08:17:52.933685 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:52 crc kubenswrapper[5025]: I1007 08:17:52.933762 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:52 crc kubenswrapper[5025]: I1007 08:17:52.933779 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:52 crc kubenswrapper[5025]: I1007 08:17:52.933819 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:52 crc kubenswrapper[5025]: I1007 08:17:52.933870 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:52Z","lastTransitionTime":"2025-10-07T08:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.037098 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.037139 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.037149 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.037165 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.037175 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:53Z","lastTransitionTime":"2025-10-07T08:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.141230 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.141293 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.141312 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.141338 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.141356 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:53Z","lastTransitionTime":"2025-10-07T08:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.244760 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.244802 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.244810 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.244826 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.244835 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:53Z","lastTransitionTime":"2025-10-07T08:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.347715 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.347757 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.347769 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.347786 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.347799 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:53Z","lastTransitionTime":"2025-10-07T08:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.450875 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.450979 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.450996 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.451420 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.451476 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:53Z","lastTransitionTime":"2025-10-07T08:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.554027 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.554120 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.554138 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.554163 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.554180 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:53Z","lastTransitionTime":"2025-10-07T08:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.656534 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.656602 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.656614 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.656631 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.656645 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:53Z","lastTransitionTime":"2025-10-07T08:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.758812 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.758883 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.758904 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.758932 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.758951 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:53Z","lastTransitionTime":"2025-10-07T08:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.861804 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.861841 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.861853 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.861868 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.861881 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:53Z","lastTransitionTime":"2025-10-07T08:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.914721 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 08:17:53 crc kubenswrapper[5025]: E1007 08:17:53.914936 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.915006 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:17:53 crc kubenswrapper[5025]: E1007 08:17:53.915215 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.915489 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:17:53 crc kubenswrapper[5025]: E1007 08:17:53.915652 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.949914 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://602fe6537fb9b526977c2d6ae30458382313c2b5f1506e6b7ffe684ffe5be2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d54834b1ba5650aa397affa83e56e871bbbea4b8c383627ae00bea648595864f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aca416ed3fc89df9948a2f7b02101dfb3bae8d8b22b7c4d270559a4cb6895dc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aca8d824b0935ba844137095ac849a38a15dc5c0dc808fae28afbff543f6429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e826406584133efe2a695c4c7ff16dc6919a1ba0cda077273f6e1eafbf0f5c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81fc1872da168050a2483e46a2ce55f2f745eca15c901a77783a830609f6c0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49dfbab16b681fd6273294b7ececcd31c7feb0d79b66129ceb05740c3a478abe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49dfbab16b681fd6273294b7ececcd31c7feb0d79b66129ceb05740c3a478abe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T08:17:49Z\\\",\\\"message\\\":\\\" handler 8\\\\nI1007 08:17:49.960697 7095 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1007 08:17:49.960729 7095 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1007 08:17:49.961099 7095 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1007 08:17:49.961167 7095 handler.go:190] Sending *v1.EgressFirewall event handler 9 
for removal\\\\nI1007 08:17:49.961243 7095 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1007 08:17:49.961276 7095 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1007 08:17:49.961287 7095 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1007 08:17:49.961312 7095 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1007 08:17:49.961334 7095 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1007 08:17:49.961392 7095 factory.go:656] Stopping watch factory\\\\nI1007 08:17:49.961450 7095 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 08:17:49.961450 7095 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1007 08:17:49.961456 7095 ovnkube.go:599] Stopped ovnkube\\\\nI1007 08:17:49.961480 7095 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1007 08:17:49.961518 7095 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1007 08:17:49.961651 7095 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:17:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dwm22_openshift-ovn-kubernetes(8b6b9c75-ecfe-4815-b279-bb56f57a82a8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26e633e0a41a5f44c577b8e1a9494d58bb84b49015a7612684fc81e5872fd20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9
e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dwm22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:53Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.960281 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42b679f5-3b2e-42bf-9dde-970af25b4297\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a35a698f6fc982aef52becc7aaf63bb0cca25bf9dab6f23bcbd51712b3dbc89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e520023c7c07f8525b1ac012f6ac7e385413ecd13b4db61a52862dfe77349140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e520023c7c07f8525b1ac012f6ac7e385413ecd13b4db61a52862dfe77349140\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:53Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.963980 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.964010 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.964020 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.964035 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.964047 5025 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:53Z","lastTransitionTime":"2025-10-07T08:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.974753 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://473d5cdd405367504396c404f60bbdecda93dbe6edbc8ddbfc18bbcc2fdc4d2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath
\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f423a0c357e0279ca41d112d6091586ed91d3826f4ae250ec2aa4973554968\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:53Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:53 crc kubenswrapper[5025]: I1007 08:17:53.989159 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l2k8t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab742e4ca12e3a7400e7592d95be4f82046c78d1056d6e1f06d261704286f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f66d
58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f66d58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fff0f5b081b5a61c30757d6065a06224f7edcd0adfdab4ca6abfc6ea207a660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fff0f5b081b5a61c30757d6065a06224f7edcd0adfdab4ca6abfc6ea207a660\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:17:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:17:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l2k8t\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:53Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.002236 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90a24131-77fe-4045-9cea-eebd7e122243\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efea2132bc68971348829ebb4222793ae3ccf57d5eeee316aa81292cbb051594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1c01fb63c054d4e9e2b12ab9d335d5c6a79c3c62c96bcf3bca6a5f9a4b750\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ce5e9618132944289f6a22c7aeaee88704f6352fe3461c55de21681f406662\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://167a11cb83ad658e06fbfebaa8e1675cec48026b094037f552cb4b86f3fe364e\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b579d694e2e51fb1d49b8d690d572518d7fcd3476d7e6941323b70987b8edfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 08:16:47.574279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 08:16:47.577103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2852886688/tls.crt::/tmp/serving-cert-2852886688/tls.key\\\\\\\"\\\\nI1007 08:16:53.183291 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 08:16:53.189611 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 08:16:53.189643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 08:16:53.189670 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 08:16:53.189676 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 08:16:53.204072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 08:16:53.204130 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204143 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204157 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 08:16:53.204166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 08:16:53.204173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 08:16:53.204182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 08:16:53.204715 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 08:16:53.207373 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52afc5d765ed4544c55e6e522b190cb1c6788fc270ef55c8242f42b63c4d2702\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:54Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.018660 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:54Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.033958 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e04a2eadbde2c6c8df1913f15a1efbebef80b3e34c256f7ff9249479f19eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:54Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.047627 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61571dd78435a0514b8674028c7c467678e6ee4d1c6b334a98c6d1ddbf4871d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:54Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.057625 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hc88w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2744919e-82fb-4c3c-8776-c2c9c44af6e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afa525d43c3e5aafdbdf20fdac7eb16579649323ad50ae564f0266fb2e69ffd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-no
de-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swvcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hc88w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:54Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.066743 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.066787 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.066801 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.066822 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.066836 5025 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:54Z","lastTransitionTime":"2025-10-07T08:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.072615 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4849c41-22e1-400e-8e11-096da49ef1b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc23369548789187b7057cd9780703dbc892dcf8aaaac969f6049a15ab0656e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54b2a654066f14b0433198403ba42b83c61922de1055c94ba1d38aad9e0c2821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2dj2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-07T08:17:54Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.082747 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jf557" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea42823f-6b19-439f-a280-80e5b1a816c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e6ec573b8a9fb1fd6ab7a370488b1b68024a033b0cfc728cf7e8288aacb4c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jf557\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:54Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.096635 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w8khj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee70c47a-d783-4a9f-8e94-5fc68eda69fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1540ed9540ce18c4ad3c5c7d82a3916528d100cc9d7ec1ec5758511f03e6f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ckns7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2f25f9baf4b018c73a5655ca3e595d123cd
0ffaae9060ea0a34148711b1be0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ckns7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:17:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-w8khj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:54Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.108447 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffe327a6-74bb-4da3-b864-93cc954bde0e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbd9a4d40a19f7571919228601897e666db895791868d65b27e90588add3daee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://919574d16a08760217aa219e673b02a059d96379aed81ad6248fb0623f672d06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd4a8d9d14761162acc1940fccd21269820427fce37c520cd9e5d6c0b7279e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d45b57ffbb81dd048d051a94a5c7371321a120b93c8b2b47643ff33acb73a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9d45b57ffbb81dd048d051a94a5c7371321a120b93c8b2b47643ff33acb73a79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:54Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.118523 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-f4ls7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93fdeab4-b5d2-42d8-97ca-d5d61032e19f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:17:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-f4ls7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:54Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.132270 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:54Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.150026 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmhw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34b07a69-1bbf-4019-b824-7b5be0f9404d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79252af7f0e70293c6c57e8731796bdf2108b3074f9cc3a5627c41eb246032c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aee164f46abc9772ed12e1ba5e768ee44e25cc7444a58000075819f45cb4ba11\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T08:17:42Z\\\",\\\"message\\\":\\\"2025-10-07T08:16:57+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7e58277b-eefe-4835-9028-55848507956b\\\\n2025-10-07T08:16:57+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7e58277b-eefe-4835-9028-55848507956b to /host/opt/cni/bin/\\\\n2025-10-07T08:16:57Z [verbose] multus-daemon started\\\\n2025-10-07T08:16:57Z [verbose] 
Readiness Indicator file check\\\\n2025-10-07T08:17:42Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx7qx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:54Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.167620 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2383c50-e82f-4445-8d28-ec90e8d95c47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f31fa57b4eaaccec309a01396ec164dc6bda3e12a1086eda7a2d0eb3233eb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6e7353bb02996d8525befd61dad46a459fca7713e536cec9b1fb9f47d99d08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ceb048af5928a4b50cec5e93738471dd356f20cf57363fe085df0bb4216100f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://848397bbf66b4edb13083c81defed30b9ddaff1a1636e2f1fe09b25c8ff99d96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:54Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.169990 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.170047 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.170101 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.170127 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.170145 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:54Z","lastTransitionTime":"2025-10-07T08:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.185772 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:17:54Z is after 2025-08-24T17:21:41Z" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.273642 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.273699 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.273716 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.273737 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.273753 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:54Z","lastTransitionTime":"2025-10-07T08:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.377103 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.377162 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.377181 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.377205 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.377223 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:54Z","lastTransitionTime":"2025-10-07T08:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.480206 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.480269 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.480286 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.480310 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.480331 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:54Z","lastTransitionTime":"2025-10-07T08:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.583196 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.583266 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.583287 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.583316 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.583339 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:54Z","lastTransitionTime":"2025-10-07T08:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.686116 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.686485 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.686498 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.686517 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.686527 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:54Z","lastTransitionTime":"2025-10-07T08:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.789474 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.789509 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.789518 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.789535 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.789581 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:54Z","lastTransitionTime":"2025-10-07T08:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.891616 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.891661 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.891671 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.891685 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.891696 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:54Z","lastTransitionTime":"2025-10-07T08:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.914166 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4ls7" Oct 07 08:17:54 crc kubenswrapper[5025]: E1007 08:17:54.914325 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f4ls7" podUID="93fdeab4-b5d2-42d8-97ca-d5d61032e19f" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.994311 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.994342 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.994350 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.994363 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:54 crc kubenswrapper[5025]: I1007 08:17:54.994372 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:54Z","lastTransitionTime":"2025-10-07T08:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:55 crc kubenswrapper[5025]: I1007 08:17:55.097735 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:55 crc kubenswrapper[5025]: I1007 08:17:55.097792 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:55 crc kubenswrapper[5025]: I1007 08:17:55.097813 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:55 crc kubenswrapper[5025]: I1007 08:17:55.097837 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:55 crc kubenswrapper[5025]: I1007 08:17:55.097855 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:55Z","lastTransitionTime":"2025-10-07T08:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:55 crc kubenswrapper[5025]: I1007 08:17:55.200043 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:55 crc kubenswrapper[5025]: I1007 08:17:55.200098 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:55 crc kubenswrapper[5025]: I1007 08:17:55.200117 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:55 crc kubenswrapper[5025]: I1007 08:17:55.200141 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:55 crc kubenswrapper[5025]: I1007 08:17:55.200158 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:55Z","lastTransitionTime":"2025-10-07T08:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:55 crc kubenswrapper[5025]: I1007 08:17:55.302176 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:55 crc kubenswrapper[5025]: I1007 08:17:55.302246 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:55 crc kubenswrapper[5025]: I1007 08:17:55.302262 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:55 crc kubenswrapper[5025]: I1007 08:17:55.302288 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:55 crc kubenswrapper[5025]: I1007 08:17:55.302304 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:55Z","lastTransitionTime":"2025-10-07T08:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:55 crc kubenswrapper[5025]: I1007 08:17:55.405430 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:55 crc kubenswrapper[5025]: I1007 08:17:55.405493 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:55 crc kubenswrapper[5025]: I1007 08:17:55.405506 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:55 crc kubenswrapper[5025]: I1007 08:17:55.405523 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:55 crc kubenswrapper[5025]: I1007 08:17:55.405561 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:55Z","lastTransitionTime":"2025-10-07T08:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:55 crc kubenswrapper[5025]: I1007 08:17:55.508985 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:55 crc kubenswrapper[5025]: I1007 08:17:55.509019 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:55 crc kubenswrapper[5025]: I1007 08:17:55.509030 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:55 crc kubenswrapper[5025]: I1007 08:17:55.509047 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:55 crc kubenswrapper[5025]: I1007 08:17:55.509060 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:55Z","lastTransitionTime":"2025-10-07T08:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:55 crc kubenswrapper[5025]: I1007 08:17:55.611788 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:55 crc kubenswrapper[5025]: I1007 08:17:55.611845 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:55 crc kubenswrapper[5025]: I1007 08:17:55.611857 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:55 crc kubenswrapper[5025]: I1007 08:17:55.611877 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:55 crc kubenswrapper[5025]: I1007 08:17:55.611890 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:55Z","lastTransitionTime":"2025-10-07T08:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:55 crc kubenswrapper[5025]: I1007 08:17:55.714695 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:55 crc kubenswrapper[5025]: I1007 08:17:55.714756 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:55 crc kubenswrapper[5025]: I1007 08:17:55.714772 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:55 crc kubenswrapper[5025]: I1007 08:17:55.714796 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:55 crc kubenswrapper[5025]: I1007 08:17:55.714814 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:55Z","lastTransitionTime":"2025-10-07T08:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:55 crc kubenswrapper[5025]: I1007 08:17:55.817770 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:55 crc kubenswrapper[5025]: I1007 08:17:55.817826 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:55 crc kubenswrapper[5025]: I1007 08:17:55.817842 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:55 crc kubenswrapper[5025]: I1007 08:17:55.817868 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:55 crc kubenswrapper[5025]: I1007 08:17:55.817889 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:55Z","lastTransitionTime":"2025-10-07T08:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:55 crc kubenswrapper[5025]: I1007 08:17:55.914361 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:17:55 crc kubenswrapper[5025]: I1007 08:17:55.914500 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 08:17:55 crc kubenswrapper[5025]: I1007 08:17:55.914387 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:17:55 crc kubenswrapper[5025]: E1007 08:17:55.914628 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 08:17:55 crc kubenswrapper[5025]: E1007 08:17:55.914717 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 08:17:55 crc kubenswrapper[5025]: E1007 08:17:55.914820 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 08:17:55 crc kubenswrapper[5025]: I1007 08:17:55.920325 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:55 crc kubenswrapper[5025]: I1007 08:17:55.920367 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:55 crc kubenswrapper[5025]: I1007 08:17:55.920384 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:55 crc kubenswrapper[5025]: I1007 08:17:55.920406 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:55 crc kubenswrapper[5025]: I1007 08:17:55.920424 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:55Z","lastTransitionTime":"2025-10-07T08:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:56 crc kubenswrapper[5025]: I1007 08:17:56.023995 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:56 crc kubenswrapper[5025]: I1007 08:17:56.024055 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:56 crc kubenswrapper[5025]: I1007 08:17:56.024072 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:56 crc kubenswrapper[5025]: I1007 08:17:56.024100 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:56 crc kubenswrapper[5025]: I1007 08:17:56.024118 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:56Z","lastTransitionTime":"2025-10-07T08:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:56 crc kubenswrapper[5025]: I1007 08:17:56.128606 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:56 crc kubenswrapper[5025]: I1007 08:17:56.128683 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:56 crc kubenswrapper[5025]: I1007 08:17:56.128703 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:56 crc kubenswrapper[5025]: I1007 08:17:56.128732 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:56 crc kubenswrapper[5025]: I1007 08:17:56.128763 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:56Z","lastTransitionTime":"2025-10-07T08:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:56 crc kubenswrapper[5025]: I1007 08:17:56.232751 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:56 crc kubenswrapper[5025]: I1007 08:17:56.232827 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:56 crc kubenswrapper[5025]: I1007 08:17:56.232845 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:56 crc kubenswrapper[5025]: I1007 08:17:56.232869 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:56 crc kubenswrapper[5025]: I1007 08:17:56.232887 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:56Z","lastTransitionTime":"2025-10-07T08:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:56 crc kubenswrapper[5025]: I1007 08:17:56.335420 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:56 crc kubenswrapper[5025]: I1007 08:17:56.335480 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:56 crc kubenswrapper[5025]: I1007 08:17:56.335500 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:56 crc kubenswrapper[5025]: I1007 08:17:56.335526 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:56 crc kubenswrapper[5025]: I1007 08:17:56.335577 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:56Z","lastTransitionTime":"2025-10-07T08:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:56 crc kubenswrapper[5025]: I1007 08:17:56.438199 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:56 crc kubenswrapper[5025]: I1007 08:17:56.438242 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:56 crc kubenswrapper[5025]: I1007 08:17:56.438254 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:56 crc kubenswrapper[5025]: I1007 08:17:56.438270 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:56 crc kubenswrapper[5025]: I1007 08:17:56.438281 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:56Z","lastTransitionTime":"2025-10-07T08:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:56 crc kubenswrapper[5025]: I1007 08:17:56.540190 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:56 crc kubenswrapper[5025]: I1007 08:17:56.540213 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:56 crc kubenswrapper[5025]: I1007 08:17:56.540221 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:56 crc kubenswrapper[5025]: I1007 08:17:56.540235 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:56 crc kubenswrapper[5025]: I1007 08:17:56.540243 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:56Z","lastTransitionTime":"2025-10-07T08:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:56 crc kubenswrapper[5025]: I1007 08:17:56.643071 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:56 crc kubenswrapper[5025]: I1007 08:17:56.643136 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:56 crc kubenswrapper[5025]: I1007 08:17:56.643159 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:56 crc kubenswrapper[5025]: I1007 08:17:56.643191 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:56 crc kubenswrapper[5025]: I1007 08:17:56.643210 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:56Z","lastTransitionTime":"2025-10-07T08:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:56 crc kubenswrapper[5025]: I1007 08:17:56.761898 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:56 crc kubenswrapper[5025]: I1007 08:17:56.761967 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:56 crc kubenswrapper[5025]: I1007 08:17:56.761984 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:56 crc kubenswrapper[5025]: I1007 08:17:56.762006 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:56 crc kubenswrapper[5025]: I1007 08:17:56.762024 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:56Z","lastTransitionTime":"2025-10-07T08:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:56 crc kubenswrapper[5025]: I1007 08:17:56.865143 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:56 crc kubenswrapper[5025]: I1007 08:17:56.865374 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:56 crc kubenswrapper[5025]: I1007 08:17:56.865581 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:56 crc kubenswrapper[5025]: I1007 08:17:56.865758 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:56 crc kubenswrapper[5025]: I1007 08:17:56.865921 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:56Z","lastTransitionTime":"2025-10-07T08:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:56 crc kubenswrapper[5025]: I1007 08:17:56.914142 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4ls7" Oct 07 08:17:56 crc kubenswrapper[5025]: E1007 08:17:56.914323 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f4ls7" podUID="93fdeab4-b5d2-42d8-97ca-d5d61032e19f" Oct 07 08:17:56 crc kubenswrapper[5025]: I1007 08:17:56.968941 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:56 crc kubenswrapper[5025]: I1007 08:17:56.968990 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:56 crc kubenswrapper[5025]: I1007 08:17:56.969002 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:56 crc kubenswrapper[5025]: I1007 08:17:56.969017 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:56 crc kubenswrapper[5025]: I1007 08:17:56.969030 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:56Z","lastTransitionTime":"2025-10-07T08:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:57 crc kubenswrapper[5025]: I1007 08:17:57.072227 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:57 crc kubenswrapper[5025]: I1007 08:17:57.072309 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:57 crc kubenswrapper[5025]: I1007 08:17:57.072333 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:57 crc kubenswrapper[5025]: I1007 08:17:57.072366 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:57 crc kubenswrapper[5025]: I1007 08:17:57.072390 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:57Z","lastTransitionTime":"2025-10-07T08:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:57 crc kubenswrapper[5025]: I1007 08:17:57.175673 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:57 crc kubenswrapper[5025]: I1007 08:17:57.175739 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:57 crc kubenswrapper[5025]: I1007 08:17:57.175757 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:57 crc kubenswrapper[5025]: I1007 08:17:57.175779 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:57 crc kubenswrapper[5025]: I1007 08:17:57.175796 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:57Z","lastTransitionTime":"2025-10-07T08:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:57 crc kubenswrapper[5025]: I1007 08:17:57.278928 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:57 crc kubenswrapper[5025]: I1007 08:17:57.278969 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:57 crc kubenswrapper[5025]: I1007 08:17:57.278978 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:57 crc kubenswrapper[5025]: I1007 08:17:57.278993 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:57 crc kubenswrapper[5025]: I1007 08:17:57.279007 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:57Z","lastTransitionTime":"2025-10-07T08:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:57 crc kubenswrapper[5025]: I1007 08:17:57.382219 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:57 crc kubenswrapper[5025]: I1007 08:17:57.382288 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:57 crc kubenswrapper[5025]: I1007 08:17:57.382311 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:57 crc kubenswrapper[5025]: I1007 08:17:57.382341 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:57 crc kubenswrapper[5025]: I1007 08:17:57.382362 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:57Z","lastTransitionTime":"2025-10-07T08:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:57 crc kubenswrapper[5025]: I1007 08:17:57.485204 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:57 crc kubenswrapper[5025]: I1007 08:17:57.485266 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:57 crc kubenswrapper[5025]: I1007 08:17:57.485289 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:57 crc kubenswrapper[5025]: I1007 08:17:57.485320 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:57 crc kubenswrapper[5025]: I1007 08:17:57.485340 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:57Z","lastTransitionTime":"2025-10-07T08:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:57 crc kubenswrapper[5025]: I1007 08:17:57.588455 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:57 crc kubenswrapper[5025]: I1007 08:17:57.588514 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:57 crc kubenswrapper[5025]: I1007 08:17:57.588531 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:57 crc kubenswrapper[5025]: I1007 08:17:57.588589 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:57 crc kubenswrapper[5025]: I1007 08:17:57.588606 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:57Z","lastTransitionTime":"2025-10-07T08:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:57 crc kubenswrapper[5025]: I1007 08:17:57.692070 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:57 crc kubenswrapper[5025]: I1007 08:17:57.692138 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:57 crc kubenswrapper[5025]: I1007 08:17:57.692157 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:57 crc kubenswrapper[5025]: I1007 08:17:57.692181 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:57 crc kubenswrapper[5025]: I1007 08:17:57.692201 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:57Z","lastTransitionTime":"2025-10-07T08:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:57 crc kubenswrapper[5025]: I1007 08:17:57.751319 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:17:57 crc kubenswrapper[5025]: I1007 08:17:57.751445 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:17:57 crc kubenswrapper[5025]: E1007 08:17:57.751528 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:19:01.751500348 +0000 UTC m=+148.560814522 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:17:57 crc kubenswrapper[5025]: E1007 08:17:57.751584 5025 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 08:17:57 crc kubenswrapper[5025]: E1007 08:17:57.751695 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 08:19:01.751671714 +0000 UTC m=+148.560985868 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 08:17:57 crc kubenswrapper[5025]: I1007 08:17:57.751799 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:17:57 crc kubenswrapper[5025]: E1007 08:17:57.751932 5025 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 08:17:57 crc kubenswrapper[5025]: E1007 08:17:57.751976 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 08:19:01.751966284 +0000 UTC m=+148.561280448 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 08:17:57 crc kubenswrapper[5025]: I1007 08:17:57.794466 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:57 crc kubenswrapper[5025]: I1007 08:17:57.794525 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:57 crc kubenswrapper[5025]: I1007 08:17:57.794578 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:57 crc kubenswrapper[5025]: I1007 08:17:57.794611 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:57 crc kubenswrapper[5025]: I1007 08:17:57.794683 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:57Z","lastTransitionTime":"2025-10-07T08:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:57 crc kubenswrapper[5025]: I1007 08:17:57.853033 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 08:17:57 crc kubenswrapper[5025]: I1007 08:17:57.853097 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:17:57 crc kubenswrapper[5025]: E1007 08:17:57.853247 5025 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 08:17:57 crc kubenswrapper[5025]: E1007 08:17:57.853272 5025 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 08:17:57 crc kubenswrapper[5025]: E1007 08:17:57.853289 5025 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 08:17:57 crc kubenswrapper[5025]: E1007 08:17:57.853316 5025 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 08:17:57 crc 
kubenswrapper[5025]: E1007 08:17:57.853358 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 08:19:01.853339406 +0000 UTC m=+148.662653560 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 08:17:57 crc kubenswrapper[5025]: E1007 08:17:57.853360 5025 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 08:17:57 crc kubenswrapper[5025]: E1007 08:17:57.853390 5025 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 08:17:57 crc kubenswrapper[5025]: E1007 08:17:57.853462 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 08:19:01.85343697 +0000 UTC m=+148.662751154 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 08:17:57 crc kubenswrapper[5025]: I1007 08:17:57.897780 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:57 crc kubenswrapper[5025]: I1007 08:17:57.897831 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:57 crc kubenswrapper[5025]: I1007 08:17:57.897850 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:57 crc kubenswrapper[5025]: I1007 08:17:57.897872 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:57 crc kubenswrapper[5025]: I1007 08:17:57.897890 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:57Z","lastTransitionTime":"2025-10-07T08:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:57 crc kubenswrapper[5025]: I1007 08:17:57.914607 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:17:57 crc kubenswrapper[5025]: I1007 08:17:57.914705 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:17:57 crc kubenswrapper[5025]: E1007 08:17:57.914743 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 08:17:57 crc kubenswrapper[5025]: I1007 08:17:57.914789 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 08:17:57 crc kubenswrapper[5025]: E1007 08:17:57.914951 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 08:17:57 crc kubenswrapper[5025]: E1007 08:17:57.915057 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 08:17:57 crc kubenswrapper[5025]: I1007 08:17:57.934370 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 07 08:17:58 crc kubenswrapper[5025]: I1007 08:17:58.001151 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:58 crc kubenswrapper[5025]: I1007 08:17:58.001220 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:58 crc kubenswrapper[5025]: I1007 08:17:58.001244 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:58 crc kubenswrapper[5025]: I1007 08:17:58.001274 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:58 crc kubenswrapper[5025]: I1007 08:17:58.001297 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:58Z","lastTransitionTime":"2025-10-07T08:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:58 crc kubenswrapper[5025]: I1007 08:17:58.104371 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:58 crc kubenswrapper[5025]: I1007 08:17:58.104416 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:58 crc kubenswrapper[5025]: I1007 08:17:58.104431 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:58 crc kubenswrapper[5025]: I1007 08:17:58.104454 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:58 crc kubenswrapper[5025]: I1007 08:17:58.104470 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:58Z","lastTransitionTime":"2025-10-07T08:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:58 crc kubenswrapper[5025]: I1007 08:17:58.207792 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:58 crc kubenswrapper[5025]: I1007 08:17:58.207851 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:58 crc kubenswrapper[5025]: I1007 08:17:58.207868 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:58 crc kubenswrapper[5025]: I1007 08:17:58.207891 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:58 crc kubenswrapper[5025]: I1007 08:17:58.207909 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:58Z","lastTransitionTime":"2025-10-07T08:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:58 crc kubenswrapper[5025]: I1007 08:17:58.310833 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:58 crc kubenswrapper[5025]: I1007 08:17:58.310904 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:58 crc kubenswrapper[5025]: I1007 08:17:58.310924 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:58 crc kubenswrapper[5025]: I1007 08:17:58.310948 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:58 crc kubenswrapper[5025]: I1007 08:17:58.310968 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:58Z","lastTransitionTime":"2025-10-07T08:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:58 crc kubenswrapper[5025]: I1007 08:17:58.413528 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:58 crc kubenswrapper[5025]: I1007 08:17:58.413712 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:58 crc kubenswrapper[5025]: I1007 08:17:58.413734 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:58 crc kubenswrapper[5025]: I1007 08:17:58.413797 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:58 crc kubenswrapper[5025]: I1007 08:17:58.413815 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:58Z","lastTransitionTime":"2025-10-07T08:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:58 crc kubenswrapper[5025]: I1007 08:17:58.517163 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:58 crc kubenswrapper[5025]: I1007 08:17:58.517204 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:58 crc kubenswrapper[5025]: I1007 08:17:58.517215 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:58 crc kubenswrapper[5025]: I1007 08:17:58.517232 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:58 crc kubenswrapper[5025]: I1007 08:17:58.517242 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:58Z","lastTransitionTime":"2025-10-07T08:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:58 crc kubenswrapper[5025]: I1007 08:17:58.626082 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:58 crc kubenswrapper[5025]: I1007 08:17:58.626137 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:58 crc kubenswrapper[5025]: I1007 08:17:58.626155 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:58 crc kubenswrapper[5025]: I1007 08:17:58.626177 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:58 crc kubenswrapper[5025]: I1007 08:17:58.626195 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:58Z","lastTransitionTime":"2025-10-07T08:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:58 crc kubenswrapper[5025]: I1007 08:17:58.729068 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:58 crc kubenswrapper[5025]: I1007 08:17:58.729109 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:58 crc kubenswrapper[5025]: I1007 08:17:58.729121 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:58 crc kubenswrapper[5025]: I1007 08:17:58.729138 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:58 crc kubenswrapper[5025]: I1007 08:17:58.729151 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:58Z","lastTransitionTime":"2025-10-07T08:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:58 crc kubenswrapper[5025]: I1007 08:17:58.831870 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:58 crc kubenswrapper[5025]: I1007 08:17:58.831928 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:58 crc kubenswrapper[5025]: I1007 08:17:58.831945 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:58 crc kubenswrapper[5025]: I1007 08:17:58.831967 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:58 crc kubenswrapper[5025]: I1007 08:17:58.831984 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:58Z","lastTransitionTime":"2025-10-07T08:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:58 crc kubenswrapper[5025]: I1007 08:17:58.913862 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4ls7" Oct 07 08:17:58 crc kubenswrapper[5025]: E1007 08:17:58.914039 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f4ls7" podUID="93fdeab4-b5d2-42d8-97ca-d5d61032e19f" Oct 07 08:17:58 crc kubenswrapper[5025]: I1007 08:17:58.934653 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:58 crc kubenswrapper[5025]: I1007 08:17:58.934712 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:58 crc kubenswrapper[5025]: I1007 08:17:58.934728 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:58 crc kubenswrapper[5025]: I1007 08:17:58.934751 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:58 crc kubenswrapper[5025]: I1007 08:17:58.934772 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:58Z","lastTransitionTime":"2025-10-07T08:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:59 crc kubenswrapper[5025]: I1007 08:17:59.037669 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:59 crc kubenswrapper[5025]: I1007 08:17:59.038041 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:59 crc kubenswrapper[5025]: I1007 08:17:59.038051 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:59 crc kubenswrapper[5025]: I1007 08:17:59.038066 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:59 crc kubenswrapper[5025]: I1007 08:17:59.038079 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:59Z","lastTransitionTime":"2025-10-07T08:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:59 crc kubenswrapper[5025]: I1007 08:17:59.141290 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:59 crc kubenswrapper[5025]: I1007 08:17:59.141331 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:59 crc kubenswrapper[5025]: I1007 08:17:59.141341 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:59 crc kubenswrapper[5025]: I1007 08:17:59.141357 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:59 crc kubenswrapper[5025]: I1007 08:17:59.141367 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:59Z","lastTransitionTime":"2025-10-07T08:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:59 crc kubenswrapper[5025]: I1007 08:17:59.243336 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:59 crc kubenswrapper[5025]: I1007 08:17:59.243382 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:59 crc kubenswrapper[5025]: I1007 08:17:59.243394 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:59 crc kubenswrapper[5025]: I1007 08:17:59.243409 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:59 crc kubenswrapper[5025]: I1007 08:17:59.243419 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:59Z","lastTransitionTime":"2025-10-07T08:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:59 crc kubenswrapper[5025]: I1007 08:17:59.346096 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:59 crc kubenswrapper[5025]: I1007 08:17:59.346160 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:59 crc kubenswrapper[5025]: I1007 08:17:59.346173 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:59 crc kubenswrapper[5025]: I1007 08:17:59.346189 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:59 crc kubenswrapper[5025]: I1007 08:17:59.346199 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:59Z","lastTransitionTime":"2025-10-07T08:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:59 crc kubenswrapper[5025]: I1007 08:17:59.449421 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:59 crc kubenswrapper[5025]: I1007 08:17:59.449448 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:59 crc kubenswrapper[5025]: I1007 08:17:59.449458 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:59 crc kubenswrapper[5025]: I1007 08:17:59.449473 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:59 crc kubenswrapper[5025]: I1007 08:17:59.449482 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:59Z","lastTransitionTime":"2025-10-07T08:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:59 crc kubenswrapper[5025]: I1007 08:17:59.551713 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:59 crc kubenswrapper[5025]: I1007 08:17:59.551745 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:59 crc kubenswrapper[5025]: I1007 08:17:59.551754 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:59 crc kubenswrapper[5025]: I1007 08:17:59.551768 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:59 crc kubenswrapper[5025]: I1007 08:17:59.551777 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:59Z","lastTransitionTime":"2025-10-07T08:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:59 crc kubenswrapper[5025]: I1007 08:17:59.654082 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:59 crc kubenswrapper[5025]: I1007 08:17:59.654153 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:59 crc kubenswrapper[5025]: I1007 08:17:59.654169 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:59 crc kubenswrapper[5025]: I1007 08:17:59.654190 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:59 crc kubenswrapper[5025]: I1007 08:17:59.654204 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:59Z","lastTransitionTime":"2025-10-07T08:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:59 crc kubenswrapper[5025]: I1007 08:17:59.756304 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:59 crc kubenswrapper[5025]: I1007 08:17:59.756350 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:59 crc kubenswrapper[5025]: I1007 08:17:59.756394 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:59 crc kubenswrapper[5025]: I1007 08:17:59.756438 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:59 crc kubenswrapper[5025]: I1007 08:17:59.756455 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:59Z","lastTransitionTime":"2025-10-07T08:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:17:59 crc kubenswrapper[5025]: I1007 08:17:59.859455 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:59 crc kubenswrapper[5025]: I1007 08:17:59.859497 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:59 crc kubenswrapper[5025]: I1007 08:17:59.859506 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:59 crc kubenswrapper[5025]: I1007 08:17:59.859521 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:59 crc kubenswrapper[5025]: I1007 08:17:59.859534 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:59Z","lastTransitionTime":"2025-10-07T08:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:17:59 crc kubenswrapper[5025]: I1007 08:17:59.914230 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 08:17:59 crc kubenswrapper[5025]: I1007 08:17:59.914312 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:17:59 crc kubenswrapper[5025]: I1007 08:17:59.914256 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:17:59 crc kubenswrapper[5025]: E1007 08:17:59.914435 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 08:17:59 crc kubenswrapper[5025]: E1007 08:17:59.914650 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 08:17:59 crc kubenswrapper[5025]: E1007 08:17:59.914825 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 08:17:59 crc kubenswrapper[5025]: I1007 08:17:59.962926 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:17:59 crc kubenswrapper[5025]: I1007 08:17:59.962977 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:17:59 crc kubenswrapper[5025]: I1007 08:17:59.963031 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:17:59 crc kubenswrapper[5025]: I1007 08:17:59.963053 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:17:59 crc kubenswrapper[5025]: I1007 08:17:59.963070 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:17:59Z","lastTransitionTime":"2025-10-07T08:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:00 crc kubenswrapper[5025]: I1007 08:18:00.066137 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:00 crc kubenswrapper[5025]: I1007 08:18:00.066206 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:00 crc kubenswrapper[5025]: I1007 08:18:00.066227 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:00 crc kubenswrapper[5025]: I1007 08:18:00.066251 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:00 crc kubenswrapper[5025]: I1007 08:18:00.066268 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:00Z","lastTransitionTime":"2025-10-07T08:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:00 crc kubenswrapper[5025]: I1007 08:18:00.169259 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:00 crc kubenswrapper[5025]: I1007 08:18:00.169301 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:00 crc kubenswrapper[5025]: I1007 08:18:00.169310 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:00 crc kubenswrapper[5025]: I1007 08:18:00.169324 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:00 crc kubenswrapper[5025]: I1007 08:18:00.169333 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:00Z","lastTransitionTime":"2025-10-07T08:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:00 crc kubenswrapper[5025]: I1007 08:18:00.272366 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:00 crc kubenswrapper[5025]: I1007 08:18:00.272430 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:00 crc kubenswrapper[5025]: I1007 08:18:00.272440 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:00 crc kubenswrapper[5025]: I1007 08:18:00.272454 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:00 crc kubenswrapper[5025]: I1007 08:18:00.272463 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:00Z","lastTransitionTime":"2025-10-07T08:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:00 crc kubenswrapper[5025]: I1007 08:18:00.375385 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:00 crc kubenswrapper[5025]: I1007 08:18:00.375451 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:00 crc kubenswrapper[5025]: I1007 08:18:00.375473 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:00 crc kubenswrapper[5025]: I1007 08:18:00.375502 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:00 crc kubenswrapper[5025]: I1007 08:18:00.375522 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:00Z","lastTransitionTime":"2025-10-07T08:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:00 crc kubenswrapper[5025]: I1007 08:18:00.478433 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:00 crc kubenswrapper[5025]: I1007 08:18:00.478856 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:00 crc kubenswrapper[5025]: I1007 08:18:00.479001 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:00 crc kubenswrapper[5025]: I1007 08:18:00.479146 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:00 crc kubenswrapper[5025]: I1007 08:18:00.479270 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:00Z","lastTransitionTime":"2025-10-07T08:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:00 crc kubenswrapper[5025]: I1007 08:18:00.582872 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:00 crc kubenswrapper[5025]: I1007 08:18:00.582941 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:00 crc kubenswrapper[5025]: I1007 08:18:00.582967 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:00 crc kubenswrapper[5025]: I1007 08:18:00.582995 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:00 crc kubenswrapper[5025]: I1007 08:18:00.583018 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:00Z","lastTransitionTime":"2025-10-07T08:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:00 crc kubenswrapper[5025]: I1007 08:18:00.685586 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:00 crc kubenswrapper[5025]: I1007 08:18:00.685871 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:00 crc kubenswrapper[5025]: I1007 08:18:00.685945 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:00 crc kubenswrapper[5025]: I1007 08:18:00.686023 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:00 crc kubenswrapper[5025]: I1007 08:18:00.686097 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:00Z","lastTransitionTime":"2025-10-07T08:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:00 crc kubenswrapper[5025]: I1007 08:18:00.788953 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:00 crc kubenswrapper[5025]: I1007 08:18:00.789511 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:00 crc kubenswrapper[5025]: I1007 08:18:00.789661 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:00 crc kubenswrapper[5025]: I1007 08:18:00.789756 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:00 crc kubenswrapper[5025]: I1007 08:18:00.789842 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:00Z","lastTransitionTime":"2025-10-07T08:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:00 crc kubenswrapper[5025]: I1007 08:18:00.892305 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:00 crc kubenswrapper[5025]: I1007 08:18:00.892563 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:00 crc kubenswrapper[5025]: I1007 08:18:00.892643 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:00 crc kubenswrapper[5025]: I1007 08:18:00.892730 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:00 crc kubenswrapper[5025]: I1007 08:18:00.892807 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:00Z","lastTransitionTime":"2025-10-07T08:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:18:00 crc kubenswrapper[5025]: I1007 08:18:00.914106 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4ls7" Oct 07 08:18:00 crc kubenswrapper[5025]: E1007 08:18:00.914272 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f4ls7" podUID="93fdeab4-b5d2-42d8-97ca-d5d61032e19f" Oct 07 08:18:00 crc kubenswrapper[5025]: I1007 08:18:00.995619 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:00 crc kubenswrapper[5025]: I1007 08:18:00.995968 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:00 crc kubenswrapper[5025]: I1007 08:18:00.996076 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:00 crc kubenswrapper[5025]: I1007 08:18:00.996153 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:00 crc kubenswrapper[5025]: I1007 08:18:00.996215 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:00Z","lastTransitionTime":"2025-10-07T08:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.099082 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.099117 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.099126 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.099140 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.099149 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:01Z","lastTransitionTime":"2025-10-07T08:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.201908 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.201947 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.201955 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.201970 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.201981 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:01Z","lastTransitionTime":"2025-10-07T08:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.304831 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.304902 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.304921 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.304945 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.304962 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:01Z","lastTransitionTime":"2025-10-07T08:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.407845 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.407889 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.407900 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.407916 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.407930 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:01Z","lastTransitionTime":"2025-10-07T08:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.510489 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.510535 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.510561 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.510581 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.510592 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:01Z","lastTransitionTime":"2025-10-07T08:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.612750 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.612811 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.612827 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.613212 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.613241 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:01Z","lastTransitionTime":"2025-10-07T08:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.716184 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.716257 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.716324 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.716349 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.716905 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:01Z","lastTransitionTime":"2025-10-07T08:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.820028 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.820095 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.820119 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.820149 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.820173 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:01Z","lastTransitionTime":"2025-10-07T08:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.842400 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.842445 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.842461 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.842483 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.842499 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:01Z","lastTransitionTime":"2025-10-07T08:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:01 crc kubenswrapper[5025]: E1007 08:18:01.857053 5025 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:18:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:18:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:18:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:18:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"15feb688-c1d9-4e7b-b633-9a128b7afc98\\\",\\\"systemUUID\\\":\\\"18315730-39a6-4b53-82b9-587e1e3a7adc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:18:01Z is after 2025-08-24T17:21:41Z" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.861320 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.861348 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.861358 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.861374 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.861388 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:01Z","lastTransitionTime":"2025-10-07T08:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:01 crc kubenswrapper[5025]: E1007 08:18:01.878637 5025 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:18:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:18:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:18:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:18:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"15feb688-c1d9-4e7b-b633-9a128b7afc98\\\",\\\"systemUUID\\\":\\\"18315730-39a6-4b53-82b9-587e1e3a7adc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:18:01Z is after 2025-08-24T17:21:41Z" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.882956 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.883001 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.883012 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.883030 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.883046 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:01Z","lastTransitionTime":"2025-10-07T08:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:01 crc kubenswrapper[5025]: E1007 08:18:01.895643 5025 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:18:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:18:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:18:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:18:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"15feb688-c1d9-4e7b-b633-9a128b7afc98\\\",\\\"systemUUID\\\":\\\"18315730-39a6-4b53-82b9-587e1e3a7adc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:18:01Z is after 2025-08-24T17:21:41Z" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.899732 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.899765 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.899773 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.899786 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.899798 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:01Z","lastTransitionTime":"2025-10-07T08:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:01 crc kubenswrapper[5025]: E1007 08:18:01.911507 5025 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:18:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:18:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:18:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:18:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"15feb688-c1d9-4e7b-b633-9a128b7afc98\\\",\\\"systemUUID\\\":\\\"18315730-39a6-4b53-82b9-587e1e3a7adc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:18:01Z is after 2025-08-24T17:21:41Z" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.913573 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.913645 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:18:01 crc kubenswrapper[5025]: E1007 08:18:01.913673 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.913739 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:18:01 crc kubenswrapper[5025]: E1007 08:18:01.913775 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 08:18:01 crc kubenswrapper[5025]: E1007 08:18:01.913815 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.915426 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.915452 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.915461 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.915474 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.915483 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:01Z","lastTransitionTime":"2025-10-07T08:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:01 crc kubenswrapper[5025]: E1007 08:18:01.927367 5025 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:18:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:18:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:18:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T08:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T08:18:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"15feb688-c1d9-4e7b-b633-9a128b7afc98\\\",\\\"systemUUID\\\":\\\"18315730-39a6-4b53-82b9-587e1e3a7adc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:18:01Z is after 2025-08-24T17:21:41Z" Oct 07 08:18:01 crc kubenswrapper[5025]: E1007 08:18:01.927480 5025 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.929055 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.929088 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.929099 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.929114 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:01 crc kubenswrapper[5025]: I1007 08:18:01.929124 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:01Z","lastTransitionTime":"2025-10-07T08:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:02 crc kubenswrapper[5025]: I1007 08:18:02.030927 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:02 crc kubenswrapper[5025]: I1007 08:18:02.030966 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:02 crc kubenswrapper[5025]: I1007 08:18:02.030977 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:02 crc kubenswrapper[5025]: I1007 08:18:02.030991 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:02 crc kubenswrapper[5025]: I1007 08:18:02.031002 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:02Z","lastTransitionTime":"2025-10-07T08:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:02 crc kubenswrapper[5025]: I1007 08:18:02.133383 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:02 crc kubenswrapper[5025]: I1007 08:18:02.133418 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:02 crc kubenswrapper[5025]: I1007 08:18:02.133427 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:02 crc kubenswrapper[5025]: I1007 08:18:02.133443 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:02 crc kubenswrapper[5025]: I1007 08:18:02.133453 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:02Z","lastTransitionTime":"2025-10-07T08:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:02 crc kubenswrapper[5025]: I1007 08:18:02.236001 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:02 crc kubenswrapper[5025]: I1007 08:18:02.236039 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:02 crc kubenswrapper[5025]: I1007 08:18:02.236048 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:02 crc kubenswrapper[5025]: I1007 08:18:02.236061 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:02 crc kubenswrapper[5025]: I1007 08:18:02.236079 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:02Z","lastTransitionTime":"2025-10-07T08:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:02 crc kubenswrapper[5025]: I1007 08:18:02.339040 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:02 crc kubenswrapper[5025]: I1007 08:18:02.339088 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:02 crc kubenswrapper[5025]: I1007 08:18:02.339107 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:02 crc kubenswrapper[5025]: I1007 08:18:02.339131 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:02 crc kubenswrapper[5025]: I1007 08:18:02.339147 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:02Z","lastTransitionTime":"2025-10-07T08:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:02 crc kubenswrapper[5025]: I1007 08:18:02.441633 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:02 crc kubenswrapper[5025]: I1007 08:18:02.442365 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:02 crc kubenswrapper[5025]: I1007 08:18:02.442455 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:02 crc kubenswrapper[5025]: I1007 08:18:02.442500 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:02 crc kubenswrapper[5025]: I1007 08:18:02.442524 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:02Z","lastTransitionTime":"2025-10-07T08:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:02 crc kubenswrapper[5025]: I1007 08:18:02.545985 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:02 crc kubenswrapper[5025]: I1007 08:18:02.546066 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:02 crc kubenswrapper[5025]: I1007 08:18:02.546094 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:02 crc kubenswrapper[5025]: I1007 08:18:02.546124 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:02 crc kubenswrapper[5025]: I1007 08:18:02.546149 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:02Z","lastTransitionTime":"2025-10-07T08:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:02 crc kubenswrapper[5025]: I1007 08:18:02.648621 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:02 crc kubenswrapper[5025]: I1007 08:18:02.648690 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:02 crc kubenswrapper[5025]: I1007 08:18:02.648711 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:02 crc kubenswrapper[5025]: I1007 08:18:02.648739 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:02 crc kubenswrapper[5025]: I1007 08:18:02.648760 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:02Z","lastTransitionTime":"2025-10-07T08:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:02 crc kubenswrapper[5025]: I1007 08:18:02.751531 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:02 crc kubenswrapper[5025]: I1007 08:18:02.751573 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:02 crc kubenswrapper[5025]: I1007 08:18:02.751581 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:02 crc kubenswrapper[5025]: I1007 08:18:02.751595 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:02 crc kubenswrapper[5025]: I1007 08:18:02.751605 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:02Z","lastTransitionTime":"2025-10-07T08:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:02 crc kubenswrapper[5025]: I1007 08:18:02.853582 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:02 crc kubenswrapper[5025]: I1007 08:18:02.853645 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:02 crc kubenswrapper[5025]: I1007 08:18:02.853666 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:02 crc kubenswrapper[5025]: I1007 08:18:02.853695 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:02 crc kubenswrapper[5025]: I1007 08:18:02.853720 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:02Z","lastTransitionTime":"2025-10-07T08:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:18:02 crc kubenswrapper[5025]: I1007 08:18:02.913503 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4ls7" Oct 07 08:18:02 crc kubenswrapper[5025]: E1007 08:18:02.913757 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f4ls7" podUID="93fdeab4-b5d2-42d8-97ca-d5d61032e19f" Oct 07 08:18:02 crc kubenswrapper[5025]: I1007 08:18:02.955872 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:02 crc kubenswrapper[5025]: I1007 08:18:02.955912 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:02 crc kubenswrapper[5025]: I1007 08:18:02.955921 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:02 crc kubenswrapper[5025]: I1007 08:18:02.955934 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:02 crc kubenswrapper[5025]: I1007 08:18:02.955942 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:02Z","lastTransitionTime":"2025-10-07T08:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.063020 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.063126 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.063149 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.063180 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.063201 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:03Z","lastTransitionTime":"2025-10-07T08:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.166117 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.166180 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.166194 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.166220 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.166235 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:03Z","lastTransitionTime":"2025-10-07T08:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.269062 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.269148 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.269159 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.269179 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.269192 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:03Z","lastTransitionTime":"2025-10-07T08:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.372310 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.372392 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.372400 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.372432 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.372444 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:03Z","lastTransitionTime":"2025-10-07T08:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.475882 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.475934 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.475945 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.475957 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.475965 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:03Z","lastTransitionTime":"2025-10-07T08:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.578106 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.578159 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.578169 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.578182 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.578191 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:03Z","lastTransitionTime":"2025-10-07T08:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.681361 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.681463 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.681491 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.681612 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.681640 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:03Z","lastTransitionTime":"2025-10-07T08:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.784177 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.784255 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.784279 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.784303 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.784319 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:03Z","lastTransitionTime":"2025-10-07T08:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.886849 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.886910 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.886931 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.886961 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.886983 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:03Z","lastTransitionTime":"2025-10-07T08:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.914620 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.914670 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.914718 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:18:03 crc kubenswrapper[5025]: E1007 08:18:03.914847 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 08:18:03 crc kubenswrapper[5025]: E1007 08:18:03.914954 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 08:18:03 crc kubenswrapper[5025]: E1007 08:18:03.915051 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.938244 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l2k8t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b0ce981-98f4-4e27-9f4d-f9905b78ec9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab742e4ca12e3a7400e7592d95be4f82046c78d1056d6e1f06d261704286f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8dd37c41514ef90d10efe3373eaca405ab943b5f02ad1424401bf3e41f79af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://147c4016694bc2349dff82262df83bed42950e393eec9a10b1bab4a7b7b6f18c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ff71378d2db74c544393a73252b6ce86950fe8bb0f7c3e472d4011c9259495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os
-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f66d58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f66d58d2e7ec30eb23f53311462ac8619f9ca0fd73fe24c8c89159326d5a6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},
\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4953481d1a9992aca70b27531de2ceda6229672aa82b62e556d13ecd59a92f31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fff0f5b081b5a61c30757d6065a06224f7edcd0adfdab4ca6abfc6ea207a660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fff0f5b081b5a61c30757d6065a06224f7edcd0adfdab4ca6abfc6ea207a660\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:17:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:17:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnm4d\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l2k8t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:18:03Z is after 2025-08-24T17:21:41Z" Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.961508 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90a24131-77fe-4045-9cea-eebd7e122243\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efea2132bc68971348829ebb4222793ae3ccf57d5eeee316aa81292cbb051594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd1c01fb63c054d4e9e2b12ab9d335d5c6a79c3c62c96bcf3bca6a5f9a4b750\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ce5e9618132944289f6a22c7aeaee88704f6352fe3461c55de21681f406662\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":
[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://167a11cb83ad658e06fbfebaa8e1675cec48026b094037f552cb4b86f3fe364e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b579d694e2e51fb1d49b8d690d572518d7fcd3476d7e6941323b70987b8edfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1007 08:16:47.574279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 08:16:47.577103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2852886688/tls.crt::/tmp/serving-cert-2852886688/tls.key\\\\\\\"\\\\nI1007 08:16:53.183291 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 08:16:53.189611 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 08:16:53.189643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 08:16:53.189670 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 08:16:53.189676 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 08:16:53.204072 1 secure_serving.go:57] Forcing 
use of http/1.1 only\\\\nW1007 08:16:53.204130 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204143 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 08:16:53.204157 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 08:16:53.204166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 08:16:53.204173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 08:16:53.204182 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 08:16:53.204715 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 08:16:53.207373 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52afc5d765ed4544c55e6e522b190cb1c6788fc270ef55c8242f42b63c4d2702\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ffc828d4a5f589ded64b4ad251fce297e58dd224b8cdb5927e39370c816fd40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:18:03Z is after 2025-08-24T17:21:41Z" Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.981893 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:18:03Z is after 2025-08-24T17:21:41Z" Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.989478 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.989524 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.989562 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.989583 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.989597 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:03Z","lastTransitionTime":"2025-10-07T08:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:18:03 crc kubenswrapper[5025]: I1007 08:18:03.999256 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e04a2eadbde2c6c8df1913f15a1efbebef80b3e34c256f7ff9249479f19eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:18:03Z is after 2025-08-24T17:21:41Z" Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.014415 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61571dd78435a0514b8674028c7c467678e6ee4d1c6b334a98c6d1ddbf4871d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:18:04Z is after 2025-08-24T17:21:41Z" Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.028205 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hc88w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2744919e-82fb-4c3c-8776-c2c9c44af6e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afa525d43c3e5aafdbdf20fdac7eb16579649323ad50ae564f
0266fb2e69ffd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swvcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hc88w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:18:04Z is after 2025-08-24T17:21:41Z" Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.045700 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4849c41-22e1-400e-8e11-096da49ef1b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc23369548789187b7057cd9780703dbc892dcf8aaaac969f6049a15ab0656e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54b2a654066f14b0433198403ba42b83c61922de
1055c94ba1d38aad9e0c2821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pbth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2dj2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:18:04Z is after 2025-08-24T17:21:41Z" Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.062111 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jf557" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea42823f-6b19-439f-a280-80e5b1a816c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e6ec573b8a9fb1fd6ab7a370488b1b68024a033b0cfc728cf7e8288aacb4c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqrdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jf557\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:18:04Z is after 2025-08-24T17:21:41Z" Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.075997 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w8khj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee70c47a-d783-4a9f-8e94-5fc68eda69fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1540ed9540ce18c4ad3c5c7d82a3916528d100cc9d7ec1ec5758511f03e6f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ckns7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2f25f9baf4b018c73a5655ca3e595d123cd0ffaae9060ea0a34148711b1be0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ckns7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:17:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-w8khj\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:18:04Z is after 2025-08-24T17:21:41Z" Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.092436 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.092475 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.092484 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.092498 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.092509 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:04Z","lastTransitionTime":"2025-10-07T08:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.106783 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81bec5eb-5d25-48ec-a17a-31d0c3e4778d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1001623b4801160d726efa289ef774cfd923daa67eb0bb261e40cfde1064cda7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c65382b2ab640a58d09f36e6579a7e94bae0b9f16a675d66740226aaf2cc31b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://850f4333eed8df71ad0a544dddc6f27c300e530b4bce84dbba7287280d4bde4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af26657a576dc304410891b4adb5fb9b9e1fa162746eb718e94041fd3708e1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc6bc80726e8911355757ec683507663b96ba7a2d4f5c5f57d9f7b3292dd91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5097cc8364707dbab8b1e64c98e38f6ca4d2ca6b09bc40b340c95a78d575d107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5097cc8364707dbab8b1e64c98e38f6ca4d2ca6b09bc40b340c95a78d575d107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee92939de135b9c7694cfc5a131e2be7a644b4388051f786fef00c10a285b317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee92939de135b9c7694cfc5a131e2be7a644b4388051f786fef00c10a285b317\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://660f67e0730fe6a0c6a2639e066ce18a9e221ba70404893763d0e60faf147b10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://660f67e0730fe6a0c6a2639e066ce18a9e221ba70404893763d0e60faf147b10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-07T08:16:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:18:04Z is after 2025-08-24T17:21:41Z" Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.123624 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffe327a6-74bb-4da3-b864-93cc954bde0e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbd9a4d40a19f7571919228601897e666db895791868d65b27e90588add3daee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://919574d16a08760217aa219e673b02a059d96379aed81ad6248fb0623f672d06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd4a8d9d14761162acc1940fccd21269820427fce37c520cd9e5d6c0b7279e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d45b57ffbb81dd048d051a94a5c7371321a120b93c8b2b47643ff33acb73a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9d45b57ffbb81dd048d051a94a5c7371321a120b93c8b2b47643ff33acb73a79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:18:04Z is after 2025-08-24T17:21:41Z" Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.137997 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-f4ls7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93fdeab4-b5d2-42d8-97ca-d5d61032e19f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pfw77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:17:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-f4ls7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:18:04Z is after 2025-08-24T17:21:41Z" Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.153472 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:18:04Z is after 2025-08-24T17:21:41Z" Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.169079 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmhw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34b07a69-1bbf-4019-b824-7b5be0f9404d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79252af7f0e70293c6c57e8731796bdf2108b3074f9cc3a5627c41eb246032c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aee164f46abc9772ed12e1ba5e768ee44e25cc7444a58000075819f45cb4ba11\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T08:17:42Z\\\",\\\"message\\\":\\\"2025-10-07T08:16:57+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7e58277b-eefe-4835-9028-55848507956b\\\\n2025-10-07T08:16:57+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7e58277b-eefe-4835-9028-55848507956b to /host/opt/cni/bin/\\\\n2025-10-07T08:16:57Z [verbose] multus-daemon started\\\\n2025-10-07T08:16:57Z [verbose] 
Readiness Indicator file check\\\\n2025-10-07T08:17:42Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gx7qx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:18:04Z is after 2025-08-24T17:21:41Z" Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.182720 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2383c50-e82f-4445-8d28-ec90e8d95c47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f31fa57b4eaaccec309a01396ec164dc6bda3e12a1086eda7a2d0eb3233eb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6e7353bb02996d8525befd61dad46a459fca7713e536cec9b1fb9f47d99d08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ceb048af5928a4b50cec5e93738471dd356f20cf57363fe085df0bb4216100f3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://848397bbf66b4edb13083c81defed30b9ddaff1a1636e2f1fe09b25c8ff99d96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:18:04Z is after 2025-08-24T17:21:41Z" Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.194836 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.194900 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.194918 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.194943 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.194974 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:04Z","lastTransitionTime":"2025-10-07T08:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.200206 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:18:04Z is after 2025-08-24T17:21:41Z" Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.228286 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://602fe6537fb9b526977c2d6ae30458382313c2b5f1506e6b7ffe684ffe5be2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d54834b1ba5650aa397affa83e56e871bbbea4b8c383627ae00bea648595864f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aca416ed3fc89df9948a2f7b02101dfb3bae8d8b22b7c4d270559a4cb6895dc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aca8d824b0935ba844137095ac849a38a15dc5c0dc808fae28afbff543f6429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e826406584133efe2a695c4c7ff16dc6919a1ba0cda077273f6e1eafbf0f5c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81fc1872da168050a2483e46a2ce55f2f745eca15c901a77783a830609f6c0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49dfbab16b681fd6273294b7ececcd31c7feb0d79b66129ceb05740c3a478abe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49dfbab16b681fd6273294b7ececcd31c7feb0d79b66129ceb05740c3a478abe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T08:17:49Z\\\",\\\"message\\\":\\\" handler 8\\\\nI1007 08:17:49.960697 7095 handler.go:208] Removed *v1.Pod event handler 
6\\\\nI1007 08:17:49.960729 7095 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1007 08:17:49.961099 7095 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1007 08:17:49.961167 7095 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1007 08:17:49.961243 7095 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1007 08:17:49.961276 7095 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1007 08:17:49.961287 7095 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1007 08:17:49.961312 7095 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1007 08:17:49.961334 7095 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1007 08:17:49.961392 7095 factory.go:656] Stopping watch factory\\\\nI1007 08:17:49.961450 7095 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 08:17:49.961450 7095 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1007 08:17:49.961456 7095 ovnkube.go:599] Stopped ovnkube\\\\nI1007 08:17:49.961480 7095 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1007 08:17:49.961518 7095 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1007 08:17:49.961651 7095 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T08:17:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dwm22_openshift-ovn-kubernetes(8b6b9c75-ecfe-4815-b279-bb56f57a82a8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26e633e0a41a5f44c577b8e1a9494d58bb84b49015a7612684fc81e5872fd20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbf54412bb08ccb9b9
e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpm25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dwm22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:18:04Z is after 2025-08-24T17:21:41Z" Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.243259 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42b679f5-3b2e-42bf-9dde-970af25b4297\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a35a698f6fc982aef52becc7aaf63bb0cca25bf9dab6f23bcbd51712b3dbc89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e520023c7c07f8525b1ac012f6ac7e385413ecd13b4db61a52862dfe77349140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e520023c7c07f8525b1ac012f6ac7e385413ecd13b4db61a52862dfe77349140\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T08:16:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T08:16:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T08:16:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:18:04Z is after 2025-08-24T17:21:41Z" Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.260645 5025 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T08:16:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://473d5cdd405367504396c404f60bbdecda93dbe6edbc8ddbfc18bbcc2fdc4d2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f423a0c357e0279ca41d112d6091586ed91d3826f4ae250ec2aa4973554968\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T08:16:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T08:18:04Z is after 2025-08-24T17:21:41Z" Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.298750 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.298803 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.298821 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.298844 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.298861 5025 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:04Z","lastTransitionTime":"2025-10-07T08:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.401286 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.401315 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.401325 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.401339 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.401350 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:04Z","lastTransitionTime":"2025-10-07T08:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.504409 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.504461 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.504478 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.504503 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.504522 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:04Z","lastTransitionTime":"2025-10-07T08:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.607723 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.607786 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.607805 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.607829 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.607848 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:04Z","lastTransitionTime":"2025-10-07T08:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.711407 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.711445 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.711455 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.711470 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.711480 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:04Z","lastTransitionTime":"2025-10-07T08:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.815011 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.815070 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.815087 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.815112 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.815131 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:04Z","lastTransitionTime":"2025-10-07T08:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.914504 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4ls7" Oct 07 08:18:04 crc kubenswrapper[5025]: E1007 08:18:04.914885 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f4ls7" podUID="93fdeab4-b5d2-42d8-97ca-d5d61032e19f" Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.917215 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.917241 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.917249 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.917261 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:04 crc kubenswrapper[5025]: I1007 08:18:04.917272 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:04Z","lastTransitionTime":"2025-10-07T08:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:05 crc kubenswrapper[5025]: I1007 08:18:05.021039 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:05 crc kubenswrapper[5025]: I1007 08:18:05.021095 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:05 crc kubenswrapper[5025]: I1007 08:18:05.021106 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:05 crc kubenswrapper[5025]: I1007 08:18:05.021126 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:05 crc kubenswrapper[5025]: I1007 08:18:05.021136 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:05Z","lastTransitionTime":"2025-10-07T08:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:05 crc kubenswrapper[5025]: I1007 08:18:05.123442 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:05 crc kubenswrapper[5025]: I1007 08:18:05.123478 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:05 crc kubenswrapper[5025]: I1007 08:18:05.123489 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:05 crc kubenswrapper[5025]: I1007 08:18:05.123503 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:05 crc kubenswrapper[5025]: I1007 08:18:05.123515 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:05Z","lastTransitionTime":"2025-10-07T08:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:05 crc kubenswrapper[5025]: I1007 08:18:05.226458 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:05 crc kubenswrapper[5025]: I1007 08:18:05.226491 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:05 crc kubenswrapper[5025]: I1007 08:18:05.226501 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:05 crc kubenswrapper[5025]: I1007 08:18:05.226517 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:05 crc kubenswrapper[5025]: I1007 08:18:05.226528 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:05Z","lastTransitionTime":"2025-10-07T08:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:05 crc kubenswrapper[5025]: I1007 08:18:05.329682 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:05 crc kubenswrapper[5025]: I1007 08:18:05.329754 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:05 crc kubenswrapper[5025]: I1007 08:18:05.329774 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:05 crc kubenswrapper[5025]: I1007 08:18:05.329798 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:05 crc kubenswrapper[5025]: I1007 08:18:05.329816 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:05Z","lastTransitionTime":"2025-10-07T08:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:05 crc kubenswrapper[5025]: I1007 08:18:05.432156 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:05 crc kubenswrapper[5025]: I1007 08:18:05.432192 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:05 crc kubenswrapper[5025]: I1007 08:18:05.432203 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:05 crc kubenswrapper[5025]: I1007 08:18:05.432217 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:05 crc kubenswrapper[5025]: I1007 08:18:05.432227 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:05Z","lastTransitionTime":"2025-10-07T08:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:05 crc kubenswrapper[5025]: I1007 08:18:05.534632 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:05 crc kubenswrapper[5025]: I1007 08:18:05.534689 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:05 crc kubenswrapper[5025]: I1007 08:18:05.534701 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:05 crc kubenswrapper[5025]: I1007 08:18:05.534718 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:05 crc kubenswrapper[5025]: I1007 08:18:05.534729 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:05Z","lastTransitionTime":"2025-10-07T08:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:05 crc kubenswrapper[5025]: I1007 08:18:05.637991 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:05 crc kubenswrapper[5025]: I1007 08:18:05.638479 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:05 crc kubenswrapper[5025]: I1007 08:18:05.638510 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:05 crc kubenswrapper[5025]: I1007 08:18:05.638586 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:05 crc kubenswrapper[5025]: I1007 08:18:05.638607 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:05Z","lastTransitionTime":"2025-10-07T08:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:05 crc kubenswrapper[5025]: I1007 08:18:05.742916 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:05 crc kubenswrapper[5025]: I1007 08:18:05.742971 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:05 crc kubenswrapper[5025]: I1007 08:18:05.742996 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:05 crc kubenswrapper[5025]: I1007 08:18:05.743028 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:05 crc kubenswrapper[5025]: I1007 08:18:05.743048 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:05Z","lastTransitionTime":"2025-10-07T08:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:05 crc kubenswrapper[5025]: I1007 08:18:05.846376 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:05 crc kubenswrapper[5025]: I1007 08:18:05.846430 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:05 crc kubenswrapper[5025]: I1007 08:18:05.846447 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:05 crc kubenswrapper[5025]: I1007 08:18:05.846477 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:05 crc kubenswrapper[5025]: I1007 08:18:05.846498 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:05Z","lastTransitionTime":"2025-10-07T08:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:18:05 crc kubenswrapper[5025]: I1007 08:18:05.913864 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:18:05 crc kubenswrapper[5025]: I1007 08:18:05.914103 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:18:05 crc kubenswrapper[5025]: E1007 08:18:05.914310 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 08:18:05 crc kubenswrapper[5025]: I1007 08:18:05.914377 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 08:18:05 crc kubenswrapper[5025]: E1007 08:18:05.915179 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 08:18:05 crc kubenswrapper[5025]: E1007 08:18:05.915367 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 08:18:05 crc kubenswrapper[5025]: I1007 08:18:05.916110 5025 scope.go:117] "RemoveContainer" containerID="49dfbab16b681fd6273294b7ececcd31c7feb0d79b66129ceb05740c3a478abe" Oct 07 08:18:05 crc kubenswrapper[5025]: E1007 08:18:05.916583 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-dwm22_openshift-ovn-kubernetes(8b6b9c75-ecfe-4815-b279-bb56f57a82a8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" Oct 07 08:18:05 crc kubenswrapper[5025]: I1007 08:18:05.950419 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:05 crc kubenswrapper[5025]: I1007 08:18:05.950492 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:05 crc kubenswrapper[5025]: I1007 08:18:05.950511 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:05 crc kubenswrapper[5025]: I1007 08:18:05.950565 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:05 crc kubenswrapper[5025]: I1007 08:18:05.950593 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:05Z","lastTransitionTime":"2025-10-07T08:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:06 crc kubenswrapper[5025]: I1007 08:18:06.054052 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:06 crc kubenswrapper[5025]: I1007 08:18:06.054137 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:06 crc kubenswrapper[5025]: I1007 08:18:06.054162 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:06 crc kubenswrapper[5025]: I1007 08:18:06.054199 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:06 crc kubenswrapper[5025]: I1007 08:18:06.054227 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:06Z","lastTransitionTime":"2025-10-07T08:18:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:06 crc kubenswrapper[5025]: I1007 08:18:06.157470 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:06 crc kubenswrapper[5025]: I1007 08:18:06.157513 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:06 crc kubenswrapper[5025]: I1007 08:18:06.157524 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:06 crc kubenswrapper[5025]: I1007 08:18:06.157540 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:06 crc kubenswrapper[5025]: I1007 08:18:06.157570 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:06Z","lastTransitionTime":"2025-10-07T08:18:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:06 crc kubenswrapper[5025]: I1007 08:18:06.261030 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:06 crc kubenswrapper[5025]: I1007 08:18:06.261072 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:06 crc kubenswrapper[5025]: I1007 08:18:06.261083 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:06 crc kubenswrapper[5025]: I1007 08:18:06.261100 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:06 crc kubenswrapper[5025]: I1007 08:18:06.261111 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:06Z","lastTransitionTime":"2025-10-07T08:18:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:06 crc kubenswrapper[5025]: I1007 08:18:06.364253 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:06 crc kubenswrapper[5025]: I1007 08:18:06.364382 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:06 crc kubenswrapper[5025]: I1007 08:18:06.364407 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:06 crc kubenswrapper[5025]: I1007 08:18:06.364477 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:06 crc kubenswrapper[5025]: I1007 08:18:06.364505 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:06Z","lastTransitionTime":"2025-10-07T08:18:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:06 crc kubenswrapper[5025]: I1007 08:18:06.467946 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:06 crc kubenswrapper[5025]: I1007 08:18:06.468058 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:06 crc kubenswrapper[5025]: I1007 08:18:06.468078 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:06 crc kubenswrapper[5025]: I1007 08:18:06.468110 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:06 crc kubenswrapper[5025]: I1007 08:18:06.468134 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:06Z","lastTransitionTime":"2025-10-07T08:18:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:06 crc kubenswrapper[5025]: I1007 08:18:06.570636 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:06 crc kubenswrapper[5025]: I1007 08:18:06.570698 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:06 crc kubenswrapper[5025]: I1007 08:18:06.570717 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:06 crc kubenswrapper[5025]: I1007 08:18:06.570740 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:06 crc kubenswrapper[5025]: I1007 08:18:06.570755 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:06Z","lastTransitionTime":"2025-10-07T08:18:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:06 crc kubenswrapper[5025]: I1007 08:18:06.674327 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:06 crc kubenswrapper[5025]: I1007 08:18:06.674373 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:06 crc kubenswrapper[5025]: I1007 08:18:06.674383 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:06 crc kubenswrapper[5025]: I1007 08:18:06.674400 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:06 crc kubenswrapper[5025]: I1007 08:18:06.674414 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:06Z","lastTransitionTime":"2025-10-07T08:18:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:06 crc kubenswrapper[5025]: I1007 08:18:06.778394 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:06 crc kubenswrapper[5025]: I1007 08:18:06.778471 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:06 crc kubenswrapper[5025]: I1007 08:18:06.778491 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:06 crc kubenswrapper[5025]: I1007 08:18:06.778520 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:06 crc kubenswrapper[5025]: I1007 08:18:06.778567 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:06Z","lastTransitionTime":"2025-10-07T08:18:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:06 crc kubenswrapper[5025]: I1007 08:18:06.882060 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:06 crc kubenswrapper[5025]: I1007 08:18:06.882114 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:06 crc kubenswrapper[5025]: I1007 08:18:06.882129 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:06 crc kubenswrapper[5025]: I1007 08:18:06.882151 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:06 crc kubenswrapper[5025]: I1007 08:18:06.882167 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:06Z","lastTransitionTime":"2025-10-07T08:18:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:18:06 crc kubenswrapper[5025]: I1007 08:18:06.914169 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4ls7" Oct 07 08:18:06 crc kubenswrapper[5025]: E1007 08:18:06.914439 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f4ls7" podUID="93fdeab4-b5d2-42d8-97ca-d5d61032e19f" Oct 07 08:18:06 crc kubenswrapper[5025]: I1007 08:18:06.984656 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:06 crc kubenswrapper[5025]: I1007 08:18:06.984702 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:06 crc kubenswrapper[5025]: I1007 08:18:06.984715 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:06 crc kubenswrapper[5025]: I1007 08:18:06.984732 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:06 crc kubenswrapper[5025]: I1007 08:18:06.984743 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:06Z","lastTransitionTime":"2025-10-07T08:18:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:07 crc kubenswrapper[5025]: I1007 08:18:07.088336 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:07 crc kubenswrapper[5025]: I1007 08:18:07.088436 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:07 crc kubenswrapper[5025]: I1007 08:18:07.088466 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:07 crc kubenswrapper[5025]: I1007 08:18:07.088506 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:07 crc kubenswrapper[5025]: I1007 08:18:07.088535 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:07Z","lastTransitionTime":"2025-10-07T08:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:07 crc kubenswrapper[5025]: I1007 08:18:07.191658 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:07 crc kubenswrapper[5025]: I1007 08:18:07.191720 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:07 crc kubenswrapper[5025]: I1007 08:18:07.191740 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:07 crc kubenswrapper[5025]: I1007 08:18:07.191764 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:07 crc kubenswrapper[5025]: I1007 08:18:07.191783 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:07Z","lastTransitionTime":"2025-10-07T08:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:07 crc kubenswrapper[5025]: I1007 08:18:07.295127 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:07 crc kubenswrapper[5025]: I1007 08:18:07.295584 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:07 crc kubenswrapper[5025]: I1007 08:18:07.295615 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:07 crc kubenswrapper[5025]: I1007 08:18:07.295646 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:07 crc kubenswrapper[5025]: I1007 08:18:07.295671 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:07Z","lastTransitionTime":"2025-10-07T08:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:07 crc kubenswrapper[5025]: I1007 08:18:07.399028 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:07 crc kubenswrapper[5025]: I1007 08:18:07.399101 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:07 crc kubenswrapper[5025]: I1007 08:18:07.399125 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:07 crc kubenswrapper[5025]: I1007 08:18:07.399156 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:07 crc kubenswrapper[5025]: I1007 08:18:07.399179 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:07Z","lastTransitionTime":"2025-10-07T08:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:07 crc kubenswrapper[5025]: I1007 08:18:07.502039 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:07 crc kubenswrapper[5025]: I1007 08:18:07.502112 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:07 crc kubenswrapper[5025]: I1007 08:18:07.502139 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:07 crc kubenswrapper[5025]: I1007 08:18:07.502171 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:07 crc kubenswrapper[5025]: I1007 08:18:07.502193 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:07Z","lastTransitionTime":"2025-10-07T08:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:07 crc kubenswrapper[5025]: I1007 08:18:07.605946 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:07 crc kubenswrapper[5025]: I1007 08:18:07.606028 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:07 crc kubenswrapper[5025]: I1007 08:18:07.606054 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:07 crc kubenswrapper[5025]: I1007 08:18:07.606083 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:07 crc kubenswrapper[5025]: I1007 08:18:07.606101 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:07Z","lastTransitionTime":"2025-10-07T08:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:07 crc kubenswrapper[5025]: I1007 08:18:07.710035 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:07 crc kubenswrapper[5025]: I1007 08:18:07.710116 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:07 crc kubenswrapper[5025]: I1007 08:18:07.710136 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:07 crc kubenswrapper[5025]: I1007 08:18:07.710161 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:07 crc kubenswrapper[5025]: I1007 08:18:07.710178 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:07Z","lastTransitionTime":"2025-10-07T08:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:07 crc kubenswrapper[5025]: I1007 08:18:07.813241 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:07 crc kubenswrapper[5025]: I1007 08:18:07.813305 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:07 crc kubenswrapper[5025]: I1007 08:18:07.813317 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:07 crc kubenswrapper[5025]: I1007 08:18:07.813335 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:07 crc kubenswrapper[5025]: I1007 08:18:07.813347 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:07Z","lastTransitionTime":"2025-10-07T08:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:18:07 crc kubenswrapper[5025]: I1007 08:18:07.913983 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:18:07 crc kubenswrapper[5025]: I1007 08:18:07.914074 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:18:07 crc kubenswrapper[5025]: I1007 08:18:07.913983 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 08:18:07 crc kubenswrapper[5025]: E1007 08:18:07.914397 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 08:18:07 crc kubenswrapper[5025]: E1007 08:18:07.914777 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 08:18:07 crc kubenswrapper[5025]: E1007 08:18:07.914944 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 08:18:07 crc kubenswrapper[5025]: I1007 08:18:07.916659 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:07 crc kubenswrapper[5025]: I1007 08:18:07.916726 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:07 crc kubenswrapper[5025]: I1007 08:18:07.916744 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:07 crc kubenswrapper[5025]: I1007 08:18:07.916770 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:07 crc kubenswrapper[5025]: I1007 08:18:07.916791 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:07Z","lastTransitionTime":"2025-10-07T08:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:08 crc kubenswrapper[5025]: I1007 08:18:08.020429 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:08 crc kubenswrapper[5025]: I1007 08:18:08.020479 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:08 crc kubenswrapper[5025]: I1007 08:18:08.020492 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:08 crc kubenswrapper[5025]: I1007 08:18:08.020510 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:08 crc kubenswrapper[5025]: I1007 08:18:08.020521 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:08Z","lastTransitionTime":"2025-10-07T08:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:08 crc kubenswrapper[5025]: I1007 08:18:08.123304 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:08 crc kubenswrapper[5025]: I1007 08:18:08.123348 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:08 crc kubenswrapper[5025]: I1007 08:18:08.123359 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:08 crc kubenswrapper[5025]: I1007 08:18:08.123374 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:08 crc kubenswrapper[5025]: I1007 08:18:08.123384 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:08Z","lastTransitionTime":"2025-10-07T08:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:08 crc kubenswrapper[5025]: I1007 08:18:08.226822 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:08 crc kubenswrapper[5025]: I1007 08:18:08.226867 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:08 crc kubenswrapper[5025]: I1007 08:18:08.226875 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:08 crc kubenswrapper[5025]: I1007 08:18:08.226891 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:08 crc kubenswrapper[5025]: I1007 08:18:08.226901 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:08Z","lastTransitionTime":"2025-10-07T08:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:08 crc kubenswrapper[5025]: I1007 08:18:08.329618 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:08 crc kubenswrapper[5025]: I1007 08:18:08.329658 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:08 crc kubenswrapper[5025]: I1007 08:18:08.329667 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:08 crc kubenswrapper[5025]: I1007 08:18:08.329682 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:08 crc kubenswrapper[5025]: I1007 08:18:08.329694 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:08Z","lastTransitionTime":"2025-10-07T08:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:08 crc kubenswrapper[5025]: I1007 08:18:08.432451 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:08 crc kubenswrapper[5025]: I1007 08:18:08.432518 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:08 crc kubenswrapper[5025]: I1007 08:18:08.432594 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:08 crc kubenswrapper[5025]: I1007 08:18:08.432625 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:08 crc kubenswrapper[5025]: I1007 08:18:08.432645 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:08Z","lastTransitionTime":"2025-10-07T08:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:08 crc kubenswrapper[5025]: I1007 08:18:08.535886 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:08 crc kubenswrapper[5025]: I1007 08:18:08.535927 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:08 crc kubenswrapper[5025]: I1007 08:18:08.535939 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:08 crc kubenswrapper[5025]: I1007 08:18:08.535953 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:08 crc kubenswrapper[5025]: I1007 08:18:08.535962 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:08Z","lastTransitionTime":"2025-10-07T08:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:08 crc kubenswrapper[5025]: I1007 08:18:08.639463 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:08 crc kubenswrapper[5025]: I1007 08:18:08.639521 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:08 crc kubenswrapper[5025]: I1007 08:18:08.639536 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:08 crc kubenswrapper[5025]: I1007 08:18:08.639585 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:08 crc kubenswrapper[5025]: I1007 08:18:08.639596 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:08Z","lastTransitionTime":"2025-10-07T08:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:08 crc kubenswrapper[5025]: I1007 08:18:08.742291 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:08 crc kubenswrapper[5025]: I1007 08:18:08.742326 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:08 crc kubenswrapper[5025]: I1007 08:18:08.742336 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:08 crc kubenswrapper[5025]: I1007 08:18:08.742349 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:08 crc kubenswrapper[5025]: I1007 08:18:08.742359 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:08Z","lastTransitionTime":"2025-10-07T08:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:08 crc kubenswrapper[5025]: I1007 08:18:08.844800 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:08 crc kubenswrapper[5025]: I1007 08:18:08.844890 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:08 crc kubenswrapper[5025]: I1007 08:18:08.844909 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:08 crc kubenswrapper[5025]: I1007 08:18:08.844936 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:08 crc kubenswrapper[5025]: I1007 08:18:08.844986 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:08Z","lastTransitionTime":"2025-10-07T08:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:18:08 crc kubenswrapper[5025]: I1007 08:18:08.913746 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4ls7" Oct 07 08:18:08 crc kubenswrapper[5025]: E1007 08:18:08.913891 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f4ls7" podUID="93fdeab4-b5d2-42d8-97ca-d5d61032e19f" Oct 07 08:18:08 crc kubenswrapper[5025]: I1007 08:18:08.948404 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:08 crc kubenswrapper[5025]: I1007 08:18:08.948464 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:08 crc kubenswrapper[5025]: I1007 08:18:08.948478 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:08 crc kubenswrapper[5025]: I1007 08:18:08.948497 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:08 crc kubenswrapper[5025]: I1007 08:18:08.948507 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:08Z","lastTransitionTime":"2025-10-07T08:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:09 crc kubenswrapper[5025]: I1007 08:18:09.051589 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:09 crc kubenswrapper[5025]: I1007 08:18:09.051636 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:09 crc kubenswrapper[5025]: I1007 08:18:09.051649 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:09 crc kubenswrapper[5025]: I1007 08:18:09.051668 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:09 crc kubenswrapper[5025]: I1007 08:18:09.051681 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:09Z","lastTransitionTime":"2025-10-07T08:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:09 crc kubenswrapper[5025]: I1007 08:18:09.153608 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:09 crc kubenswrapper[5025]: I1007 08:18:09.153661 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:09 crc kubenswrapper[5025]: I1007 08:18:09.153678 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:09 crc kubenswrapper[5025]: I1007 08:18:09.153712 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:09 crc kubenswrapper[5025]: I1007 08:18:09.153730 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:09Z","lastTransitionTime":"2025-10-07T08:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:09 crc kubenswrapper[5025]: I1007 08:18:09.256488 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:09 crc kubenswrapper[5025]: I1007 08:18:09.256567 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:09 crc kubenswrapper[5025]: I1007 08:18:09.256584 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:09 crc kubenswrapper[5025]: I1007 08:18:09.256609 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:09 crc kubenswrapper[5025]: I1007 08:18:09.256627 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:09Z","lastTransitionTime":"2025-10-07T08:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:09 crc kubenswrapper[5025]: I1007 08:18:09.359438 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:09 crc kubenswrapper[5025]: I1007 08:18:09.359501 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:09 crc kubenswrapper[5025]: I1007 08:18:09.359518 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:09 crc kubenswrapper[5025]: I1007 08:18:09.359573 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:09 crc kubenswrapper[5025]: I1007 08:18:09.359590 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:09Z","lastTransitionTime":"2025-10-07T08:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:09 crc kubenswrapper[5025]: I1007 08:18:09.462017 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:09 crc kubenswrapper[5025]: I1007 08:18:09.462103 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:09 crc kubenswrapper[5025]: I1007 08:18:09.462124 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:09 crc kubenswrapper[5025]: I1007 08:18:09.462146 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:09 crc kubenswrapper[5025]: I1007 08:18:09.462159 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:09Z","lastTransitionTime":"2025-10-07T08:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:09 crc kubenswrapper[5025]: I1007 08:18:09.565777 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:09 crc kubenswrapper[5025]: I1007 08:18:09.565860 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:09 crc kubenswrapper[5025]: I1007 08:18:09.565889 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:09 crc kubenswrapper[5025]: I1007 08:18:09.565923 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:09 crc kubenswrapper[5025]: I1007 08:18:09.565941 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:09Z","lastTransitionTime":"2025-10-07T08:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:09 crc kubenswrapper[5025]: I1007 08:18:09.669453 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:09 crc kubenswrapper[5025]: I1007 08:18:09.669511 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:09 crc kubenswrapper[5025]: I1007 08:18:09.669529 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:09 crc kubenswrapper[5025]: I1007 08:18:09.669584 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:09 crc kubenswrapper[5025]: I1007 08:18:09.669602 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:09Z","lastTransitionTime":"2025-10-07T08:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:09 crc kubenswrapper[5025]: I1007 08:18:09.772309 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:09 crc kubenswrapper[5025]: I1007 08:18:09.772364 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:09 crc kubenswrapper[5025]: I1007 08:18:09.772380 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:09 crc kubenswrapper[5025]: I1007 08:18:09.772461 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:09 crc kubenswrapper[5025]: I1007 08:18:09.772485 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:09Z","lastTransitionTime":"2025-10-07T08:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:09 crc kubenswrapper[5025]: I1007 08:18:09.876047 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:09 crc kubenswrapper[5025]: I1007 08:18:09.876100 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:09 crc kubenswrapper[5025]: I1007 08:18:09.876112 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:09 crc kubenswrapper[5025]: I1007 08:18:09.876130 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:09 crc kubenswrapper[5025]: I1007 08:18:09.876146 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:09Z","lastTransitionTime":"2025-10-07T08:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:18:09 crc kubenswrapper[5025]: I1007 08:18:09.913937 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 08:18:09 crc kubenswrapper[5025]: I1007 08:18:09.914041 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:18:09 crc kubenswrapper[5025]: E1007 08:18:09.914084 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 08:18:09 crc kubenswrapper[5025]: E1007 08:18:09.914174 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 08:18:09 crc kubenswrapper[5025]: I1007 08:18:09.914215 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:18:09 crc kubenswrapper[5025]: E1007 08:18:09.914454 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 08:18:09 crc kubenswrapper[5025]: I1007 08:18:09.977896 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:09 crc kubenswrapper[5025]: I1007 08:18:09.977937 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:09 crc kubenswrapper[5025]: I1007 08:18:09.977949 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:09 crc kubenswrapper[5025]: I1007 08:18:09.977968 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:09 crc kubenswrapper[5025]: I1007 08:18:09.977981 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:09Z","lastTransitionTime":"2025-10-07T08:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:10 crc kubenswrapper[5025]: I1007 08:18:10.079973 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:10 crc kubenswrapper[5025]: I1007 08:18:10.080032 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:10 crc kubenswrapper[5025]: I1007 08:18:10.080050 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:10 crc kubenswrapper[5025]: I1007 08:18:10.080073 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:10 crc kubenswrapper[5025]: I1007 08:18:10.080090 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:10Z","lastTransitionTime":"2025-10-07T08:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:10 crc kubenswrapper[5025]: I1007 08:18:10.182057 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:10 crc kubenswrapper[5025]: I1007 08:18:10.182091 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:10 crc kubenswrapper[5025]: I1007 08:18:10.182103 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:10 crc kubenswrapper[5025]: I1007 08:18:10.182121 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:10 crc kubenswrapper[5025]: I1007 08:18:10.182132 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:10Z","lastTransitionTime":"2025-10-07T08:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:10 crc kubenswrapper[5025]: I1007 08:18:10.285076 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:10 crc kubenswrapper[5025]: I1007 08:18:10.285137 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:10 crc kubenswrapper[5025]: I1007 08:18:10.285150 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:10 crc kubenswrapper[5025]: I1007 08:18:10.285167 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:10 crc kubenswrapper[5025]: I1007 08:18:10.285179 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:10Z","lastTransitionTime":"2025-10-07T08:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:10 crc kubenswrapper[5025]: I1007 08:18:10.388152 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:10 crc kubenswrapper[5025]: I1007 08:18:10.388212 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:10 crc kubenswrapper[5025]: I1007 08:18:10.388237 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:10 crc kubenswrapper[5025]: I1007 08:18:10.388261 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:10 crc kubenswrapper[5025]: I1007 08:18:10.388279 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:10Z","lastTransitionTime":"2025-10-07T08:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:10 crc kubenswrapper[5025]: I1007 08:18:10.490861 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:10 crc kubenswrapper[5025]: I1007 08:18:10.490932 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:10 crc kubenswrapper[5025]: I1007 08:18:10.490944 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:10 crc kubenswrapper[5025]: I1007 08:18:10.490977 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:10 crc kubenswrapper[5025]: I1007 08:18:10.490990 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:10Z","lastTransitionTime":"2025-10-07T08:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:10 crc kubenswrapper[5025]: I1007 08:18:10.593383 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:10 crc kubenswrapper[5025]: I1007 08:18:10.593432 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:10 crc kubenswrapper[5025]: I1007 08:18:10.593442 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:10 crc kubenswrapper[5025]: I1007 08:18:10.593456 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:10 crc kubenswrapper[5025]: I1007 08:18:10.593469 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:10Z","lastTransitionTime":"2025-10-07T08:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:10 crc kubenswrapper[5025]: I1007 08:18:10.696663 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:10 crc kubenswrapper[5025]: I1007 08:18:10.696760 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:10 crc kubenswrapper[5025]: I1007 08:18:10.696827 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:10 crc kubenswrapper[5025]: I1007 08:18:10.696862 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:10 crc kubenswrapper[5025]: I1007 08:18:10.696885 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:10Z","lastTransitionTime":"2025-10-07T08:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:10 crc kubenswrapper[5025]: I1007 08:18:10.800665 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:10 crc kubenswrapper[5025]: I1007 08:18:10.800733 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:10 crc kubenswrapper[5025]: I1007 08:18:10.800750 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:10 crc kubenswrapper[5025]: I1007 08:18:10.800776 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:10 crc kubenswrapper[5025]: I1007 08:18:10.800793 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:10Z","lastTransitionTime":"2025-10-07T08:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:10 crc kubenswrapper[5025]: I1007 08:18:10.903521 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:10 crc kubenswrapper[5025]: I1007 08:18:10.903581 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:10 crc kubenswrapper[5025]: I1007 08:18:10.903593 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:10 crc kubenswrapper[5025]: I1007 08:18:10.903608 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:10 crc kubenswrapper[5025]: I1007 08:18:10.903620 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:10Z","lastTransitionTime":"2025-10-07T08:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:18:10 crc kubenswrapper[5025]: I1007 08:18:10.914144 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4ls7" Oct 07 08:18:10 crc kubenswrapper[5025]: E1007 08:18:10.914309 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f4ls7" podUID="93fdeab4-b5d2-42d8-97ca-d5d61032e19f" Oct 07 08:18:11 crc kubenswrapper[5025]: I1007 08:18:11.005831 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:11 crc kubenswrapper[5025]: I1007 08:18:11.005878 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:11 crc kubenswrapper[5025]: I1007 08:18:11.005888 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:11 crc kubenswrapper[5025]: I1007 08:18:11.005906 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:11 crc kubenswrapper[5025]: I1007 08:18:11.005917 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:11Z","lastTransitionTime":"2025-10-07T08:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:11 crc kubenswrapper[5025]: I1007 08:18:11.108340 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:11 crc kubenswrapper[5025]: I1007 08:18:11.108384 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:11 crc kubenswrapper[5025]: I1007 08:18:11.108406 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:11 crc kubenswrapper[5025]: I1007 08:18:11.108420 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:11 crc kubenswrapper[5025]: I1007 08:18:11.108429 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:11Z","lastTransitionTime":"2025-10-07T08:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:11 crc kubenswrapper[5025]: I1007 08:18:11.211407 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:11 crc kubenswrapper[5025]: I1007 08:18:11.211488 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:11 crc kubenswrapper[5025]: I1007 08:18:11.211530 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:11 crc kubenswrapper[5025]: I1007 08:18:11.211603 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:11 crc kubenswrapper[5025]: I1007 08:18:11.211626 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:11Z","lastTransitionTime":"2025-10-07T08:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:11 crc kubenswrapper[5025]: I1007 08:18:11.314988 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:11 crc kubenswrapper[5025]: I1007 08:18:11.315029 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:11 crc kubenswrapper[5025]: I1007 08:18:11.315039 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:11 crc kubenswrapper[5025]: I1007 08:18:11.315054 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:11 crc kubenswrapper[5025]: I1007 08:18:11.315064 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:11Z","lastTransitionTime":"2025-10-07T08:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:11 crc kubenswrapper[5025]: I1007 08:18:11.417623 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:11 crc kubenswrapper[5025]: I1007 08:18:11.417710 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:11 crc kubenswrapper[5025]: I1007 08:18:11.417724 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:11 crc kubenswrapper[5025]: I1007 08:18:11.417741 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:11 crc kubenswrapper[5025]: I1007 08:18:11.417753 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:11Z","lastTransitionTime":"2025-10-07T08:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:11 crc kubenswrapper[5025]: I1007 08:18:11.520657 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:11 crc kubenswrapper[5025]: I1007 08:18:11.520699 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:11 crc kubenswrapper[5025]: I1007 08:18:11.520707 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:11 crc kubenswrapper[5025]: I1007 08:18:11.520721 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:11 crc kubenswrapper[5025]: I1007 08:18:11.520730 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:11Z","lastTransitionTime":"2025-10-07T08:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:11 crc kubenswrapper[5025]: I1007 08:18:11.623564 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:11 crc kubenswrapper[5025]: I1007 08:18:11.623613 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:11 crc kubenswrapper[5025]: I1007 08:18:11.623624 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:11 crc kubenswrapper[5025]: I1007 08:18:11.623641 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:11 crc kubenswrapper[5025]: I1007 08:18:11.623654 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:11Z","lastTransitionTime":"2025-10-07T08:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:11 crc kubenswrapper[5025]: I1007 08:18:11.726720 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:11 crc kubenswrapper[5025]: I1007 08:18:11.726762 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:11 crc kubenswrapper[5025]: I1007 08:18:11.726776 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:11 crc kubenswrapper[5025]: I1007 08:18:11.726793 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:11 crc kubenswrapper[5025]: I1007 08:18:11.726805 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:11Z","lastTransitionTime":"2025-10-07T08:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:11 crc kubenswrapper[5025]: I1007 08:18:11.828962 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:11 crc kubenswrapper[5025]: I1007 08:18:11.829000 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:11 crc kubenswrapper[5025]: I1007 08:18:11.829011 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:11 crc kubenswrapper[5025]: I1007 08:18:11.829027 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:11 crc kubenswrapper[5025]: I1007 08:18:11.829038 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:11Z","lastTransitionTime":"2025-10-07T08:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:18:11 crc kubenswrapper[5025]: I1007 08:18:11.914599 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:18:11 crc kubenswrapper[5025]: I1007 08:18:11.914670 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 08:18:11 crc kubenswrapper[5025]: I1007 08:18:11.914725 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:18:11 crc kubenswrapper[5025]: E1007 08:18:11.914820 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 08:18:11 crc kubenswrapper[5025]: E1007 08:18:11.914938 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 08:18:11 crc kubenswrapper[5025]: E1007 08:18:11.915169 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 08:18:11 crc kubenswrapper[5025]: I1007 08:18:11.931588 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:11 crc kubenswrapper[5025]: I1007 08:18:11.931665 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:11 crc kubenswrapper[5025]: I1007 08:18:11.931677 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:11 crc kubenswrapper[5025]: I1007 08:18:11.931699 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:11 crc kubenswrapper[5025]: I1007 08:18:11.931710 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:11Z","lastTransitionTime":"2025-10-07T08:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:12 crc kubenswrapper[5025]: I1007 08:18:12.035111 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:12 crc kubenswrapper[5025]: I1007 08:18:12.035458 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:12 crc kubenswrapper[5025]: I1007 08:18:12.035521 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:12 crc kubenswrapper[5025]: I1007 08:18:12.035589 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:12 crc kubenswrapper[5025]: I1007 08:18:12.035607 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:12Z","lastTransitionTime":"2025-10-07T08:18:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:12 crc kubenswrapper[5025]: I1007 08:18:12.139167 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:12 crc kubenswrapper[5025]: I1007 08:18:12.139241 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:12 crc kubenswrapper[5025]: I1007 08:18:12.139267 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:12 crc kubenswrapper[5025]: I1007 08:18:12.139302 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:12 crc kubenswrapper[5025]: I1007 08:18:12.139329 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:12Z","lastTransitionTime":"2025-10-07T08:18:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:12 crc kubenswrapper[5025]: I1007 08:18:12.241610 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:12 crc kubenswrapper[5025]: I1007 08:18:12.241681 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:12 crc kubenswrapper[5025]: I1007 08:18:12.241704 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:12 crc kubenswrapper[5025]: I1007 08:18:12.241732 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:12 crc kubenswrapper[5025]: I1007 08:18:12.241754 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:12Z","lastTransitionTime":"2025-10-07T08:18:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 08:18:12 crc kubenswrapper[5025]: I1007 08:18:12.254374 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 08:18:12 crc kubenswrapper[5025]: I1007 08:18:12.254462 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 08:18:12 crc kubenswrapper[5025]: I1007 08:18:12.254481 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 08:18:12 crc kubenswrapper[5025]: I1007 08:18:12.254504 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 08:18:12 crc kubenswrapper[5025]: I1007 08:18:12.254522 5025 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T08:18:12Z","lastTransitionTime":"2025-10-07T08:18:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 08:18:12 crc kubenswrapper[5025]: I1007 08:18:12.315755 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-zs645"] Oct 07 08:18:12 crc kubenswrapper[5025]: I1007 08:18:12.316136 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zs645" Oct 07 08:18:12 crc kubenswrapper[5025]: I1007 08:18:12.318439 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 07 08:18:12 crc kubenswrapper[5025]: I1007 08:18:12.320660 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 07 08:18:12 crc kubenswrapper[5025]: I1007 08:18:12.320695 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 07 08:18:12 crc kubenswrapper[5025]: I1007 08:18:12.321131 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 07 08:18:12 crc kubenswrapper[5025]: I1007 08:18:12.362342 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=78.362313434 podStartE2EDuration="1m18.362313434s" podCreationTimestamp="2025-10-07 08:16:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:18:12.340677227 +0000 UTC m=+99.149991421" watchObservedRunningTime="2025-10-07 08:18:12.362313434 +0000 UTC m=+99.171627588" Oct 07 08:18:12 crc kubenswrapper[5025]: I1007 08:18:12.381795 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-l2k8t" podStartSLOduration=78.3817691 podStartE2EDuration="1m18.3817691s" podCreationTimestamp="2025-10-07 08:16:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:18:12.381618746 +0000 UTC m=+99.190932930" watchObservedRunningTime="2025-10-07 08:18:12.3817691 +0000 UTC 
m=+99.191083254" Oct 07 08:18:12 crc kubenswrapper[5025]: I1007 08:18:12.406113 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-hc88w" podStartSLOduration=78.406088463 podStartE2EDuration="1m18.406088463s" podCreationTimestamp="2025-10-07 08:16:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:18:12.405478494 +0000 UTC m=+99.214792658" watchObservedRunningTime="2025-10-07 08:18:12.406088463 +0000 UTC m=+99.215402617" Oct 07 08:18:12 crc kubenswrapper[5025]: I1007 08:18:12.408040 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b84ad8fb-cdc3-4b51-9782-fe0e03b2173d-service-ca\") pod \"cluster-version-operator-5c965bbfc6-zs645\" (UID: \"b84ad8fb-cdc3-4b51-9782-fe0e03b2173d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zs645" Oct 07 08:18:12 crc kubenswrapper[5025]: I1007 08:18:12.408090 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b84ad8fb-cdc3-4b51-9782-fe0e03b2173d-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-zs645\" (UID: \"b84ad8fb-cdc3-4b51-9782-fe0e03b2173d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zs645" Oct 07 08:18:12 crc kubenswrapper[5025]: I1007 08:18:12.408112 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b84ad8fb-cdc3-4b51-9782-fe0e03b2173d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-zs645\" (UID: \"b84ad8fb-cdc3-4b51-9782-fe0e03b2173d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zs645" Oct 07 08:18:12 crc kubenswrapper[5025]: I1007 08:18:12.408152 
5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b84ad8fb-cdc3-4b51-9782-fe0e03b2173d-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-zs645\" (UID: \"b84ad8fb-cdc3-4b51-9782-fe0e03b2173d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zs645" Oct 07 08:18:12 crc kubenswrapper[5025]: I1007 08:18:12.408180 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b84ad8fb-cdc3-4b51-9782-fe0e03b2173d-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-zs645\" (UID: \"b84ad8fb-cdc3-4b51-9782-fe0e03b2173d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zs645" Oct 07 08:18:12 crc kubenswrapper[5025]: I1007 08:18:12.417221 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podStartSLOduration=78.417200341 podStartE2EDuration="1m18.417200341s" podCreationTimestamp="2025-10-07 08:16:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:18:12.416987514 +0000 UTC m=+99.226301678" watchObservedRunningTime="2025-10-07 08:18:12.417200341 +0000 UTC m=+99.226514505" Oct 07 08:18:12 crc kubenswrapper[5025]: I1007 08:18:12.427749 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-jf557" podStartSLOduration=78.427716939 podStartE2EDuration="1m18.427716939s" podCreationTimestamp="2025-10-07 08:16:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:18:12.426905113 +0000 UTC m=+99.236219277" watchObservedRunningTime="2025-10-07 08:18:12.427716939 +0000 UTC m=+99.237031093" 
Oct 07 08:18:12 crc kubenswrapper[5025]: I1007 08:18:12.468735 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w8khj" podStartSLOduration=78.468714219 podStartE2EDuration="1m18.468714219s" podCreationTimestamp="2025-10-07 08:16:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:18:12.440763569 +0000 UTC m=+99.250077713" watchObservedRunningTime="2025-10-07 08:18:12.468714219 +0000 UTC m=+99.278028363" Oct 07 08:18:12 crc kubenswrapper[5025]: I1007 08:18:12.483919 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=51.483903148 podStartE2EDuration="51.483903148s" podCreationTimestamp="2025-10-07 08:17:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:18:12.482886085 +0000 UTC m=+99.292200229" watchObservedRunningTime="2025-10-07 08:18:12.483903148 +0000 UTC m=+99.293217292" Oct 07 08:18:12 crc kubenswrapper[5025]: I1007 08:18:12.484022 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=15.484016881 podStartE2EDuration="15.484016881s" podCreationTimestamp="2025-10-07 08:17:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:18:12.46906596 +0000 UTC m=+99.278380124" watchObservedRunningTime="2025-10-07 08:18:12.484016881 +0000 UTC m=+99.293331025" Oct 07 08:18:12 crc kubenswrapper[5025]: I1007 08:18:12.508949 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b84ad8fb-cdc3-4b51-9782-fe0e03b2173d-service-ca\") pod 
\"cluster-version-operator-5c965bbfc6-zs645\" (UID: \"b84ad8fb-cdc3-4b51-9782-fe0e03b2173d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zs645" Oct 07 08:18:12 crc kubenswrapper[5025]: I1007 08:18:12.509010 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b84ad8fb-cdc3-4b51-9782-fe0e03b2173d-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-zs645\" (UID: \"b84ad8fb-cdc3-4b51-9782-fe0e03b2173d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zs645" Oct 07 08:18:12 crc kubenswrapper[5025]: I1007 08:18:12.509035 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b84ad8fb-cdc3-4b51-9782-fe0e03b2173d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-zs645\" (UID: \"b84ad8fb-cdc3-4b51-9782-fe0e03b2173d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zs645" Oct 07 08:18:12 crc kubenswrapper[5025]: I1007 08:18:12.509080 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b84ad8fb-cdc3-4b51-9782-fe0e03b2173d-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-zs645\" (UID: \"b84ad8fb-cdc3-4b51-9782-fe0e03b2173d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zs645" Oct 07 08:18:12 crc kubenswrapper[5025]: I1007 08:18:12.509108 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b84ad8fb-cdc3-4b51-9782-fe0e03b2173d-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-zs645\" (UID: \"b84ad8fb-cdc3-4b51-9782-fe0e03b2173d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zs645" Oct 07 08:18:12 crc kubenswrapper[5025]: I1007 08:18:12.509160 5025 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b84ad8fb-cdc3-4b51-9782-fe0e03b2173d-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-zs645\" (UID: \"b84ad8fb-cdc3-4b51-9782-fe0e03b2173d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zs645" Oct 07 08:18:12 crc kubenswrapper[5025]: I1007 08:18:12.509185 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b84ad8fb-cdc3-4b51-9782-fe0e03b2173d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-zs645\" (UID: \"b84ad8fb-cdc3-4b51-9782-fe0e03b2173d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zs645" Oct 07 08:18:12 crc kubenswrapper[5025]: I1007 08:18:12.510014 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b84ad8fb-cdc3-4b51-9782-fe0e03b2173d-service-ca\") pod \"cluster-version-operator-5c965bbfc6-zs645\" (UID: \"b84ad8fb-cdc3-4b51-9782-fe0e03b2173d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zs645" Oct 07 08:18:12 crc kubenswrapper[5025]: I1007 08:18:12.516239 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b84ad8fb-cdc3-4b51-9782-fe0e03b2173d-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-zs645\" (UID: \"b84ad8fb-cdc3-4b51-9782-fe0e03b2173d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zs645" Oct 07 08:18:12 crc kubenswrapper[5025]: I1007 08:18:12.524240 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b84ad8fb-cdc3-4b51-9782-fe0e03b2173d-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-zs645\" (UID: \"b84ad8fb-cdc3-4b51-9782-fe0e03b2173d\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zs645" Oct 07 08:18:12 crc kubenswrapper[5025]: I1007 08:18:12.524346 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-xmhw6" podStartSLOduration=78.524331238 podStartE2EDuration="1m18.524331238s" podCreationTimestamp="2025-10-07 08:16:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:18:12.523812122 +0000 UTC m=+99.333126266" watchObservedRunningTime="2025-10-07 08:18:12.524331238 +0000 UTC m=+99.333645382" Oct 07 08:18:12 crc kubenswrapper[5025]: I1007 08:18:12.540706 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=73.540685905 podStartE2EDuration="1m13.540685905s" podCreationTimestamp="2025-10-07 08:16:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:18:12.539980372 +0000 UTC m=+99.349294516" watchObservedRunningTime="2025-10-07 08:18:12.540685905 +0000 UTC m=+99.350000049" Oct 07 08:18:12 crc kubenswrapper[5025]: I1007 08:18:12.595981 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=32.595961554 podStartE2EDuration="32.595961554s" podCreationTimestamp="2025-10-07 08:17:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:18:12.583858395 +0000 UTC m=+99.393172539" watchObservedRunningTime="2025-10-07 08:18:12.595961554 +0000 UTC m=+99.405275698" Oct 07 08:18:12 crc kubenswrapper[5025]: I1007 08:18:12.631295 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zs645" Oct 07 08:18:12 crc kubenswrapper[5025]: I1007 08:18:12.913720 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4ls7" Oct 07 08:18:12 crc kubenswrapper[5025]: E1007 08:18:12.914321 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f4ls7" podUID="93fdeab4-b5d2-42d8-97ca-d5d61032e19f" Oct 07 08:18:13 crc kubenswrapper[5025]: I1007 08:18:13.014385 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93fdeab4-b5d2-42d8-97ca-d5d61032e19f-metrics-certs\") pod \"network-metrics-daemon-f4ls7\" (UID: \"93fdeab4-b5d2-42d8-97ca-d5d61032e19f\") " pod="openshift-multus/network-metrics-daemon-f4ls7" Oct 07 08:18:13 crc kubenswrapper[5025]: E1007 08:18:13.014571 5025 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 08:18:13 crc kubenswrapper[5025]: E1007 08:18:13.014659 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93fdeab4-b5d2-42d8-97ca-d5d61032e19f-metrics-certs podName:93fdeab4-b5d2-42d8-97ca-d5d61032e19f nodeName:}" failed. No retries permitted until 2025-10-07 08:19:17.014637389 +0000 UTC m=+163.823951603 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/93fdeab4-b5d2-42d8-97ca-d5d61032e19f-metrics-certs") pod "network-metrics-daemon-f4ls7" (UID: "93fdeab4-b5d2-42d8-97ca-d5d61032e19f") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 08:18:13 crc kubenswrapper[5025]: I1007 08:18:13.588117 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zs645" event={"ID":"b84ad8fb-cdc3-4b51-9782-fe0e03b2173d","Type":"ContainerStarted","Data":"4c31c22da00edb6b7835594ba2cc277db70d6f114554b5570d1bde6ebba953b8"} Oct 07 08:18:13 crc kubenswrapper[5025]: I1007 08:18:13.588161 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zs645" event={"ID":"b84ad8fb-cdc3-4b51-9782-fe0e03b2173d","Type":"ContainerStarted","Data":"8f7070752b3b8e798310fad4c564621eb681c94018574414103d46f33bfc5430"} Oct 07 08:18:13 crc kubenswrapper[5025]: I1007 08:18:13.601569 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zs645" podStartSLOduration=79.601519457 podStartE2EDuration="1m19.601519457s" podCreationTimestamp="2025-10-07 08:16:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:18:13.600973099 +0000 UTC m=+100.410287243" watchObservedRunningTime="2025-10-07 08:18:13.601519457 +0000 UTC m=+100.410833611" Oct 07 08:18:13 crc kubenswrapper[5025]: I1007 08:18:13.913976 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:18:13 crc kubenswrapper[5025]: I1007 08:18:13.913976 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 08:18:13 crc kubenswrapper[5025]: E1007 08:18:13.915173 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 08:18:13 crc kubenswrapper[5025]: I1007 08:18:13.915190 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:18:13 crc kubenswrapper[5025]: E1007 08:18:13.915273 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 08:18:13 crc kubenswrapper[5025]: E1007 08:18:13.915293 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 08:18:14 crc kubenswrapper[5025]: I1007 08:18:14.913786 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4ls7" Oct 07 08:18:14 crc kubenswrapper[5025]: E1007 08:18:14.914883 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f4ls7" podUID="93fdeab4-b5d2-42d8-97ca-d5d61032e19f" Oct 07 08:18:15 crc kubenswrapper[5025]: I1007 08:18:15.913596 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:18:15 crc kubenswrapper[5025]: E1007 08:18:15.913766 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 08:18:15 crc kubenswrapper[5025]: I1007 08:18:15.914077 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:18:15 crc kubenswrapper[5025]: E1007 08:18:15.914166 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 07 08:18:15 crc kubenswrapper[5025]: I1007 08:18:15.914382 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 08:18:15 crc kubenswrapper[5025]: E1007 08:18:15.914463 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 07 08:18:16 crc kubenswrapper[5025]: I1007 08:18:16.913996 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4ls7"
Oct 07 08:18:16 crc kubenswrapper[5025]: E1007 08:18:16.915101 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f4ls7" podUID="93fdeab4-b5d2-42d8-97ca-d5d61032e19f"
Oct 07 08:18:17 crc kubenswrapper[5025]: I1007 08:18:17.913610 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 08:18:17 crc kubenswrapper[5025]: I1007 08:18:17.913723 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 08:18:17 crc kubenswrapper[5025]: E1007 08:18:17.913785 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 07 08:18:17 crc kubenswrapper[5025]: I1007 08:18:17.913740 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 08:18:17 crc kubenswrapper[5025]: E1007 08:18:17.913903 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 07 08:18:17 crc kubenswrapper[5025]: E1007 08:18:17.914062 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 07 08:18:17 crc kubenswrapper[5025]: I1007 08:18:17.914718 5025 scope.go:117] "RemoveContainer" containerID="49dfbab16b681fd6273294b7ececcd31c7feb0d79b66129ceb05740c3a478abe"
Oct 07 08:18:17 crc kubenswrapper[5025]: E1007 08:18:17.914854 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-dwm22_openshift-ovn-kubernetes(8b6b9c75-ecfe-4815-b279-bb56f57a82a8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8"
Oct 07 08:18:18 crc kubenswrapper[5025]: I1007 08:18:18.913915 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4ls7"
Oct 07 08:18:18 crc kubenswrapper[5025]: E1007 08:18:18.914060 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f4ls7" podUID="93fdeab4-b5d2-42d8-97ca-d5d61032e19f"
Oct 07 08:18:19 crc kubenswrapper[5025]: I1007 08:18:19.914339 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 08:18:19 crc kubenswrapper[5025]: I1007 08:18:19.914391 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 08:18:19 crc kubenswrapper[5025]: I1007 08:18:19.914436 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 08:18:19 crc kubenswrapper[5025]: E1007 08:18:19.914529 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 07 08:18:19 crc kubenswrapper[5025]: E1007 08:18:19.914652 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 07 08:18:19 crc kubenswrapper[5025]: E1007 08:18:19.914738 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 07 08:18:20 crc kubenswrapper[5025]: I1007 08:18:20.914772 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4ls7"
Oct 07 08:18:20 crc kubenswrapper[5025]: E1007 08:18:20.914992 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f4ls7" podUID="93fdeab4-b5d2-42d8-97ca-d5d61032e19f"
Oct 07 08:18:21 crc kubenswrapper[5025]: I1007 08:18:21.914735 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 08:18:21 crc kubenswrapper[5025]: I1007 08:18:21.914853 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 08:18:21 crc kubenswrapper[5025]: E1007 08:18:21.914917 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 07 08:18:21 crc kubenswrapper[5025]: E1007 08:18:21.915059 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 07 08:18:21 crc kubenswrapper[5025]: I1007 08:18:21.915107 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 08:18:21 crc kubenswrapper[5025]: E1007 08:18:21.915265 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 07 08:18:22 crc kubenswrapper[5025]: I1007 08:18:22.914339 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4ls7"
Oct 07 08:18:22 crc kubenswrapper[5025]: E1007 08:18:22.914468 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f4ls7" podUID="93fdeab4-b5d2-42d8-97ca-d5d61032e19f"
Oct 07 08:18:23 crc kubenswrapper[5025]: I1007 08:18:23.914057 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 08:18:23 crc kubenswrapper[5025]: I1007 08:18:23.914115 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 08:18:23 crc kubenswrapper[5025]: I1007 08:18:23.914061 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 08:18:23 crc kubenswrapper[5025]: E1007 08:18:23.916593 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 07 08:18:23 crc kubenswrapper[5025]: E1007 08:18:23.916640 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 07 08:18:23 crc kubenswrapper[5025]: E1007 08:18:23.916715 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 07 08:18:24 crc kubenswrapper[5025]: I1007 08:18:24.913632 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4ls7"
Oct 07 08:18:24 crc kubenswrapper[5025]: E1007 08:18:24.913812 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f4ls7" podUID="93fdeab4-b5d2-42d8-97ca-d5d61032e19f"
Oct 07 08:18:25 crc kubenswrapper[5025]: I1007 08:18:25.914077 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 08:18:25 crc kubenswrapper[5025]: I1007 08:18:25.914161 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 08:18:25 crc kubenswrapper[5025]: E1007 08:18:25.914246 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 07 08:18:25 crc kubenswrapper[5025]: I1007 08:18:25.914350 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 08:18:25 crc kubenswrapper[5025]: E1007 08:18:25.914467 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 07 08:18:25 crc kubenswrapper[5025]: E1007 08:18:25.914600 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 07 08:18:26 crc kubenswrapper[5025]: I1007 08:18:26.914630 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4ls7"
Oct 07 08:18:26 crc kubenswrapper[5025]: E1007 08:18:26.914765 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f4ls7" podUID="93fdeab4-b5d2-42d8-97ca-d5d61032e19f"
Oct 07 08:18:27 crc kubenswrapper[5025]: I1007 08:18:27.913754 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 08:18:27 crc kubenswrapper[5025]: I1007 08:18:27.913838 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 08:18:27 crc kubenswrapper[5025]: E1007 08:18:27.913894 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 07 08:18:27 crc kubenswrapper[5025]: I1007 08:18:27.913841 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 08:18:27 crc kubenswrapper[5025]: E1007 08:18:27.914062 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 07 08:18:27 crc kubenswrapper[5025]: E1007 08:18:27.914159 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 07 08:18:28 crc kubenswrapper[5025]: I1007 08:18:28.642771 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xmhw6_34b07a69-1bbf-4019-b824-7b5be0f9404d/kube-multus/1.log"
Oct 07 08:18:28 crc kubenswrapper[5025]: I1007 08:18:28.643301 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xmhw6_34b07a69-1bbf-4019-b824-7b5be0f9404d/kube-multus/0.log"
Oct 07 08:18:28 crc kubenswrapper[5025]: I1007 08:18:28.643371 5025 generic.go:334] "Generic (PLEG): container finished" podID="34b07a69-1bbf-4019-b824-7b5be0f9404d" containerID="79252af7f0e70293c6c57e8731796bdf2108b3074f9cc3a5627c41eb246032c6" exitCode=1
Oct 07 08:18:28 crc kubenswrapper[5025]: I1007 08:18:28.643420 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xmhw6" event={"ID":"34b07a69-1bbf-4019-b824-7b5be0f9404d","Type":"ContainerDied","Data":"79252af7f0e70293c6c57e8731796bdf2108b3074f9cc3a5627c41eb246032c6"}
Oct 07 08:18:28 crc kubenswrapper[5025]: I1007 08:18:28.643474 5025 scope.go:117] "RemoveContainer" containerID="aee164f46abc9772ed12e1ba5e768ee44e25cc7444a58000075819f45cb4ba11"
Oct 07 08:18:28 crc kubenswrapper[5025]: I1007 08:18:28.644088 5025 scope.go:117] "RemoveContainer" containerID="79252af7f0e70293c6c57e8731796bdf2108b3074f9cc3a5627c41eb246032c6"
Oct 07 08:18:28 crc kubenswrapper[5025]: E1007 08:18:28.644337 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-xmhw6_openshift-multus(34b07a69-1bbf-4019-b824-7b5be0f9404d)\"" pod="openshift-multus/multus-xmhw6" podUID="34b07a69-1bbf-4019-b824-7b5be0f9404d"
Oct 07 08:18:28 crc kubenswrapper[5025]: I1007 08:18:28.914249 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4ls7"
Oct 07 08:18:28 crc kubenswrapper[5025]: E1007 08:18:28.914401 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f4ls7" podUID="93fdeab4-b5d2-42d8-97ca-d5d61032e19f"
Oct 07 08:18:28 crc kubenswrapper[5025]: I1007 08:18:28.915613 5025 scope.go:117] "RemoveContainer" containerID="49dfbab16b681fd6273294b7ececcd31c7feb0d79b66129ceb05740c3a478abe"
Oct 07 08:18:28 crc kubenswrapper[5025]: E1007 08:18:28.915912 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-dwm22_openshift-ovn-kubernetes(8b6b9c75-ecfe-4815-b279-bb56f57a82a8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8"
Oct 07 08:18:29 crc kubenswrapper[5025]: I1007 08:18:29.649649 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xmhw6_34b07a69-1bbf-4019-b824-7b5be0f9404d/kube-multus/1.log"
Oct 07 08:18:29 crc kubenswrapper[5025]: I1007 08:18:29.914252 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 08:18:29 crc kubenswrapper[5025]: I1007 08:18:29.914275 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 08:18:29 crc kubenswrapper[5025]: E1007 08:18:29.914465 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 07 08:18:29 crc kubenswrapper[5025]: I1007 08:18:29.914480 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 08:18:29 crc kubenswrapper[5025]: E1007 08:18:29.914657 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 07 08:18:29 crc kubenswrapper[5025]: E1007 08:18:29.914778 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 07 08:18:30 crc kubenswrapper[5025]: I1007 08:18:30.913664 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4ls7"
Oct 07 08:18:30 crc kubenswrapper[5025]: E1007 08:18:30.913816 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f4ls7" podUID="93fdeab4-b5d2-42d8-97ca-d5d61032e19f"
Oct 07 08:18:31 crc kubenswrapper[5025]: I1007 08:18:31.914492 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 08:18:31 crc kubenswrapper[5025]: I1007 08:18:31.914572 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 08:18:31 crc kubenswrapper[5025]: I1007 08:18:31.914614 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 08:18:31 crc kubenswrapper[5025]: E1007 08:18:31.914630 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 07 08:18:31 crc kubenswrapper[5025]: E1007 08:18:31.914785 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 07 08:18:31 crc kubenswrapper[5025]: E1007 08:18:31.914988 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 07 08:18:32 crc kubenswrapper[5025]: I1007 08:18:32.914401 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4ls7"
Oct 07 08:18:32 crc kubenswrapper[5025]: E1007 08:18:32.914653 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f4ls7" podUID="93fdeab4-b5d2-42d8-97ca-d5d61032e19f"
Oct 07 08:18:33 crc kubenswrapper[5025]: E1007 08:18:33.862898 5025 kubelet_node_status.go:497] "Node not becoming ready in time after startup"
Oct 07 08:18:33 crc kubenswrapper[5025]: I1007 08:18:33.914480 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 08:18:33 crc kubenswrapper[5025]: I1007 08:18:33.914518 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 08:18:33 crc kubenswrapper[5025]: E1007 08:18:33.915565 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 07 08:18:33 crc kubenswrapper[5025]: I1007 08:18:33.915677 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 08:18:33 crc kubenswrapper[5025]: E1007 08:18:33.915794 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 07 08:18:33 crc kubenswrapper[5025]: E1007 08:18:33.915841 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 07 08:18:34 crc kubenswrapper[5025]: E1007 08:18:34.021276 5025 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Oct 07 08:18:34 crc kubenswrapper[5025]: I1007 08:18:34.914342 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4ls7"
Oct 07 08:18:34 crc kubenswrapper[5025]: E1007 08:18:34.914605 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f4ls7" podUID="93fdeab4-b5d2-42d8-97ca-d5d61032e19f"
Oct 07 08:18:35 crc kubenswrapper[5025]: I1007 08:18:35.914154 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 08:18:35 crc kubenswrapper[5025]: I1007 08:18:35.914236 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 08:18:35 crc kubenswrapper[5025]: I1007 08:18:35.914196 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 08:18:35 crc kubenswrapper[5025]: E1007 08:18:35.914368 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 07 08:18:35 crc kubenswrapper[5025]: E1007 08:18:35.914533 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 07 08:18:35 crc kubenswrapper[5025]: E1007 08:18:35.914660 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 07 08:18:36 crc kubenswrapper[5025]: I1007 08:18:36.914409 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4ls7"
Oct 07 08:18:36 crc kubenswrapper[5025]: E1007 08:18:36.915521 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f4ls7" podUID="93fdeab4-b5d2-42d8-97ca-d5d61032e19f"
Oct 07 08:18:37 crc kubenswrapper[5025]: I1007 08:18:37.917100 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 08:18:37 crc kubenswrapper[5025]: I1007 08:18:37.917138 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 08:18:37 crc kubenswrapper[5025]: E1007 08:18:37.917337 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 07 08:18:37 crc kubenswrapper[5025]: I1007 08:18:37.917101 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 08:18:37 crc kubenswrapper[5025]: E1007 08:18:37.917583 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 07 08:18:37 crc kubenswrapper[5025]: E1007 08:18:37.917720 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 07 08:18:38 crc kubenswrapper[5025]: I1007 08:18:38.914088 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4ls7"
Oct 07 08:18:38 crc kubenswrapper[5025]: E1007 08:18:38.914280 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f4ls7" podUID="93fdeab4-b5d2-42d8-97ca-d5d61032e19f"
Oct 07 08:18:39 crc kubenswrapper[5025]: E1007 08:18:39.023715 5025 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Oct 07 08:18:39 crc kubenswrapper[5025]: I1007 08:18:39.914187 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 08:18:39 crc kubenswrapper[5025]: E1007 08:18:39.914489 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 07 08:18:39 crc kubenswrapper[5025]: I1007 08:18:39.914722 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 08:18:39 crc kubenswrapper[5025]: I1007 08:18:39.914722 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 08:18:39 crc kubenswrapper[5025]: E1007 08:18:39.915339 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 07 08:18:39 crc kubenswrapper[5025]: E1007 08:18:39.915480 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 07 08:18:40 crc kubenswrapper[5025]: I1007 08:18:40.913680 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4ls7"
Oct 07 08:18:40 crc kubenswrapper[5025]: E1007 08:18:40.914157 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f4ls7" podUID="93fdeab4-b5d2-42d8-97ca-d5d61032e19f"
Oct 07 08:18:40 crc kubenswrapper[5025]: I1007 08:18:40.914426 5025 scope.go:117] "RemoveContainer" containerID="49dfbab16b681fd6273294b7ececcd31c7feb0d79b66129ceb05740c3a478abe"
Oct 07 08:18:41 crc kubenswrapper[5025]: I1007 08:18:41.693629 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dwm22_8b6b9c75-ecfe-4815-b279-bb56f57a82a8/ovnkube-controller/3.log"
Oct 07 08:18:41 crc kubenswrapper[5025]: I1007 08:18:41.696388 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" event={"ID":"8b6b9c75-ecfe-4815-b279-bb56f57a82a8","Type":"ContainerStarted","Data":"81125d0541fed3ecceacb54cd47d20ae6cf3ad43375daaf49fbf813859f4f2e9"}
Oct 07 08:18:41 crc kubenswrapper[5025]: I1007 08:18:41.697145 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22"
Oct 07 08:18:41 crc kubenswrapper[5025]: I1007 08:18:41.726592 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" podStartSLOduration=107.726564959 podStartE2EDuration="1m47.726564959s" podCreationTimestamp="2025-10-07 08:16:54 +0000 UTC"
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:18:41.726139115 +0000 UTC m=+128.535453329" watchObservedRunningTime="2025-10-07 08:18:41.726564959 +0000 UTC m=+128.535879103" Oct 07 08:18:41 crc kubenswrapper[5025]: I1007 08:18:41.811716 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-f4ls7"] Oct 07 08:18:41 crc kubenswrapper[5025]: I1007 08:18:41.811872 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4ls7" Oct 07 08:18:41 crc kubenswrapper[5025]: E1007 08:18:41.811993 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f4ls7" podUID="93fdeab4-b5d2-42d8-97ca-d5d61032e19f" Oct 07 08:18:41 crc kubenswrapper[5025]: I1007 08:18:41.914115 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:18:41 crc kubenswrapper[5025]: I1007 08:18:41.914258 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 08:18:41 crc kubenswrapper[5025]: E1007 08:18:41.914325 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 08:18:41 crc kubenswrapper[5025]: I1007 08:18:41.914374 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:18:41 crc kubenswrapper[5025]: E1007 08:18:41.914491 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 08:18:41 crc kubenswrapper[5025]: E1007 08:18:41.914670 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 08:18:42 crc kubenswrapper[5025]: I1007 08:18:42.913711 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4ls7" Oct 07 08:18:42 crc kubenswrapper[5025]: E1007 08:18:42.914103 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f4ls7" podUID="93fdeab4-b5d2-42d8-97ca-d5d61032e19f" Oct 07 08:18:43 crc kubenswrapper[5025]: I1007 08:18:43.914715 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:18:43 crc kubenswrapper[5025]: I1007 08:18:43.917273 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 08:18:43 crc kubenswrapper[5025]: I1007 08:18:43.917342 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:18:43 crc kubenswrapper[5025]: E1007 08:18:43.917499 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 08:18:43 crc kubenswrapper[5025]: E1007 08:18:43.917696 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 08:18:43 crc kubenswrapper[5025]: I1007 08:18:43.918035 5025 scope.go:117] "RemoveContainer" containerID="79252af7f0e70293c6c57e8731796bdf2108b3074f9cc3a5627c41eb246032c6" Oct 07 08:18:43 crc kubenswrapper[5025]: E1007 08:18:43.918206 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 08:18:44 crc kubenswrapper[5025]: E1007 08:18:44.024583 5025 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 07 08:18:44 crc kubenswrapper[5025]: I1007 08:18:44.709145 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xmhw6_34b07a69-1bbf-4019-b824-7b5be0f9404d/kube-multus/1.log" Oct 07 08:18:44 crc kubenswrapper[5025]: I1007 08:18:44.709323 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xmhw6" event={"ID":"34b07a69-1bbf-4019-b824-7b5be0f9404d","Type":"ContainerStarted","Data":"74509613ab9ea3b6e4feae1711cea3c8b8082ee266d8f3e819add82c25e9eec3"} Oct 07 08:18:44 crc kubenswrapper[5025]: I1007 08:18:44.914313 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4ls7" Oct 07 08:18:44 crc kubenswrapper[5025]: E1007 08:18:44.914587 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f4ls7" podUID="93fdeab4-b5d2-42d8-97ca-d5d61032e19f" Oct 07 08:18:45 crc kubenswrapper[5025]: I1007 08:18:45.914736 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:18:45 crc kubenswrapper[5025]: I1007 08:18:45.914807 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 08:18:45 crc kubenswrapper[5025]: I1007 08:18:45.914866 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:18:45 crc kubenswrapper[5025]: E1007 08:18:45.914974 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 08:18:45 crc kubenswrapper[5025]: E1007 08:18:45.915032 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 08:18:45 crc kubenswrapper[5025]: E1007 08:18:45.915183 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 08:18:46 crc kubenswrapper[5025]: I1007 08:18:46.914216 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4ls7" Oct 07 08:18:46 crc kubenswrapper[5025]: E1007 08:18:46.914357 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f4ls7" podUID="93fdeab4-b5d2-42d8-97ca-d5d61032e19f" Oct 07 08:18:47 crc kubenswrapper[5025]: I1007 08:18:47.914008 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 08:18:47 crc kubenswrapper[5025]: E1007 08:18:47.914130 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 08:18:47 crc kubenswrapper[5025]: I1007 08:18:47.914225 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:18:47 crc kubenswrapper[5025]: E1007 08:18:47.914388 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 08:18:47 crc kubenswrapper[5025]: I1007 08:18:47.914679 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:18:47 crc kubenswrapper[5025]: E1007 08:18:47.914749 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 08:18:48 crc kubenswrapper[5025]: I1007 08:18:48.914019 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4ls7" Oct 07 08:18:48 crc kubenswrapper[5025]: E1007 08:18:48.914701 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f4ls7" podUID="93fdeab4-b5d2-42d8-97ca-d5d61032e19f" Oct 07 08:18:49 crc kubenswrapper[5025]: I1007 08:18:49.914580 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 08:18:49 crc kubenswrapper[5025]: I1007 08:18:49.914612 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:18:49 crc kubenswrapper[5025]: I1007 08:18:49.915008 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:18:49 crc kubenswrapper[5025]: I1007 08:18:49.917101 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 07 08:18:49 crc kubenswrapper[5025]: I1007 08:18:49.918930 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 07 08:18:49 crc kubenswrapper[5025]: I1007 08:18:49.919045 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 07 08:18:49 crc kubenswrapper[5025]: I1007 08:18:49.919045 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 07 08:18:50 crc kubenswrapper[5025]: I1007 08:18:50.913840 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4ls7" Oct 07 08:18:50 crc kubenswrapper[5025]: I1007 08:18:50.916483 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 07 08:18:50 crc kubenswrapper[5025]: I1007 08:18:50.916678 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.605801 5025 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.661179 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-ck97s"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.661992 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zbpgk"] Oct 07 08:18:53 crc kubenswrapper[5025]: 
I1007 08:18:53.662213 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ck97s" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.663115 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-2z5w6"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.663282 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zbpgk" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.664032 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vk2xm"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.664231 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-2z5w6" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.664616 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vk2xm" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.665096 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-qpzs2"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.665956 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qpzs2" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.666011 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-cxbbt"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.666845 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-hpfzj"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.666915 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-cxbbt" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.667282 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-hpfzj" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.673922 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n49bl"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.674422 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n49bl" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.677067 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-hmvqm"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.677730 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-hmvqm" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.692471 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.692777 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.692861 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.693386 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.693451 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.693559 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.693692 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.693823 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.694155 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.694820 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 
08:18:53.695139 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.695324 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.697585 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.698064 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.698157 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.698782 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.698967 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.707105 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.708886 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.712768 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.713309 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 07 08:18:53 crc 
kubenswrapper[5025]: I1007 08:18:53.713429 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.713526 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.713603 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.713793 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.713856 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.713891 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.713970 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.714024 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.714038 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.714114 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.714206 5025 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.714308 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.714389 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.714421 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.714454 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.714480 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.714437 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.714613 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.714134 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.714659 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.714720 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-kpss8"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 
08:18:53.714737 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.714762 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.714861 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.715006 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.715279 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.715614 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.715650 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-kpss8" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.715701 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.714391 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.715835 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.715945 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.715948 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-lsrzs"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.716701 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lsrzs" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.717144 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-h4xpz"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.717811 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-h4xpz" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.718513 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.718648 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.718822 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.718981 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.719159 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.719281 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.719433 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.719610 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.719692 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.719795 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 07 08:18:53 crc 
kubenswrapper[5025]: I1007 08:18:53.720479 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.721017 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.721172 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.722633 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-hmlpq"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.723407 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-hmlpq" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.723651 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2r94k"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.724275 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.724623 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.724775 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q57jq"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.725172 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q57jq" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.730600 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.743309 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.745618 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.745854 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-2xfws"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.747138 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-hd4xd"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.748313 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.749020 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-2ttnw"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.749442 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-2xfws" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.751134 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-2ttnw" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.751153 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-hd4xd" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.756183 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lrfqf"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.757194 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vkbl6"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.757309 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lrfqf" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.757795 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vkbl6" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.775290 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/66353c4f-6d67-4155-8b97-5f27145eabdd-etcd-client\") pod \"apiserver-76f77b778f-2z5w6\" (UID: \"66353c4f-6d67-4155-8b97-5f27145eabdd\") " pod="openshift-apiserver/apiserver-76f77b778f-2z5w6" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.775340 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/66353c4f-6d67-4155-8b97-5f27145eabdd-image-import-ca\") pod \"apiserver-76f77b778f-2z5w6\" (UID: \"66353c4f-6d67-4155-8b97-5f27145eabdd\") " pod="openshift-apiserver/apiserver-76f77b778f-2z5w6" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.775368 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" 
(UniqueName: \"kubernetes.io/secret/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-hpfzj\" (UID: \"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpfzj" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.775392 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-hpfzj\" (UID: \"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpfzj" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.775419 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc6ec75d-41ec-4966-ab3c-03cf9f2497ce-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-n49bl\" (UID: \"cc6ec75d-41ec-4966-ab3c-03cf9f2497ce\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n49bl" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.775438 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc6ec75d-41ec-4966-ab3c-03cf9f2497ce-config\") pod \"openshift-apiserver-operator-796bbdcf4f-n49bl\" (UID: \"cc6ec75d-41ec-4966-ab3c-03cf9f2497ce\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n49bl" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.775458 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/52da690b-1ff6-4fb3-8c49-71a1fba78754-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-hmvqm\" (UID: \"52da690b-1ff6-4fb3-8c49-71a1fba78754\") 
" pod="openshift-authentication-operator/authentication-operator-69f744f599-hmvqm" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.775483 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1e23d4b6-3f5b-4288-8753-cff09258a821-images\") pod \"machine-api-operator-5694c8668f-cxbbt\" (UID: \"1e23d4b6-3f5b-4288-8753-cff09258a821\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cxbbt" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.775523 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ad008aba-a6da-410f-82b5-a18f8dd3d5c7-etcd-client\") pod \"apiserver-7bbb656c7d-qpzs2\" (UID: \"ad008aba-a6da-410f-82b5-a18f8dd3d5c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qpzs2" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.775557 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad008aba-a6da-410f-82b5-a18f8dd3d5c7-serving-cert\") pod \"apiserver-7bbb656c7d-qpzs2\" (UID: \"ad008aba-a6da-410f-82b5-a18f8dd3d5c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qpzs2" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.775583 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1e23d4b6-3f5b-4288-8753-cff09258a821-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-cxbbt\" (UID: \"1e23d4b6-3f5b-4288-8753-cff09258a821\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cxbbt" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.775606 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk2sl\" (UniqueName: 
\"kubernetes.io/projected/1e23d4b6-3f5b-4288-8753-cff09258a821-kube-api-access-gk2sl\") pod \"machine-api-operator-5694c8668f-cxbbt\" (UID: \"1e23d4b6-3f5b-4288-8753-cff09258a821\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cxbbt" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.775625 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-hpfzj\" (UID: \"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpfzj" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.775648 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/66353c4f-6d67-4155-8b97-5f27145eabdd-audit\") pod \"apiserver-76f77b778f-2z5w6\" (UID: \"66353c4f-6d67-4155-8b97-5f27145eabdd\") " pod="openshift-apiserver/apiserver-76f77b778f-2z5w6" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.775677 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e6aa9841-3d4c-4600-9763-d32c243fa4d8-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-zbpgk\" (UID: \"e6aa9841-3d4c-4600-9763-d32c243fa4d8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zbpgk" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.775697 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad008aba-a6da-410f-82b5-a18f8dd3d5c7-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-qpzs2\" (UID: \"ad008aba-a6da-410f-82b5-a18f8dd3d5c7\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qpzs2" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.775713 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ltz52"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.776496 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8lwpw"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.775730 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66353c4f-6d67-4155-8b97-5f27145eabdd-serving-cert\") pod \"apiserver-76f77b778f-2z5w6\" (UID: \"66353c4f-6d67-4155-8b97-5f27145eabdd\") " pod="openshift-apiserver/apiserver-76f77b778f-2z5w6" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.776733 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/66353c4f-6d67-4155-8b97-5f27145eabdd-encryption-config\") pod \"apiserver-76f77b778f-2z5w6\" (UID: \"66353c4f-6d67-4155-8b97-5f27145eabdd\") " pod="openshift-apiserver/apiserver-76f77b778f-2z5w6" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.776817 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-hpfzj\" (UID: \"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpfzj" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.776893 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28mmg\" (UniqueName: 
\"kubernetes.io/projected/cc6ec75d-41ec-4966-ab3c-03cf9f2497ce-kube-api-access-28mmg\") pod \"openshift-apiserver-operator-796bbdcf4f-n49bl\" (UID: \"cc6ec75d-41ec-4966-ab3c-03cf9f2497ce\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n49bl" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.776978 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8qn8\" (UniqueName: \"kubernetes.io/projected/e6aa9841-3d4c-4600-9763-d32c243fa4d8-kube-api-access-h8qn8\") pod \"cluster-samples-operator-665b6dd947-zbpgk\" (UID: \"e6aa9841-3d4c-4600-9763-d32c243fa4d8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zbpgk" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.777041 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8lwpw" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.777058 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e23d4b6-3f5b-4288-8753-cff09258a821-config\") pod \"machine-api-operator-5694c8668f-cxbbt\" (UID: \"1e23d4b6-3f5b-4288-8753-cff09258a821\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cxbbt" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.777178 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/66353c4f-6d67-4155-8b97-5f27145eabdd-audit-dir\") pod \"apiserver-76f77b778f-2z5w6\" (UID: \"66353c4f-6d67-4155-8b97-5f27145eabdd\") " pod="openshift-apiserver/apiserver-76f77b778f-2z5w6" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.777241 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" 
(UniqueName: \"kubernetes.io/secret/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-hpfzj\" (UID: \"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpfzj" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.777305 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtmbf\" (UniqueName: \"kubernetes.io/projected/52da690b-1ff6-4fb3-8c49-71a1fba78754-kube-api-access-vtmbf\") pod \"authentication-operator-69f744f599-hmvqm\" (UID: \"52da690b-1ff6-4fb3-8c49-71a1fba78754\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hmvqm" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.776926 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-z8xqx"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.777555 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ltz52" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.777416 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b0a663d-08c8-4198-a334-701d58beee58-config\") pod \"machine-approver-56656f9798-ck97s\" (UID: \"7b0a663d-08c8-4198-a334-701d58beee58\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ck97s" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.777742 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6dca6d86-e6aa-455e-88e8-f61e829b7efd-client-ca\") pod \"route-controller-manager-6576b87f9c-vk2xm\" (UID: \"6dca6d86-e6aa-455e-88e8-f61e829b7efd\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vk2xm" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.777790 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g74kq\" (UniqueName: \"kubernetes.io/projected/66353c4f-6d67-4155-8b97-5f27145eabdd-kube-api-access-g74kq\") pod \"apiserver-76f77b778f-2z5w6\" (UID: \"66353c4f-6d67-4155-8b97-5f27145eabdd\") " pod="openshift-apiserver/apiserver-76f77b778f-2z5w6" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.777814 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/151c8409-e9e6-4a48-8e3d-661c0498cd86-audit-dir\") pod \"oauth-openshift-558db77b4-hpfzj\" (UID: \"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpfzj" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.777837 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"audit-policies\" (UniqueName: \"kubernetes.io/configmap/151c8409-e9e6-4a48-8e3d-661c0498cd86-audit-policies\") pod \"oauth-openshift-558db77b4-hpfzj\" (UID: \"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpfzj" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.777856 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-hpfzj\" (UID: \"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpfzj" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.777878 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-hpfzj\" (UID: \"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpfzj" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.777901 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-hpfzj\" (UID: \"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpfzj" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.777922 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rsr4\" (UniqueName: \"kubernetes.io/projected/151c8409-e9e6-4a48-8e3d-661c0498cd86-kube-api-access-4rsr4\") pod \"oauth-openshift-558db77b4-hpfzj\" (UID: 
\"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpfzj" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.777946 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-hpfzj\" (UID: \"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpfzj" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.777967 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ad008aba-a6da-410f-82b5-a18f8dd3d5c7-audit-policies\") pod \"apiserver-7bbb656c7d-qpzs2\" (UID: \"ad008aba-a6da-410f-82b5-a18f8dd3d5c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qpzs2" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.777987 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ad008aba-a6da-410f-82b5-a18f8dd3d5c7-encryption-config\") pod \"apiserver-7bbb656c7d-qpzs2\" (UID: \"ad008aba-a6da-410f-82b5-a18f8dd3d5c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qpzs2" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.778008 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/66353c4f-6d67-4155-8b97-5f27145eabdd-node-pullsecrets\") pod \"apiserver-76f77b778f-2z5w6\" (UID: \"66353c4f-6d67-4155-8b97-5f27145eabdd\") " pod="openshift-apiserver/apiserver-76f77b778f-2z5w6" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.778038 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7b0a663d-08c8-4198-a334-701d58beee58-auth-proxy-config\") pod \"machine-approver-56656f9798-ck97s\" (UID: \"7b0a663d-08c8-4198-a334-701d58beee58\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ck97s" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.778095 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-hpfzj\" (UID: \"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpfzj" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.778137 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rz64\" (UniqueName: \"kubernetes.io/projected/7b0a663d-08c8-4198-a334-701d58beee58-kube-api-access-6rz64\") pod \"machine-approver-56656f9798-ck97s\" (UID: \"7b0a663d-08c8-4198-a334-701d58beee58\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ck97s" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.778158 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52z9c\" (UniqueName: \"kubernetes.io/projected/ad008aba-a6da-410f-82b5-a18f8dd3d5c7-kube-api-access-52z9c\") pod \"apiserver-7bbb656c7d-qpzs2\" (UID: \"ad008aba-a6da-410f-82b5-a18f8dd3d5c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qpzs2" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.778188 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ad008aba-a6da-410f-82b5-a18f8dd3d5c7-audit-dir\") pod \"apiserver-7bbb656c7d-qpzs2\" (UID: 
\"ad008aba-a6da-410f-82b5-a18f8dd3d5c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qpzs2" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.778224 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66353c4f-6d67-4155-8b97-5f27145eabdd-trusted-ca-bundle\") pod \"apiserver-76f77b778f-2z5w6\" (UID: \"66353c4f-6d67-4155-8b97-5f27145eabdd\") " pod="openshift-apiserver/apiserver-76f77b778f-2z5w6" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.778250 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/52da690b-1ff6-4fb3-8c49-71a1fba78754-service-ca-bundle\") pod \"authentication-operator-69f744f599-hmvqm\" (UID: \"52da690b-1ff6-4fb3-8c49-71a1fba78754\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hmvqm" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.778273 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52da690b-1ff6-4fb3-8c49-71a1fba78754-serving-cert\") pod \"authentication-operator-69f744f599-hmvqm\" (UID: \"52da690b-1ff6-4fb3-8c49-71a1fba78754\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hmvqm" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.778298 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/66353c4f-6d67-4155-8b97-5f27145eabdd-etcd-serving-ca\") pod \"apiserver-76f77b778f-2z5w6\" (UID: \"66353c4f-6d67-4155-8b97-5f27145eabdd\") " pod="openshift-apiserver/apiserver-76f77b778f-2z5w6" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.778318 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dca6d86-e6aa-455e-88e8-f61e829b7efd-config\") pod \"route-controller-manager-6576b87f9c-vk2xm\" (UID: \"6dca6d86-e6aa-455e-88e8-f61e829b7efd\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vk2xm" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.778341 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6dca6d86-e6aa-455e-88e8-f61e829b7efd-serving-cert\") pod \"route-controller-manager-6576b87f9c-vk2xm\" (UID: \"6dca6d86-e6aa-455e-88e8-f61e829b7efd\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vk2xm" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.778367 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xd7q\" (UniqueName: \"kubernetes.io/projected/6dca6d86-e6aa-455e-88e8-f61e829b7efd-kube-api-access-8xd7q\") pod \"route-controller-manager-6576b87f9c-vk2xm\" (UID: \"6dca6d86-e6aa-455e-88e8-f61e829b7efd\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vk2xm" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.778392 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66353c4f-6d67-4155-8b97-5f27145eabdd-config\") pod \"apiserver-76f77b778f-2z5w6\" (UID: \"66353c4f-6d67-4155-8b97-5f27145eabdd\") " pod="openshift-apiserver/apiserver-76f77b778f-2z5w6" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.778411 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52da690b-1ff6-4fb3-8c49-71a1fba78754-config\") pod \"authentication-operator-69f744f599-hmvqm\" (UID: \"52da690b-1ff6-4fb3-8c49-71a1fba78754\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-hmvqm" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.778435 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/7b0a663d-08c8-4198-a334-701d58beee58-machine-approver-tls\") pod \"machine-approver-56656f9798-ck97s\" (UID: \"7b0a663d-08c8-4198-a334-701d58beee58\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ck97s" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.778459 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ad008aba-a6da-410f-82b5-a18f8dd3d5c7-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-qpzs2\" (UID: \"ad008aba-a6da-410f-82b5-a18f8dd3d5c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qpzs2" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.778478 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-hpfzj\" (UID: \"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpfzj" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.779608 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-pc6rv"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.780038 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-pc6rv" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.780119 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z8xqx" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.783237 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.786596 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-5m9cc"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.787570 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5m9cc" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.787853 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sjgwg"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.788393 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sjgwg" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.788491 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wcnzt"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.805879 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fp4pg"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.806244 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qzlm7"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.806568 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-xwd6l"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.806693 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wcnzt" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.806955 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fp4pg" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.807704 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-qpzs2"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.807821 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qzlm7" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.808125 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xwd6l" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.813435 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-bfxvx"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.822446 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-29wzh"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.822912 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vk2xm"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.822940 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-h4xpz"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.822956 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-2z5w6"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.822970 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-b6pkw"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.823440 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-b6pkw" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.823764 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-bfxvx" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.823954 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-29wzh" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.835448 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.835496 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xf6r2"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.835738 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.836082 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-hpfzj"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.836176 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xf6r2" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.839371 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-x9t4l"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.840109 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-cxbbt"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.840335 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-x9t4l" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.840530 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zbpgk"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.843150 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-npvk6"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.843829 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rvr8c"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.844320 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330415-w9ctz"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.844478 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rvr8c" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.844395 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-npvk6" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.845105 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330415-w9ctz" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.845293 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-2xfws"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.854145 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-jb77m"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.855369 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-jb77m" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.855798 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vkbl6"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.856306 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.856726 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.857493 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q57jq"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.874524 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.877266 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.877299 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.877697 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.877797 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.877914 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.878096 5025 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.878174 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.879322 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.879402 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.879550 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.880204 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.880399 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.880782 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad008aba-a6da-410f-82b5-a18f8dd3d5c7-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-qpzs2\" (UID: \"ad008aba-a6da-410f-82b5-a18f8dd3d5c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qpzs2" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.880216 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad008aba-a6da-410f-82b5-a18f8dd3d5c7-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-qpzs2\" (UID: \"ad008aba-a6da-410f-82b5-a18f8dd3d5c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qpzs2" Oct 07 
08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.880981 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.881165 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.881421 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.880858 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28mmg\" (UniqueName: \"kubernetes.io/projected/cc6ec75d-41ec-4966-ab3c-03cf9f2497ce-kube-api-access-28mmg\") pod \"openshift-apiserver-operator-796bbdcf4f-n49bl\" (UID: \"cc6ec75d-41ec-4966-ab3c-03cf9f2497ce\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n49bl" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.881706 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66353c4f-6d67-4155-8b97-5f27145eabdd-serving-cert\") pod \"apiserver-76f77b778f-2z5w6\" (UID: \"66353c4f-6d67-4155-8b97-5f27145eabdd\") " pod="openshift-apiserver/apiserver-76f77b778f-2z5w6" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.881735 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/66353c4f-6d67-4155-8b97-5f27145eabdd-encryption-config\") pod \"apiserver-76f77b778f-2z5w6\" (UID: \"66353c4f-6d67-4155-8b97-5f27145eabdd\") " pod="openshift-apiserver/apiserver-76f77b778f-2z5w6" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.881762 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-hpfzj\" (UID: \"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpfzj" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.881791 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e23d4b6-3f5b-4288-8753-cff09258a821-config\") pod \"machine-api-operator-5694c8668f-cxbbt\" (UID: \"1e23d4b6-3f5b-4288-8753-cff09258a821\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cxbbt" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.881826 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8qn8\" (UniqueName: \"kubernetes.io/projected/e6aa9841-3d4c-4600-9763-d32c243fa4d8-kube-api-access-h8qn8\") pod \"cluster-samples-operator-665b6dd947-zbpgk\" (UID: \"e6aa9841-3d4c-4600-9763-d32c243fa4d8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zbpgk" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.881846 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/66353c4f-6d67-4155-8b97-5f27145eabdd-audit-dir\") pod \"apiserver-76f77b778f-2z5w6\" (UID: \"66353c4f-6d67-4155-8b97-5f27145eabdd\") " pod="openshift-apiserver/apiserver-76f77b778f-2z5w6" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.881894 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-hpfzj\" (UID: \"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpfzj" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.881922 
5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtmbf\" (UniqueName: \"kubernetes.io/projected/52da690b-1ff6-4fb3-8c49-71a1fba78754-kube-api-access-vtmbf\") pod \"authentication-operator-69f744f599-hmvqm\" (UID: \"52da690b-1ff6-4fb3-8c49-71a1fba78754\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hmvqm" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.881958 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49ac5730-6f35-4a6b-9c54-df81528bdb81-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-lrfqf\" (UID: \"49ac5730-6f35-4a6b-9c54-df81528bdb81\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lrfqf" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.881997 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b0a663d-08c8-4198-a334-701d58beee58-config\") pod \"machine-approver-56656f9798-ck97s\" (UID: \"7b0a663d-08c8-4198-a334-701d58beee58\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ck97s" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.882021 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6dca6d86-e6aa-455e-88e8-f61e829b7efd-client-ca\") pod \"route-controller-manager-6576b87f9c-vk2xm\" (UID: \"6dca6d86-e6aa-455e-88e8-f61e829b7efd\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vk2xm" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.882042 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/151c8409-e9e6-4a48-8e3d-661c0498cd86-audit-dir\") pod \"oauth-openshift-558db77b4-hpfzj\" (UID: 
\"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpfzj" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.882058 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.882070 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g74kq\" (UniqueName: \"kubernetes.io/projected/66353c4f-6d67-4155-8b97-5f27145eabdd-kube-api-access-g74kq\") pod \"apiserver-76f77b778f-2z5w6\" (UID: \"66353c4f-6d67-4155-8b97-5f27145eabdd\") " pod="openshift-apiserver/apiserver-76f77b778f-2z5w6" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.882094 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/151c8409-e9e6-4a48-8e3d-661c0498cd86-audit-policies\") pod \"oauth-openshift-558db77b4-hpfzj\" (UID: \"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpfzj" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.882119 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-hpfzj\" (UID: \"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpfzj" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.882142 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-hpfzj\" (UID: \"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpfzj" Oct 07 
08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.882171 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-hpfzj\" (UID: \"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpfzj" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.882202 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rsr4\" (UniqueName: \"kubernetes.io/projected/151c8409-e9e6-4a48-8e3d-661c0498cd86-kube-api-access-4rsr4\") pod \"oauth-openshift-558db77b4-hpfzj\" (UID: \"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpfzj" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.882230 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-hpfzj\" (UID: \"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpfzj" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.882257 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ad008aba-a6da-410f-82b5-a18f8dd3d5c7-audit-policies\") pod \"apiserver-7bbb656c7d-qpzs2\" (UID: \"ad008aba-a6da-410f-82b5-a18f8dd3d5c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qpzs2" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.882262 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.882281 5025 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/66353c4f-6d67-4155-8b97-5f27145eabdd-node-pullsecrets\") pod \"apiserver-76f77b778f-2z5w6\" (UID: \"66353c4f-6d67-4155-8b97-5f27145eabdd\") " pod="openshift-apiserver/apiserver-76f77b778f-2z5w6" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.882303 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ad008aba-a6da-410f-82b5-a18f8dd3d5c7-encryption-config\") pod \"apiserver-7bbb656c7d-qpzs2\" (UID: \"ad008aba-a6da-410f-82b5-a18f8dd3d5c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qpzs2" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.882334 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7b0a663d-08c8-4198-a334-701d58beee58-auth-proxy-config\") pod \"machine-approver-56656f9798-ck97s\" (UID: \"7b0a663d-08c8-4198-a334-701d58beee58\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ck97s" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.882355 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rz64\" (UniqueName: \"kubernetes.io/projected/7b0a663d-08c8-4198-a334-701d58beee58-kube-api-access-6rz64\") pod \"machine-approver-56656f9798-ck97s\" (UID: \"7b0a663d-08c8-4198-a334-701d58beee58\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ck97s" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.882375 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52z9c\" (UniqueName: \"kubernetes.io/projected/ad008aba-a6da-410f-82b5-a18f8dd3d5c7-kube-api-access-52z9c\") pod \"apiserver-7bbb656c7d-qpzs2\" (UID: \"ad008aba-a6da-410f-82b5-a18f8dd3d5c7\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qpzs2" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.882393 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-hpfzj\" (UID: \"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpfzj" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.882433 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ad008aba-a6da-410f-82b5-a18f8dd3d5c7-audit-dir\") pod \"apiserver-7bbb656c7d-qpzs2\" (UID: \"ad008aba-a6da-410f-82b5-a18f8dd3d5c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qpzs2" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.882455 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.883824 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.882454 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/52da690b-1ff6-4fb3-8c49-71a1fba78754-service-ca-bundle\") pod \"authentication-operator-69f744f599-hmvqm\" (UID: \"52da690b-1ff6-4fb3-8c49-71a1fba78754\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hmvqm" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.884034 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66353c4f-6d67-4155-8b97-5f27145eabdd-trusted-ca-bundle\") pod 
\"apiserver-76f77b778f-2z5w6\" (UID: \"66353c4f-6d67-4155-8b97-5f27145eabdd\") " pod="openshift-apiserver/apiserver-76f77b778f-2z5w6" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.884069 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52da690b-1ff6-4fb3-8c49-71a1fba78754-serving-cert\") pod \"authentication-operator-69f744f599-hmvqm\" (UID: \"52da690b-1ff6-4fb3-8c49-71a1fba78754\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hmvqm" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.884101 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/66353c4f-6d67-4155-8b97-5f27145eabdd-etcd-serving-ca\") pod \"apiserver-76f77b778f-2z5w6\" (UID: \"66353c4f-6d67-4155-8b97-5f27145eabdd\") " pod="openshift-apiserver/apiserver-76f77b778f-2z5w6" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.884123 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dca6d86-e6aa-455e-88e8-f61e829b7efd-config\") pod \"route-controller-manager-6576b87f9c-vk2xm\" (UID: \"6dca6d86-e6aa-455e-88e8-f61e829b7efd\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vk2xm" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.884142 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6dca6d86-e6aa-455e-88e8-f61e829b7efd-serving-cert\") pod \"route-controller-manager-6576b87f9c-vk2xm\" (UID: \"6dca6d86-e6aa-455e-88e8-f61e829b7efd\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vk2xm" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.884176 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/49ac5730-6f35-4a6b-9c54-df81528bdb81-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-lrfqf\" (UID: \"49ac5730-6f35-4a6b-9c54-df81528bdb81\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lrfqf" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.884210 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xd7q\" (UniqueName: \"kubernetes.io/projected/6dca6d86-e6aa-455e-88e8-f61e829b7efd-kube-api-access-8xd7q\") pod \"route-controller-manager-6576b87f9c-vk2xm\" (UID: \"6dca6d86-e6aa-455e-88e8-f61e829b7efd\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vk2xm" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.884244 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66353c4f-6d67-4155-8b97-5f27145eabdd-config\") pod \"apiserver-76f77b778f-2z5w6\" (UID: \"66353c4f-6d67-4155-8b97-5f27145eabdd\") " pod="openshift-apiserver/apiserver-76f77b778f-2z5w6" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.884264 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52da690b-1ff6-4fb3-8c49-71a1fba78754-config\") pod \"authentication-operator-69f744f599-hmvqm\" (UID: \"52da690b-1ff6-4fb3-8c49-71a1fba78754\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hmvqm" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.884293 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/7b0a663d-08c8-4198-a334-701d58beee58-machine-approver-tls\") pod \"machine-approver-56656f9798-ck97s\" (UID: \"7b0a663d-08c8-4198-a334-701d58beee58\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ck97s" Oct 07 
08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.884321 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ad008aba-a6da-410f-82b5-a18f8dd3d5c7-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-qpzs2\" (UID: \"ad008aba-a6da-410f-82b5-a18f8dd3d5c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qpzs2" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.884348 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-hpfzj\" (UID: \"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpfzj" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.884377 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-hpfzj\" (UID: \"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpfzj" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.884402 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/66353c4f-6d67-4155-8b97-5f27145eabdd-etcd-client\") pod \"apiserver-76f77b778f-2z5w6\" (UID: \"66353c4f-6d67-4155-8b97-5f27145eabdd\") " pod="openshift-apiserver/apiserver-76f77b778f-2z5w6" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.884427 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/66353c4f-6d67-4155-8b97-5f27145eabdd-image-import-ca\") pod \"apiserver-76f77b778f-2z5w6\" (UID: 
\"66353c4f-6d67-4155-8b97-5f27145eabdd\") " pod="openshift-apiserver/apiserver-76f77b778f-2z5w6" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.884454 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-hpfzj\" (UID: \"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpfzj" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.884485 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2jtf\" (UniqueName: \"kubernetes.io/projected/49ac5730-6f35-4a6b-9c54-df81528bdb81-kube-api-access-f2jtf\") pod \"openshift-controller-manager-operator-756b6f6bc6-lrfqf\" (UID: \"49ac5730-6f35-4a6b-9c54-df81528bdb81\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lrfqf" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.884513 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc6ec75d-41ec-4966-ab3c-03cf9f2497ce-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-n49bl\" (UID: \"cc6ec75d-41ec-4966-ab3c-03cf9f2497ce\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n49bl" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.884561 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc6ec75d-41ec-4966-ab3c-03cf9f2497ce-config\") pod \"openshift-apiserver-operator-796bbdcf4f-n49bl\" (UID: \"cc6ec75d-41ec-4966-ab3c-03cf9f2497ce\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n49bl" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.884593 5025 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/52da690b-1ff6-4fb3-8c49-71a1fba78754-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-hmvqm\" (UID: \"52da690b-1ff6-4fb3-8c49-71a1fba78754\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hmvqm" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.884641 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1e23d4b6-3f5b-4288-8753-cff09258a821-images\") pod \"machine-api-operator-5694c8668f-cxbbt\" (UID: \"1e23d4b6-3f5b-4288-8753-cff09258a821\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cxbbt" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.884679 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1e23d4b6-3f5b-4288-8753-cff09258a821-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-cxbbt\" (UID: \"1e23d4b6-3f5b-4288-8753-cff09258a821\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cxbbt" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.884709 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ad008aba-a6da-410f-82b5-a18f8dd3d5c7-etcd-client\") pod \"apiserver-7bbb656c7d-qpzs2\" (UID: \"ad008aba-a6da-410f-82b5-a18f8dd3d5c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qpzs2" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.884739 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad008aba-a6da-410f-82b5-a18f8dd3d5c7-serving-cert\") pod \"apiserver-7bbb656c7d-qpzs2\" (UID: \"ad008aba-a6da-410f-82b5-a18f8dd3d5c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qpzs2" Oct 07 08:18:53 crc 
kubenswrapper[5025]: I1007 08:18:53.884763 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gk2sl\" (UniqueName: \"kubernetes.io/projected/1e23d4b6-3f5b-4288-8753-cff09258a821-kube-api-access-gk2sl\") pod \"machine-api-operator-5694c8668f-cxbbt\" (UID: \"1e23d4b6-3f5b-4288-8753-cff09258a821\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cxbbt" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.884794 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-hpfzj\" (UID: \"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpfzj" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.884828 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/66353c4f-6d67-4155-8b97-5f27145eabdd-audit\") pod \"apiserver-76f77b778f-2z5w6\" (UID: \"66353c4f-6d67-4155-8b97-5f27145eabdd\") " pod="openshift-apiserver/apiserver-76f77b778f-2z5w6" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.884877 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e6aa9841-3d4c-4600-9763-d32c243fa4d8-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-zbpgk\" (UID: \"e6aa9841-3d4c-4600-9763-d32c243fa4d8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zbpgk" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.888279 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ad008aba-a6da-410f-82b5-a18f8dd3d5c7-audit-policies\") pod \"apiserver-7bbb656c7d-qpzs2\" (UID: 
\"ad008aba-a6da-410f-82b5-a18f8dd3d5c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qpzs2" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.889579 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-hpfzj\" (UID: \"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpfzj" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.890110 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/66353c4f-6d67-4155-8b97-5f27145eabdd-node-pullsecrets\") pod \"apiserver-76f77b778f-2z5w6\" (UID: \"66353c4f-6d67-4155-8b97-5f27145eabdd\") " pod="openshift-apiserver/apiserver-76f77b778f-2z5w6" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.890242 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/151c8409-e9e6-4a48-8e3d-661c0498cd86-audit-policies\") pod \"oauth-openshift-558db77b4-hpfzj\" (UID: \"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpfzj" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.890431 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.890721 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.890895 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.891106 5025 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.891285 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.894166 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-hpfzj\" (UID: \"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpfzj" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.894587 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8lwpw"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.894750 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-hmvqm"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.895365 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/52da690b-1ff6-4fb3-8c49-71a1fba78754-service-ca-bundle\") pod \"authentication-operator-69f744f599-hmvqm\" (UID: \"52da690b-1ff6-4fb3-8c49-71a1fba78754\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hmvqm" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.895406 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1e23d4b6-3f5b-4288-8753-cff09258a821-images\") pod \"machine-api-operator-5694c8668f-cxbbt\" (UID: \"1e23d4b6-3f5b-4288-8753-cff09258a821\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cxbbt" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.895788 5025 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/52da690b-1ff6-4fb3-8c49-71a1fba78754-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-hmvqm\" (UID: \"52da690b-1ff6-4fb3-8c49-71a1fba78754\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hmvqm" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.898983 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e6aa9841-3d4c-4600-9763-d32c243fa4d8-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-zbpgk\" (UID: \"e6aa9841-3d4c-4600-9763-d32c243fa4d8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zbpgk" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.899167 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.899333 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-hpfzj\" (UID: \"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpfzj" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.899881 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-bfxvx"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.900236 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.900672 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7b0a663d-08c8-4198-a334-701d58beee58-config\") pod \"machine-approver-56656f9798-ck97s\" (UID: \"7b0a663d-08c8-4198-a334-701d58beee58\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ck97s" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.901087 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.902431 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7b0a663d-08c8-4198-a334-701d58beee58-auth-proxy-config\") pod \"machine-approver-56656f9798-ck97s\" (UID: \"7b0a663d-08c8-4198-a334-701d58beee58\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ck97s" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.903118 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ad008aba-a6da-410f-82b5-a18f8dd3d5c7-audit-dir\") pod \"apiserver-7bbb656c7d-qpzs2\" (UID: \"ad008aba-a6da-410f-82b5-a18f8dd3d5c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qpzs2" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.905201 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/66353c4f-6d67-4155-8b97-5f27145eabdd-audit\") pod \"apiserver-76f77b778f-2z5w6\" (UID: \"66353c4f-6d67-4155-8b97-5f27145eabdd\") " pod="openshift-apiserver/apiserver-76f77b778f-2z5w6" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.905629 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/66353c4f-6d67-4155-8b97-5f27145eabdd-image-import-ca\") pod \"apiserver-76f77b778f-2z5w6\" (UID: \"66353c4f-6d67-4155-8b97-5f27145eabdd\") " pod="openshift-apiserver/apiserver-76f77b778f-2z5w6" Oct 07 08:18:53 crc 
kubenswrapper[5025]: I1007 08:18:53.906098 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66353c4f-6d67-4155-8b97-5f27145eabdd-serving-cert\") pod \"apiserver-76f77b778f-2z5w6\" (UID: \"66353c4f-6d67-4155-8b97-5f27145eabdd\") " pod="openshift-apiserver/apiserver-76f77b778f-2z5w6" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.906921 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/66353c4f-6d67-4155-8b97-5f27145eabdd-audit-dir\") pod \"apiserver-76f77b778f-2z5w6\" (UID: \"66353c4f-6d67-4155-8b97-5f27145eabdd\") " pod="openshift-apiserver/apiserver-76f77b778f-2z5w6" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.907359 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/66353c4f-6d67-4155-8b97-5f27145eabdd-encryption-config\") pod \"apiserver-76f77b778f-2z5w6\" (UID: \"66353c4f-6d67-4155-8b97-5f27145eabdd\") " pod="openshift-apiserver/apiserver-76f77b778f-2z5w6" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.907557 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc6ec75d-41ec-4966-ab3c-03cf9f2497ce-config\") pod \"openshift-apiserver-operator-796bbdcf4f-n49bl\" (UID: \"cc6ec75d-41ec-4966-ab3c-03cf9f2497ce\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n49bl" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.908124 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-hpfzj\" (UID: \"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpfzj" Oct 07 08:18:53 crc 
kubenswrapper[5025]: I1007 08:18:53.908734 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ad008aba-a6da-410f-82b5-a18f8dd3d5c7-encryption-config\") pod \"apiserver-7bbb656c7d-qpzs2\" (UID: \"ad008aba-a6da-410f-82b5-a18f8dd3d5c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qpzs2" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.910441 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e23d4b6-3f5b-4288-8753-cff09258a821-config\") pod \"machine-api-operator-5694c8668f-cxbbt\" (UID: \"1e23d4b6-3f5b-4288-8753-cff09258a821\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cxbbt" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.911124 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc6ec75d-41ec-4966-ab3c-03cf9f2497ce-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-n49bl\" (UID: \"cc6ec75d-41ec-4966-ab3c-03cf9f2497ce\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n49bl" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.918741 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dca6d86-e6aa-455e-88e8-f61e829b7efd-config\") pod \"route-controller-manager-6576b87f9c-vk2xm\" (UID: \"6dca6d86-e6aa-455e-88e8-f61e829b7efd\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vk2xm" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.937220 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52da690b-1ff6-4fb3-8c49-71a1fba78754-config\") pod \"authentication-operator-69f744f599-hmvqm\" (UID: \"52da690b-1ff6-4fb3-8c49-71a1fba78754\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-hmvqm" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.937328 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.937457 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1e23d4b6-3f5b-4288-8753-cff09258a821-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-cxbbt\" (UID: \"1e23d4b6-3f5b-4288-8753-cff09258a821\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cxbbt" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.937599 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.937942 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad008aba-a6da-410f-82b5-a18f8dd3d5c7-serving-cert\") pod \"apiserver-7bbb656c7d-qpzs2\" (UID: \"ad008aba-a6da-410f-82b5-a18f8dd3d5c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qpzs2" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.937950 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ad008aba-a6da-410f-82b5-a18f8dd3d5c7-etcd-client\") pod \"apiserver-7bbb656c7d-qpzs2\" (UID: \"ad008aba-a6da-410f-82b5-a18f8dd3d5c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qpzs2" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.938115 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.938322 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" 
(UniqueName: \"kubernetes.io/secret/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-hpfzj\" (UID: \"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpfzj" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.938758 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-hpfzj\" (UID: \"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpfzj" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.938980 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-hpfzj\" (UID: \"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpfzj" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.939087 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-hpfzj\" (UID: \"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpfzj" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.939430 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/66353c4f-6d67-4155-8b97-5f27145eabdd-etcd-client\") pod \"apiserver-76f77b778f-2z5w6\" (UID: \"66353c4f-6d67-4155-8b97-5f27145eabdd\") " pod="openshift-apiserver/apiserver-76f77b778f-2z5w6" Oct 07 08:18:53 crc 
kubenswrapper[5025]: I1007 08:18:53.939493 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-hpfzj\" (UID: \"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpfzj" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.939859 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-hpfzj\" (UID: \"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpfzj" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.940051 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.940243 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-hpfzj\" (UID: \"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpfzj" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.940569 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.940759 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.940867 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" 
(UniqueName: \"kubernetes.io/host-path/151c8409-e9e6-4a48-8e3d-661c0498cd86-audit-dir\") pod \"oauth-openshift-558db77b4-hpfzj\" (UID: \"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpfzj" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.943527 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.943645 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ad008aba-a6da-410f-82b5-a18f8dd3d5c7-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-qpzs2\" (UID: \"ad008aba-a6da-410f-82b5-a18f8dd3d5c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qpzs2" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.944292 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6dca6d86-e6aa-455e-88e8-f61e829b7efd-client-ca\") pod \"route-controller-manager-6576b87f9c-vk2xm\" (UID: \"6dca6d86-e6aa-455e-88e8-f61e829b7efd\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vk2xm" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.944455 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6dca6d86-e6aa-455e-88e8-f61e829b7efd-serving-cert\") pod \"route-controller-manager-6576b87f9c-vk2xm\" (UID: \"6dca6d86-e6aa-455e-88e8-f61e829b7efd\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vk2xm" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.945080 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-kpss8"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.945763 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-console/console-f9d7485db-hmlpq"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.945846 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-lsrzs"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.945926 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lrfqf"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.946011 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.945609 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/66353c4f-6d67-4155-8b97-5f27145eabdd-etcd-serving-ca\") pod \"apiserver-76f77b778f-2z5w6\" (UID: \"66353c4f-6d67-4155-8b97-5f27145eabdd\") " pod="openshift-apiserver/apiserver-76f77b778f-2z5w6" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.946622 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fp4pg"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.946991 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.947152 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66353c4f-6d67-4155-8b97-5f27145eabdd-config\") pod \"apiserver-76f77b778f-2z5w6\" (UID: \"66353c4f-6d67-4155-8b97-5f27145eabdd\") " pod="openshift-apiserver/apiserver-76f77b778f-2z5w6" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.947269 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Oct 07 08:18:53 crc 
kubenswrapper[5025]: I1007 08:18:53.947286 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66353c4f-6d67-4155-8b97-5f27145eabdd-trusted-ca-bundle\") pod \"apiserver-76f77b778f-2z5w6\" (UID: \"66353c4f-6d67-4155-8b97-5f27145eabdd\") " pod="openshift-apiserver/apiserver-76f77b778f-2z5w6" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.948070 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-hd4xd"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.948673 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52da690b-1ff6-4fb3-8c49-71a1fba78754-serving-cert\") pod \"authentication-operator-69f744f599-hmvqm\" (UID: \"52da690b-1ff6-4fb3-8c49-71a1fba78754\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hmvqm" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.949418 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sjgwg"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.950075 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-5m9cc"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.951101 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n49bl"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.952065 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-2ttnw"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.953087 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2r94k"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.954448 
5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-z8xqx"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.955323 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-9qr74"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.956873 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/7b0a663d-08c8-4198-a334-701d58beee58-machine-approver-tls\") pod \"machine-approver-56656f9798-ck97s\" (UID: \"7b0a663d-08c8-4198-a334-701d58beee58\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ck97s" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.957337 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-29wzh"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.957365 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-8czpq"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.957651 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-9qr74" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.957939 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8czpq" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.959337 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330415-w9ctz"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.960530 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-b6pkw"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.961634 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-x9t4l"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.963007 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wcnzt"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.963685 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ltz52"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.964688 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-xwd6l"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.965716 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xf6r2"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.966723 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qzlm7"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.968257 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8czpq"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.970267 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/marketplace-operator-79b997595-rvr8c"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.970364 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-npvk6"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.970496 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.971653 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-jb77m"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.972566 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-795lz"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.973652 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-795lz"] Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.973822 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-795lz" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.984446 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.986068 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2jtf\" (UniqueName: \"kubernetes.io/projected/49ac5730-6f35-4a6b-9c54-df81528bdb81-kube-api-access-f2jtf\") pod \"openshift-controller-manager-operator-756b6f6bc6-lrfqf\" (UID: \"49ac5730-6f35-4a6b-9c54-df81528bdb81\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lrfqf" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.986174 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49ac5730-6f35-4a6b-9c54-df81528bdb81-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-lrfqf\" (UID: \"49ac5730-6f35-4a6b-9c54-df81528bdb81\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lrfqf" Oct 07 08:18:53 crc kubenswrapper[5025]: I1007 08:18:53.986264 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49ac5730-6f35-4a6b-9c54-df81528bdb81-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-lrfqf\" (UID: \"49ac5730-6f35-4a6b-9c54-df81528bdb81\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lrfqf" Oct 07 08:18:54 crc kubenswrapper[5025]: I1007 08:18:54.005516 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Oct 07 08:18:54 crc kubenswrapper[5025]: I1007 08:18:54.025048 5025 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 07 08:18:54 crc kubenswrapper[5025]: I1007 08:18:54.046034 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 07 08:18:54 crc kubenswrapper[5025]: I1007 08:18:54.047439 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49ac5730-6f35-4a6b-9c54-df81528bdb81-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-lrfqf\" (UID: \"49ac5730-6f35-4a6b-9c54-df81528bdb81\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lrfqf" Oct 07 08:18:54 crc kubenswrapper[5025]: I1007 08:18:54.064932 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 07 08:18:54 crc kubenswrapper[5025]: I1007 08:18:54.084742 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 07 08:18:54 crc kubenswrapper[5025]: I1007 08:18:54.105593 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Oct 07 08:18:54 crc kubenswrapper[5025]: I1007 08:18:54.125861 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 07 08:18:54 crc kubenswrapper[5025]: I1007 08:18:54.130834 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49ac5730-6f35-4a6b-9c54-df81528bdb81-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-lrfqf\" (UID: \"49ac5730-6f35-4a6b-9c54-df81528bdb81\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lrfqf" Oct 07 08:18:54 crc kubenswrapper[5025]: I1007 08:18:54.150823 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 07 08:18:54 crc kubenswrapper[5025]: I1007 08:18:54.164915 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Oct 07 08:18:54 crc kubenswrapper[5025]: I1007 08:18:54.184669 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Oct 07 08:18:54 crc kubenswrapper[5025]: I1007 08:18:54.205769 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Oct 07 08:18:54 crc kubenswrapper[5025]: I1007 08:18:54.226815 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 07 08:18:54 crc kubenswrapper[5025]: I1007 08:18:54.245958 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 07 08:18:54 crc kubenswrapper[5025]: I1007 08:18:54.266031 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Oct 07 08:18:54 crc kubenswrapper[5025]: I1007 08:18:54.289366 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 07 08:18:54 crc kubenswrapper[5025]: I1007 08:18:54.306134 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Oct 07 08:18:54 crc kubenswrapper[5025]: I1007 08:18:54.325306 5025 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Oct 07 08:18:54 crc kubenswrapper[5025]: I1007 08:18:54.345420 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 07 08:18:54 crc kubenswrapper[5025]: I1007 08:18:54.366397 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Oct 07 08:18:54 crc kubenswrapper[5025]: I1007 08:18:54.385073 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 07 08:18:54 crc kubenswrapper[5025]: I1007 08:18:54.405982 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Oct 07 08:18:54 crc kubenswrapper[5025]: I1007 08:18:54.426058 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 07 08:18:54 crc kubenswrapper[5025]: I1007 08:18:54.445667 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Oct 07 08:18:54 crc kubenswrapper[5025]: I1007 08:18:54.473904 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 07 08:18:54 crc kubenswrapper[5025]: I1007 08:18:54.485098 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Oct 07 08:18:54 crc kubenswrapper[5025]: I1007 08:18:54.504841 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 07 08:18:54 crc kubenswrapper[5025]: I1007 08:18:54.526182 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Oct 07 08:18:54 crc kubenswrapper[5025]: I1007 08:18:54.545625 5025 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 07 08:18:54 crc kubenswrapper[5025]: I1007 08:18:54.565284 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Oct 07 08:18:54 crc kubenswrapper[5025]: I1007 08:18:54.585224 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Oct 07 08:18:54 crc kubenswrapper[5025]: I1007 08:18:54.606649 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Oct 07 08:18:54 crc kubenswrapper[5025]: I1007 08:18:54.625852 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Oct 07 08:18:54 crc kubenswrapper[5025]: I1007 08:18:54.645264 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 07 08:18:54 crc kubenswrapper[5025]: I1007 08:18:54.685658 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Oct 07 08:18:54 crc kubenswrapper[5025]: I1007 08:18:54.705821 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Oct 07 08:18:54 crc kubenswrapper[5025]: I1007 08:18:54.724884 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Oct 07 08:18:54 crc kubenswrapper[5025]: I1007 08:18:54.746270 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Oct 07 08:18:54 crc kubenswrapper[5025]: I1007 08:18:54.764942 5025 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-config-operator"/"machine-config-operator-images" Oct 07 08:18:54 crc kubenswrapper[5025]: I1007 08:18:54.784999 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Oct 07 08:18:54 crc kubenswrapper[5025]: I1007 08:18:54.806061 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Oct 07 08:18:54 crc kubenswrapper[5025]: I1007 08:18:54.825670 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Oct 07 08:18:54 crc kubenswrapper[5025]: I1007 08:18:54.843232 5025 request.go:700] Waited for 1.01926937s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-multus/secrets?fieldSelector=metadata.name%3Dmultus-admission-controller-secret&limit=500&resourceVersion=0 Oct 07 08:18:54 crc kubenswrapper[5025]: I1007 08:18:54.846033 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 07 08:18:54 crc kubenswrapper[5025]: I1007 08:18:54.866139 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Oct 07 08:18:54 crc kubenswrapper[5025]: I1007 08:18:54.885134 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Oct 07 08:18:54 crc kubenswrapper[5025]: I1007 08:18:54.905161 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Oct 07 08:18:54 crc kubenswrapper[5025]: I1007 08:18:54.925113 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Oct 07 08:18:54 crc kubenswrapper[5025]: 
I1007 08:18:54.945028 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Oct 07 08:18:54 crc kubenswrapper[5025]: I1007 08:18:54.965702 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Oct 07 08:18:54 crc kubenswrapper[5025]: I1007 08:18:54.985685 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.005899 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.026573 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.045648 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.065641 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.086591 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.105986 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.125116 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 07 
08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.145863 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.165503 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.186344 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.205523 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.225331 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.252884 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.265531 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.285770 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.306385 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.326042 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.344775 5025 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.366454 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.385964 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.406448 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.425432 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.446371 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.466054 5025 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.486093 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.538409 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g74kq\" (UniqueName: \"kubernetes.io/projected/66353c4f-6d67-4155-8b97-5f27145eabdd-kube-api-access-g74kq\") pod \"apiserver-76f77b778f-2z5w6\" (UID: \"66353c4f-6d67-4155-8b97-5f27145eabdd\") " pod="openshift-apiserver/apiserver-76f77b778f-2z5w6" Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.542736 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28mmg\" (UniqueName: 
\"kubernetes.io/projected/cc6ec75d-41ec-4966-ab3c-03cf9f2497ce-kube-api-access-28mmg\") pod \"openshift-apiserver-operator-796bbdcf4f-n49bl\" (UID: \"cc6ec75d-41ec-4966-ab3c-03cf9f2497ce\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n49bl" Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.564166 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rsr4\" (UniqueName: \"kubernetes.io/projected/151c8409-e9e6-4a48-8e3d-661c0498cd86-kube-api-access-4rsr4\") pod \"oauth-openshift-558db77b4-hpfzj\" (UID: \"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpfzj" Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.577328 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-2z5w6" Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.597244 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rz64\" (UniqueName: \"kubernetes.io/projected/7b0a663d-08c8-4198-a334-701d58beee58-kube-api-access-6rz64\") pod \"machine-approver-56656f9798-ck97s\" (UID: \"7b0a663d-08c8-4198-a334-701d58beee58\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ck97s" Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.617358 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52z9c\" (UniqueName: \"kubernetes.io/projected/ad008aba-a6da-410f-82b5-a18f8dd3d5c7-kube-api-access-52z9c\") pod \"apiserver-7bbb656c7d-qpzs2\" (UID: \"ad008aba-a6da-410f-82b5-a18f8dd3d5c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qpzs2" Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.624678 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8qn8\" (UniqueName: \"kubernetes.io/projected/e6aa9841-3d4c-4600-9763-d32c243fa4d8-kube-api-access-h8qn8\") pod 
\"cluster-samples-operator-665b6dd947-zbpgk\" (UID: \"e6aa9841-3d4c-4600-9763-d32c243fa4d8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zbpgk" Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.639480 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qpzs2" Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.642090 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk2sl\" (UniqueName: \"kubernetes.io/projected/1e23d4b6-3f5b-4288-8753-cff09258a821-kube-api-access-gk2sl\") pod \"machine-api-operator-5694c8668f-cxbbt\" (UID: \"1e23d4b6-3f5b-4288-8753-cff09258a821\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cxbbt" Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.654902 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-cxbbt" Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.683501 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-hpfzj" Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.689608 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtmbf\" (UniqueName: \"kubernetes.io/projected/52da690b-1ff6-4fb3-8c49-71a1fba78754-kube-api-access-vtmbf\") pod \"authentication-operator-69f744f599-hmvqm\" (UID: \"52da690b-1ff6-4fb3-8c49-71a1fba78754\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hmvqm" Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.702615 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xd7q\" (UniqueName: \"kubernetes.io/projected/6dca6d86-e6aa-455e-88e8-f61e829b7efd-kube-api-access-8xd7q\") pod \"route-controller-manager-6576b87f9c-vk2xm\" (UID: \"6dca6d86-e6aa-455e-88e8-f61e829b7efd\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vk2xm" Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.705351 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.726440 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.741806 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n49bl" Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.745328 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.765274 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-hmvqm" Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.765727 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.785513 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.802725 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ck97s" Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.804728 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.813257 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-2z5w6"] Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.825780 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 07 08:18:55 crc kubenswrapper[5025]: W1007 08:18:55.830281 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66353c4f_6d67_4155_8b97_5f27145eabdd.slice/crio-de375ec75f7dab0814d500873f363dfb8850df59605c15fa0c083aab87134811 WatchSource:0}: Error finding container de375ec75f7dab0814d500873f363dfb8850df59605c15fa0c083aab87134811: Status 404 returned error can't find the container with id de375ec75f7dab0814d500873f363dfb8850df59605c15fa0c083aab87134811 Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.844806 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.847681 5025 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zbpgk" Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.862903 5025 request.go:700] Waited for 1.888483196s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-dns/secrets?fieldSelector=metadata.name%3Ddns-dockercfg-jwfmh&limit=500&resourceVersion=0 Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.867099 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.868639 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-cxbbt"] Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.886332 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.900927 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vk2xm" Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.928119 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2jtf\" (UniqueName: \"kubernetes.io/projected/49ac5730-6f35-4a6b-9c54-df81528bdb81-kube-api-access-f2jtf\") pod \"openshift-controller-manager-operator-756b6f6bc6-lrfqf\" (UID: \"49ac5730-6f35-4a6b-9c54-df81528bdb81\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lrfqf" Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.931077 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-qpzs2"] Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.934567 5025 patch_prober.go:28] interesting pod/machine-config-daemon-2dj2t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 08:18:55 crc kubenswrapper[5025]: I1007 08:18:55.934641 5025 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 08:18:55 crc kubenswrapper[5025]: W1007 08:18:55.955694 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad008aba_a6da_410f_82b5_a18f8dd3d5c7.slice/crio-a47f53846475bf0264769b489b7ab913854463c7bc9c8bd1478e3353eb409409 WatchSource:0}: Error finding container a47f53846475bf0264769b489b7ab913854463c7bc9c8bd1478e3353eb409409: Status 404 returned error can't find the container with id 
a47f53846475bf0264769b489b7ab913854463c7bc9c8bd1478e3353eb409409 Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.002360 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n49bl"] Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.021109 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d71f73a-264b-459b-9d99-100a71204e60-trusted-ca\") pod \"ingress-operator-5b745b69d9-z8xqx\" (UID: \"9d71f73a-264b-459b-9d99-100a71204e60\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z8xqx" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.021176 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0f09299f-24a4-4d5f-8ca9-704c678b8d23-client-ca\") pod \"controller-manager-879f6c89f-2xfws\" (UID: \"0f09299f-24a4-4d5f-8ca9-704c678b8d23\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2xfws" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.021211 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2010515f-af73-42cf-a3fd-e2a76508e9e0-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-q57jq\" (UID: \"2010515f-af73-42cf-a3fd-e2a76508e9e0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q57jq" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.021258 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/16dbc3a0-2245-4de3-ab11-eb599a11f5fb-stats-auth\") pod \"router-default-5444994796-pc6rv\" (UID: \"16dbc3a0-2245-4de3-ab11-eb599a11f5fb\") " pod="openshift-ingress/router-default-5444994796-pc6rv" Oct 07 
08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.021283 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6846b045-d258-4dfe-8d50-1bc89b47389c-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-ltz52\" (UID: \"6846b045-d258-4dfe-8d50-1bc89b47389c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ltz52" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.021346 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/16df6e82-291b-4885-86a8-50276b3e7ef2-trusted-ca\") pod \"console-operator-58897d9998-2ttnw\" (UID: \"16df6e82-291b-4885-86a8-50276b3e7ef2\") " pod="openshift-console-operator/console-operator-58897d9998-2ttnw" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.021382 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d11df844-49f0-4c9a-9de5-701e57c69685-trusted-ca\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.021424 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0f09299f-24a4-4d5f-8ca9-704c678b8d23-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-2xfws\" (UID: \"0f09299f-24a4-4d5f-8ca9-704c678b8d23\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2xfws" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.021517 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhnnf\" (UniqueName: 
\"kubernetes.io/projected/9d71f73a-264b-459b-9d99-100a71204e60-kube-api-access-nhnnf\") pod \"ingress-operator-5b745b69d9-z8xqx\" (UID: \"9d71f73a-264b-459b-9d99-100a71204e60\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z8xqx" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.021570 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e63d4e4f-5fe5-43b2-a7af-8893da0ed732-config\") pod \"etcd-operator-b45778765-h4xpz\" (UID: \"e63d4e4f-5fe5-43b2-a7af-8893da0ed732\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h4xpz" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.021651 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1e02ba47-76b7-4363-8ade-a9f8c42db920-oauth-serving-cert\") pod \"console-f9d7485db-hmlpq\" (UID: \"1e02ba47-76b7-4363-8ade-a9f8c42db920\") " pod="openshift-console/console-f9d7485db-hmlpq" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.022525 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1e02ba47-76b7-4363-8ade-a9f8c42db920-console-serving-cert\") pod \"console-f9d7485db-hmlpq\" (UID: \"1e02ba47-76b7-4363-8ade-a9f8c42db920\") " pod="openshift-console/console-f9d7485db-hmlpq" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.022603 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f09299f-24a4-4d5f-8ca9-704c678b8d23-serving-cert\") pod \"controller-manager-879f6c89f-2xfws\" (UID: \"0f09299f-24a4-4d5f-8ca9-704c678b8d23\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2xfws" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.022679 5025 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxlfz\" (UniqueName: \"kubernetes.io/projected/109ab8c4-3dcd-49a9-a966-4ad68758f46a-kube-api-access-zxlfz\") pod \"control-plane-machine-set-operator-78cbb6b69f-8lwpw\" (UID: \"109ab8c4-3dcd-49a9-a966-4ad68758f46a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8lwpw" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.022791 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/109ab8c4-3dcd-49a9-a966-4ad68758f46a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-8lwpw\" (UID: \"109ab8c4-3dcd-49a9-a966-4ad68758f46a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8lwpw" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.022908 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2010515f-af73-42cf-a3fd-e2a76508e9e0-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-q57jq\" (UID: \"2010515f-af73-42cf-a3fd-e2a76508e9e0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q57jq" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.022949 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6846b045-d258-4dfe-8d50-1bc89b47389c-config\") pod \"kube-apiserver-operator-766d6c64bb-ltz52\" (UID: \"6846b045-d258-4dfe-8d50-1bc89b47389c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ltz52" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.023725 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fb9a1e63-7cc6-48ff-b5a4-7f3e54fb626d-metrics-tls\") pod \"dns-operator-744455d44c-kpss8\" (UID: \"fb9a1e63-7cc6-48ff-b5a4-7f3e54fb626d\") " pod="openshift-dns-operator/dns-operator-744455d44c-kpss8" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.023772 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kc8vs\" (UniqueName: \"kubernetes.io/projected/e63d4e4f-5fe5-43b2-a7af-8893da0ed732-kube-api-access-kc8vs\") pod \"etcd-operator-b45778765-h4xpz\" (UID: \"e63d4e4f-5fe5-43b2-a7af-8893da0ed732\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h4xpz" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.023803 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/e63d4e4f-5fe5-43b2-a7af-8893da0ed732-etcd-service-ca\") pod \"etcd-operator-b45778765-h4xpz\" (UID: \"e63d4e4f-5fe5-43b2-a7af-8893da0ed732\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h4xpz" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.023862 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e63d4e4f-5fe5-43b2-a7af-8893da0ed732-etcd-client\") pod \"etcd-operator-b45778765-h4xpz\" (UID: \"e63d4e4f-5fe5-43b2-a7af-8893da0ed732\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h4xpz" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.024720 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d11df844-49f0-4c9a-9de5-701e57c69685-registry-tls\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:18:56 crc 
kubenswrapper[5025]: I1007 08:18:56.024764 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/e63d4e4f-5fe5-43b2-a7af-8893da0ed732-etcd-ca\") pod \"etcd-operator-b45778765-h4xpz\" (UID: \"e63d4e4f-5fe5-43b2-a7af-8893da0ed732\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h4xpz" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.024859 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvrm2\" (UniqueName: \"kubernetes.io/projected/16dbc3a0-2245-4de3-ab11-eb599a11f5fb-kube-api-access-gvrm2\") pod \"router-default-5444994796-pc6rv\" (UID: \"16dbc3a0-2245-4de3-ab11-eb599a11f5fb\") " pod="openshift-ingress/router-default-5444994796-pc6rv" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.024998 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9m92\" (UniqueName: \"kubernetes.io/projected/debaf386-9a77-4774-b7fe-b439a1406621-kube-api-access-r9m92\") pod \"package-server-manager-789f6589d5-sjgwg\" (UID: \"debaf386-9a77-4774-b7fe-b439a1406621\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sjgwg" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.025038 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/16dbc3a0-2245-4de3-ab11-eb599a11f5fb-default-certificate\") pod \"router-default-5444994796-pc6rv\" (UID: \"16dbc3a0-2245-4de3-ab11-eb599a11f5fb\") " pod="openshift-ingress/router-default-5444994796-pc6rv" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.025071 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/c2bf1fe9-f8a4-4adb-a141-45bd491e0268-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vkbl6\" (UID: \"c2bf1fe9-f8a4-4adb-a141-45bd491e0268\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vkbl6" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.025151 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r89h4\" (UniqueName: \"kubernetes.io/projected/1e02ba47-76b7-4363-8ade-a9f8c42db920-kube-api-access-r89h4\") pod \"console-f9d7485db-hmlpq\" (UID: \"1e02ba47-76b7-4363-8ade-a9f8c42db920\") " pod="openshift-console/console-f9d7485db-hmlpq" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.025176 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fk4t9\" (UniqueName: \"kubernetes.io/projected/3376ab1d-924e-4d39-a393-c4f7a452845b-kube-api-access-fk4t9\") pod \"openshift-config-operator-7777fb866f-lsrzs\" (UID: \"3376ab1d-924e-4d39-a393-c4f7a452845b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lsrzs" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.025852 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/debaf386-9a77-4774-b7fe-b439a1406621-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-sjgwg\" (UID: \"debaf386-9a77-4774-b7fe-b439a1406621\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sjgwg" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.026617 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1e02ba47-76b7-4363-8ade-a9f8c42db920-console-config\") pod \"console-f9d7485db-hmlpq\" (UID: 
\"1e02ba47-76b7-4363-8ade-a9f8c42db920\") " pod="openshift-console/console-f9d7485db-hmlpq" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.026680 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3376ab1d-924e-4d39-a393-c4f7a452845b-serving-cert\") pod \"openshift-config-operator-7777fb866f-lsrzs\" (UID: \"3376ab1d-924e-4d39-a393-c4f7a452845b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lsrzs" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.026731 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7krj\" (UniqueName: \"kubernetes.io/projected/cafb7a37-dac6-45f8-9bf9-d8820d7dedaa-kube-api-access-d7krj\") pod \"machine-config-controller-84d6567774-5m9cc\" (UID: \"cafb7a37-dac6-45f8-9bf9-d8820d7dedaa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5m9cc" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.026784 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16dbc3a0-2245-4de3-ab11-eb599a11f5fb-service-ca-bundle\") pod \"router-default-5444994796-pc6rv\" (UID: \"16dbc3a0-2245-4de3-ab11-eb599a11f5fb\") " pod="openshift-ingress/router-default-5444994796-pc6rv" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.026862 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxscm\" (UniqueName: \"kubernetes.io/projected/a31617bb-030b-48d7-b312-e7af9b052143-kube-api-access-pxscm\") pod \"downloads-7954f5f757-hd4xd\" (UID: \"a31617bb-030b-48d7-b312-e7af9b052143\") " pod="openshift-console/downloads-7954f5f757-hd4xd" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.026890 5025 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3376ab1d-924e-4d39-a393-c4f7a452845b-available-featuregates\") pod \"openshift-config-operator-7777fb866f-lsrzs\" (UID: \"3376ab1d-924e-4d39-a393-c4f7a452845b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lsrzs" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.026922 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6846b045-d258-4dfe-8d50-1bc89b47389c-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-ltz52\" (UID: \"6846b045-d258-4dfe-8d50-1bc89b47389c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ltz52" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.026943 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1e02ba47-76b7-4363-8ade-a9f8c42db920-service-ca\") pod \"console-f9d7485db-hmlpq\" (UID: \"1e02ba47-76b7-4363-8ade-a9f8c42db920\") " pod="openshift-console/console-f9d7485db-hmlpq" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.026998 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pz9g\" (UniqueName: \"kubernetes.io/projected/2010515f-af73-42cf-a3fd-e2a76508e9e0-kube-api-access-8pz9g\") pod \"cluster-image-registry-operator-dc59b4c8b-q57jq\" (UID: \"2010515f-af73-42cf-a3fd-e2a76508e9e0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q57jq" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.027024 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2bf1fe9-f8a4-4adb-a141-45bd491e0268-config\") pod 
\"kube-controller-manager-operator-78b949d7b-vkbl6\" (UID: \"c2bf1fe9-f8a4-4adb-a141-45bd491e0268\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vkbl6" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.027277 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d11df844-49f0-4c9a-9de5-701e57c69685-installation-pull-secrets\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.027306 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cafb7a37-dac6-45f8-9bf9-d8820d7dedaa-proxy-tls\") pod \"machine-config-controller-84d6567774-5m9cc\" (UID: \"cafb7a37-dac6-45f8-9bf9-d8820d7dedaa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5m9cc" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.027630 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsmb2\" (UniqueName: \"kubernetes.io/projected/d11df844-49f0-4c9a-9de5-701e57c69685-kube-api-access-gsmb2\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.027712 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 
08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.027738 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1e02ba47-76b7-4363-8ade-a9f8c42db920-console-oauth-config\") pod \"console-f9d7485db-hmlpq\" (UID: \"1e02ba47-76b7-4363-8ade-a9f8c42db920\") " pod="openshift-console/console-f9d7485db-hmlpq" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.027797 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16df6e82-291b-4885-86a8-50276b3e7ef2-serving-cert\") pod \"console-operator-58897d9998-2ttnw\" (UID: \"16df6e82-291b-4885-86a8-50276b3e7ef2\") " pod="openshift-console-operator/console-operator-58897d9998-2ttnw" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.027823 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztz47\" (UniqueName: \"kubernetes.io/projected/16df6e82-291b-4885-86a8-50276b3e7ef2-kube-api-access-ztz47\") pod \"console-operator-58897d9998-2ttnw\" (UID: \"16df6e82-291b-4885-86a8-50276b3e7ef2\") " pod="openshift-console-operator/console-operator-58897d9998-2ttnw" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.027844 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e02ba47-76b7-4363-8ade-a9f8c42db920-trusted-ca-bundle\") pod \"console-f9d7485db-hmlpq\" (UID: \"1e02ba47-76b7-4363-8ade-a9f8c42db920\") " pod="openshift-console/console-f9d7485db-hmlpq" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.027863 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89lbm\" (UniqueName: \"kubernetes.io/projected/fb9a1e63-7cc6-48ff-b5a4-7f3e54fb626d-kube-api-access-89lbm\") pod 
\"dns-operator-744455d44c-kpss8\" (UID: \"fb9a1e63-7cc6-48ff-b5a4-7f3e54fb626d\") " pod="openshift-dns-operator/dns-operator-744455d44c-kpss8" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.027885 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/16dbc3a0-2245-4de3-ab11-eb599a11f5fb-metrics-certs\") pod \"router-default-5444994796-pc6rv\" (UID: \"16dbc3a0-2245-4de3-ab11-eb599a11f5fb\") " pod="openshift-ingress/router-default-5444994796-pc6rv" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.027917 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d11df844-49f0-4c9a-9de5-701e57c69685-bound-sa-token\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.027941 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16df6e82-291b-4885-86a8-50276b3e7ef2-config\") pod \"console-operator-58897d9998-2ttnw\" (UID: \"16df6e82-291b-4885-86a8-50276b3e7ef2\") " pod="openshift-console-operator/console-operator-58897d9998-2ttnw" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.027969 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2010515f-af73-42cf-a3fd-e2a76508e9e0-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-q57jq\" (UID: \"2010515f-af73-42cf-a3fd-e2a76508e9e0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q57jq" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.027990 5025 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e63d4e4f-5fe5-43b2-a7af-8893da0ed732-serving-cert\") pod \"etcd-operator-b45778765-h4xpz\" (UID: \"e63d4e4f-5fe5-43b2-a7af-8893da0ed732\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h4xpz" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.028012 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f09299f-24a4-4d5f-8ca9-704c678b8d23-config\") pod \"controller-manager-879f6c89f-2xfws\" (UID: \"0f09299f-24a4-4d5f-8ca9-704c678b8d23\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2xfws" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.028037 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9d71f73a-264b-459b-9d99-100a71204e60-bound-sa-token\") pod \"ingress-operator-5b745b69d9-z8xqx\" (UID: \"9d71f73a-264b-459b-9d99-100a71204e60\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z8xqx" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.028061 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d11df844-49f0-4c9a-9de5-701e57c69685-registry-certificates\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.028094 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fbrj\" (UniqueName: \"kubernetes.io/projected/0f09299f-24a4-4d5f-8ca9-704c678b8d23-kube-api-access-9fbrj\") pod \"controller-manager-879f6c89f-2xfws\" (UID: 
\"0f09299f-24a4-4d5f-8ca9-704c678b8d23\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2xfws" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.028120 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d11df844-49f0-4c9a-9de5-701e57c69685-ca-trust-extracted\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.028149 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cafb7a37-dac6-45f8-9bf9-d8820d7dedaa-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-5m9cc\" (UID: \"cafb7a37-dac6-45f8-9bf9-d8820d7dedaa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5m9cc" Oct 07 08:18:56 crc kubenswrapper[5025]: E1007 08:18:56.028181 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 08:18:56.528165097 +0000 UTC m=+143.337479241 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2r94k" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.029348 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9d71f73a-264b-459b-9d99-100a71204e60-metrics-tls\") pod \"ingress-operator-5b745b69d9-z8xqx\" (UID: \"9d71f73a-264b-459b-9d99-100a71204e60\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z8xqx" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.029390 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2bf1fe9-f8a4-4adb-a141-45bd491e0268-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vkbl6\" (UID: \"c2bf1fe9-f8a4-4adb-a141-45bd491e0268\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vkbl6" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.034179 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-hmvqm"] Oct 07 08:18:56 crc kubenswrapper[5025]: W1007 08:18:56.071961 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc6ec75d_41ec_4966_ab3c_03cf9f2497ce.slice/crio-db4db9027179c86ee1fee5222948ee0a8c161c06e2919c325458fc473da3237e WatchSource:0}: Error finding container db4db9027179c86ee1fee5222948ee0a8c161c06e2919c325458fc473da3237e: Status 404 returned error 
can't find the container with id db4db9027179c86ee1fee5222948ee0a8c161c06e2919c325458fc473da3237e Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.116676 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zbpgk"] Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.130355 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.130522 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e63d4e4f-5fe5-43b2-a7af-8893da0ed732-etcd-client\") pod \"etcd-operator-b45778765-h4xpz\" (UID: \"e63d4e4f-5fe5-43b2-a7af-8893da0ed732\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h4xpz" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.130591 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b3110e75-2479-4c2b-b96b-0f41a1f10cec-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rvr8c\" (UID: \"b3110e75-2479-4c2b-b96b-0f41a1f10cec\") " pod="openshift-marketplace/marketplace-operator-79b997595-rvr8c" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.130636 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6a3782b5-7407-485b-a426-58413aba5747-apiservice-cert\") pod \"packageserver-d55dfcdfc-qzlm7\" (UID: \"6a3782b5-7407-485b-a426-58413aba5747\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qzlm7" Oct 07 
08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.130671 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvrm2\" (UniqueName: \"kubernetes.io/projected/16dbc3a0-2245-4de3-ab11-eb599a11f5fb-kube-api-access-gvrm2\") pod \"router-default-5444994796-pc6rv\" (UID: \"16dbc3a0-2245-4de3-ab11-eb599a11f5fb\") " pod="openshift-ingress/router-default-5444994796-pc6rv" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.130693 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d11df844-49f0-4c9a-9de5-701e57c69685-registry-tls\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.130713 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/e63d4e4f-5fe5-43b2-a7af-8893da0ed732-etcd-ca\") pod \"etcd-operator-b45778765-h4xpz\" (UID: \"e63d4e4f-5fe5-43b2-a7af-8893da0ed732\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h4xpz" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.130739 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26h62\" (UniqueName: \"kubernetes.io/projected/6a140c8d-40f7-462f-8ffd-e1ea2fa4a1e2-kube-api-access-26h62\") pod \"catalog-operator-68c6474976-wcnzt\" (UID: \"6a140c8d-40f7-462f-8ffd-e1ea2fa4a1e2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wcnzt" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.130760 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25634103-cd17-4005-8c8e-575d58e0826e-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xf6r2\" (UID: 
\"25634103-cd17-4005-8c8e-575d58e0826e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xf6r2" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.130779 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j9tr\" (UniqueName: \"kubernetes.io/projected/94b36961-ca5f-4fac-a744-118761859a72-kube-api-access-8j9tr\") pod \"ingress-canary-8czpq\" (UID: \"94b36961-ca5f-4fac-a744-118761859a72\") " pod="openshift-ingress-canary/ingress-canary-8czpq" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.130802 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9m92\" (UniqueName: \"kubernetes.io/projected/debaf386-9a77-4774-b7fe-b439a1406621-kube-api-access-r9m92\") pod \"package-server-manager-789f6589d5-sjgwg\" (UID: \"debaf386-9a77-4774-b7fe-b439a1406621\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sjgwg" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.130826 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/16dbc3a0-2245-4de3-ab11-eb599a11f5fb-default-certificate\") pod \"router-default-5444994796-pc6rv\" (UID: \"16dbc3a0-2245-4de3-ab11-eb599a11f5fb\") " pod="openshift-ingress/router-default-5444994796-pc6rv" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.130851 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/784ef07a-3cbd-46f4-9ce5-90c1a6d75edc-plugins-dir\") pod \"csi-hostpathplugin-jb77m\" (UID: \"784ef07a-3cbd-46f4-9ce5-90c1a6d75edc\") " pod="hostpath-provisioner/csi-hostpathplugin-jb77m" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.130872 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/c2bf1fe9-f8a4-4adb-a141-45bd491e0268-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vkbl6\" (UID: \"c2bf1fe9-f8a4-4adb-a141-45bd491e0268\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vkbl6" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.130920 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r89h4\" (UniqueName: \"kubernetes.io/projected/1e02ba47-76b7-4363-8ade-a9f8c42db920-kube-api-access-r89h4\") pod \"console-f9d7485db-hmlpq\" (UID: \"1e02ba47-76b7-4363-8ade-a9f8c42db920\") " pod="openshift-console/console-f9d7485db-hmlpq" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.130941 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fk4t9\" (UniqueName: \"kubernetes.io/projected/3376ab1d-924e-4d39-a393-c4f7a452845b-kube-api-access-fk4t9\") pod \"openshift-config-operator-7777fb866f-lsrzs\" (UID: \"3376ab1d-924e-4d39-a393-c4f7a452845b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lsrzs" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.130964 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/debaf386-9a77-4774-b7fe-b439a1406621-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-sjgwg\" (UID: \"debaf386-9a77-4774-b7fe-b439a1406621\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sjgwg" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.130988 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fb37bc8-4d3f-438c-850d-3b800b786f95-config\") pod \"service-ca-operator-777779d784-b6pkw\" (UID: \"9fb37bc8-4d3f-438c-850d-3b800b786f95\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-b6pkw" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.131011 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp4ds\" (UniqueName: \"kubernetes.io/projected/4ea4607c-59be-4461-8fa4-9e22b87036f8-kube-api-access-sp4ds\") pod \"kube-storage-version-migrator-operator-b67b599dd-29wzh\" (UID: \"4ea4607c-59be-4461-8fa4-9e22b87036f8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-29wzh" Oct 07 08:18:56 crc kubenswrapper[5025]: E1007 08:18:56.131034 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:18:56.63101087 +0000 UTC m=+143.440325084 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.131080 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1e02ba47-76b7-4363-8ade-a9f8c42db920-console-config\") pod \"console-f9d7485db-hmlpq\" (UID: \"1e02ba47-76b7-4363-8ade-a9f8c42db920\") " pod="openshift-console/console-f9d7485db-hmlpq" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.131117 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" 
(UniqueName: \"kubernetes.io/secret/77dd254d-69a1-4f60-9587-e59f9094626a-signing-key\") pod \"service-ca-9c57cc56f-npvk6\" (UID: \"77dd254d-69a1-4f60-9587-e59f9094626a\") " pod="openshift-service-ca/service-ca-9c57cc56f-npvk6" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.131140 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl4m9\" (UniqueName: \"kubernetes.io/projected/52a8a68e-3997-4206-b8df-3650699f2ed7-kube-api-access-zl4m9\") pod \"machine-config-server-9qr74\" (UID: \"52a8a68e-3997-4206-b8df-3650699f2ed7\") " pod="openshift-machine-config-operator/machine-config-server-9qr74" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.131165 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3376ab1d-924e-4d39-a393-c4f7a452845b-serving-cert\") pod \"openshift-config-operator-7777fb866f-lsrzs\" (UID: \"3376ab1d-924e-4d39-a393-c4f7a452845b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lsrzs" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.131184 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk54x\" (UniqueName: \"kubernetes.io/projected/6a3782b5-7407-485b-a426-58413aba5747-kube-api-access-rk54x\") pod \"packageserver-d55dfcdfc-qzlm7\" (UID: \"6a3782b5-7407-485b-a426-58413aba5747\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qzlm7" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.131204 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/05686c29-75d0-4321-8136-39f8600afac1-srv-cert\") pod \"olm-operator-6b444d44fb-fp4pg\" (UID: \"05686c29-75d0-4321-8136-39f8600afac1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fp4pg" Oct 07 08:18:56 crc 
kubenswrapper[5025]: I1007 08:18:56.131225 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16dbc3a0-2245-4de3-ab11-eb599a11f5fb-service-ca-bundle\") pod \"router-default-5444994796-pc6rv\" (UID: \"16dbc3a0-2245-4de3-ab11-eb599a11f5fb\") " pod="openshift-ingress/router-default-5444994796-pc6rv" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.131261 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7krj\" (UniqueName: \"kubernetes.io/projected/cafb7a37-dac6-45f8-9bf9-d8820d7dedaa-kube-api-access-d7krj\") pod \"machine-config-controller-84d6567774-5m9cc\" (UID: \"cafb7a37-dac6-45f8-9bf9-d8820d7dedaa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5m9cc" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.131282 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxscm\" (UniqueName: \"kubernetes.io/projected/a31617bb-030b-48d7-b312-e7af9b052143-kube-api-access-pxscm\") pod \"downloads-7954f5f757-hd4xd\" (UID: \"a31617bb-030b-48d7-b312-e7af9b052143\") " pod="openshift-console/downloads-7954f5f757-hd4xd" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.131306 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/784ef07a-3cbd-46f4-9ce5-90c1a6d75edc-registration-dir\") pod \"csi-hostpathplugin-jb77m\" (UID: \"784ef07a-3cbd-46f4-9ce5-90c1a6d75edc\") " pod="hostpath-provisioner/csi-hostpathplugin-jb77m" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.131329 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25634103-cd17-4005-8c8e-575d58e0826e-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xf6r2\" (UID: 
\"25634103-cd17-4005-8c8e-575d58e0826e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xf6r2" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.131354 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3376ab1d-924e-4d39-a393-c4f7a452845b-available-featuregates\") pod \"openshift-config-operator-7777fb866f-lsrzs\" (UID: \"3376ab1d-924e-4d39-a393-c4f7a452845b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lsrzs" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.131377 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6846b045-d258-4dfe-8d50-1bc89b47389c-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-ltz52\" (UID: \"6846b045-d258-4dfe-8d50-1bc89b47389c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ltz52" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.131394 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1e02ba47-76b7-4363-8ade-a9f8c42db920-service-ca\") pod \"console-f9d7485db-hmlpq\" (UID: \"1e02ba47-76b7-4363-8ade-a9f8c42db920\") " pod="openshift-console/console-f9d7485db-hmlpq" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.131410 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/702301a4-cf98-4786-8649-d6e396c775a4-config-volume\") pod \"dns-default-795lz\" (UID: \"702301a4-cf98-4786-8649-d6e396c775a4\") " pod="openshift-dns/dns-default-795lz" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.131444 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pz9g\" (UniqueName: 
\"kubernetes.io/projected/2010515f-af73-42cf-a3fd-e2a76508e9e0-kube-api-access-8pz9g\") pod \"cluster-image-registry-operator-dc59b4c8b-q57jq\" (UID: \"2010515f-af73-42cf-a3fd-e2a76508e9e0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q57jq" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.131466 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2bf1fe9-f8a4-4adb-a141-45bd491e0268-config\") pod \"kube-controller-manager-operator-78b949d7b-vkbl6\" (UID: \"c2bf1fe9-f8a4-4adb-a141-45bd491e0268\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vkbl6" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.131491 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d11df844-49f0-4c9a-9de5-701e57c69685-installation-pull-secrets\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.131511 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cafb7a37-dac6-45f8-9bf9-d8820d7dedaa-proxy-tls\") pod \"machine-config-controller-84d6567774-5m9cc\" (UID: \"cafb7a37-dac6-45f8-9bf9-d8820d7dedaa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5m9cc" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.131621 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsmb2\" (UniqueName: \"kubernetes.io/projected/d11df844-49f0-4c9a-9de5-701e57c69685-kube-api-access-gsmb2\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" 
Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.131644 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6a140c8d-40f7-462f-8ffd-e1ea2fa4a1e2-profile-collector-cert\") pod \"catalog-operator-68c6474976-wcnzt\" (UID: \"6a140c8d-40f7-462f-8ffd-e1ea2fa4a1e2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wcnzt" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.131663 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/99b101e0-bca4-40fe-942f-7ab5d06be284-images\") pod \"machine-config-operator-74547568cd-xwd6l\" (UID: \"99b101e0-bca4-40fe-942f-7ab5d06be284\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xwd6l" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.131709 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.131734 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1e02ba47-76b7-4363-8ade-a9f8c42db920-console-oauth-config\") pod \"console-f9d7485db-hmlpq\" (UID: \"1e02ba47-76b7-4363-8ade-a9f8c42db920\") " pod="openshift-console/console-f9d7485db-hmlpq" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.131756 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: 
\"kubernetes.io/host-path/784ef07a-3cbd-46f4-9ce5-90c1a6d75edc-csi-data-dir\") pod \"csi-hostpathplugin-jb77m\" (UID: \"784ef07a-3cbd-46f4-9ce5-90c1a6d75edc\") " pod="hostpath-provisioner/csi-hostpathplugin-jb77m" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.131792 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16df6e82-291b-4885-86a8-50276b3e7ef2-serving-cert\") pod \"console-operator-58897d9998-2ttnw\" (UID: \"16df6e82-291b-4885-86a8-50276b3e7ef2\") " pod="openshift-console-operator/console-operator-58897d9998-2ttnw" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.131796 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/e63d4e4f-5fe5-43b2-a7af-8893da0ed732-etcd-ca\") pod \"etcd-operator-b45778765-h4xpz\" (UID: \"e63d4e4f-5fe5-43b2-a7af-8893da0ed732\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h4xpz" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.131825 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw5xc\" (UniqueName: \"kubernetes.io/projected/385983c4-874d-4095-980c-c2d3763ce8e1-kube-api-access-mw5xc\") pod \"collect-profiles-29330415-w9ctz\" (UID: \"385983c4-874d-4095-980c-c2d3763ce8e1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330415-w9ctz" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.131850 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e02ba47-76b7-4363-8ade-a9f8c42db920-trusted-ca-bundle\") pod \"console-f9d7485db-hmlpq\" (UID: \"1e02ba47-76b7-4363-8ade-a9f8c42db920\") " pod="openshift-console/console-f9d7485db-hmlpq" Oct 07 08:18:56 crc kubenswrapper[5025]: E1007 08:18:56.132183 5025 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 08:18:56.632174018 +0000 UTC m=+143.441488162 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2r94k" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.132907 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/385983c4-874d-4095-980c-c2d3763ce8e1-secret-volume\") pod \"collect-profiles-29330415-w9ctz\" (UID: \"385983c4-874d-4095-980c-c2d3763ce8e1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330415-w9ctz" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.132979 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztz47\" (UniqueName: \"kubernetes.io/projected/16df6e82-291b-4885-86a8-50276b3e7ef2-kube-api-access-ztz47\") pod \"console-operator-58897d9998-2ttnw\" (UID: \"16df6e82-291b-4885-86a8-50276b3e7ef2\") " pod="openshift-console-operator/console-operator-58897d9998-2ttnw" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.133003 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89lbm\" (UniqueName: \"kubernetes.io/projected/fb9a1e63-7cc6-48ff-b5a4-7f3e54fb626d-kube-api-access-89lbm\") pod \"dns-operator-744455d44c-kpss8\" (UID: \"fb9a1e63-7cc6-48ff-b5a4-7f3e54fb626d\") " pod="openshift-dns-operator/dns-operator-744455d44c-kpss8" Oct 07 
08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.133022 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/16dbc3a0-2245-4de3-ab11-eb599a11f5fb-metrics-certs\") pod \"router-default-5444994796-pc6rv\" (UID: \"16dbc3a0-2245-4de3-ab11-eb599a11f5fb\") " pod="openshift-ingress/router-default-5444994796-pc6rv" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.133045 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kmxl\" (UniqueName: \"kubernetes.io/projected/77dd254d-69a1-4f60-9587-e59f9094626a-kube-api-access-2kmxl\") pod \"service-ca-9c57cc56f-npvk6\" (UID: \"77dd254d-69a1-4f60-9587-e59f9094626a\") " pod="openshift-service-ca/service-ca-9c57cc56f-npvk6" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.133792 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2bf1fe9-f8a4-4adb-a141-45bd491e0268-config\") pod \"kube-controller-manager-operator-78b949d7b-vkbl6\" (UID: \"c2bf1fe9-f8a4-4adb-a141-45bd491e0268\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vkbl6" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.134707 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16dbc3a0-2245-4de3-ab11-eb599a11f5fb-service-ca-bundle\") pod \"router-default-5444994796-pc6rv\" (UID: \"16dbc3a0-2245-4de3-ab11-eb599a11f5fb\") " pod="openshift-ingress/router-default-5444994796-pc6rv" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.134710 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e02ba47-76b7-4363-8ade-a9f8c42db920-trusted-ca-bundle\") pod \"console-f9d7485db-hmlpq\" (UID: \"1e02ba47-76b7-4363-8ade-a9f8c42db920\") " 
pod="openshift-console/console-f9d7485db-hmlpq" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.135149 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1e02ba47-76b7-4363-8ade-a9f8c42db920-console-config\") pod \"console-f9d7485db-hmlpq\" (UID: \"1e02ba47-76b7-4363-8ade-a9f8c42db920\") " pod="openshift-console/console-f9d7485db-hmlpq" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.135561 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ea4607c-59be-4461-8fa4-9e22b87036f8-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-29wzh\" (UID: \"4ea4607c-59be-4461-8fa4-9e22b87036f8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-29wzh" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.137224 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1e02ba47-76b7-4363-8ade-a9f8c42db920-service-ca\") pod \"console-f9d7485db-hmlpq\" (UID: \"1e02ba47-76b7-4363-8ade-a9f8c42db920\") " pod="openshift-console/console-f9d7485db-hmlpq" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.137796 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e63d4e4f-5fe5-43b2-a7af-8893da0ed732-etcd-client\") pod \"etcd-operator-b45778765-h4xpz\" (UID: \"e63d4e4f-5fe5-43b2-a7af-8893da0ed732\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h4xpz" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.138128 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d11df844-49f0-4c9a-9de5-701e57c69685-installation-pull-secrets\") pod \"image-registry-697d97f7c8-2r94k\" (UID: 
\"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.138607 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3376ab1d-924e-4d39-a393-c4f7a452845b-available-featuregates\") pod \"openshift-config-operator-7777fb866f-lsrzs\" (UID: \"3376ab1d-924e-4d39-a393-c4f7a452845b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lsrzs" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.135600 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/25634103-cd17-4005-8c8e-575d58e0826e-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xf6r2\" (UID: \"25634103-cd17-4005-8c8e-575d58e0826e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xf6r2" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.139316 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/52a8a68e-3997-4206-b8df-3650699f2ed7-certs\") pod \"machine-config-server-9qr74\" (UID: \"52a8a68e-3997-4206-b8df-3650699f2ed7\") " pod="openshift-machine-config-operator/machine-config-server-9qr74" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.139371 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d11df844-49f0-4c9a-9de5-701e57c69685-bound-sa-token\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.139392 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/16df6e82-291b-4885-86a8-50276b3e7ef2-config\") pod \"console-operator-58897d9998-2ttnw\" (UID: \"16df6e82-291b-4885-86a8-50276b3e7ef2\") " pod="openshift-console-operator/console-operator-58897d9998-2ttnw" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.140232 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16df6e82-291b-4885-86a8-50276b3e7ef2-config\") pod \"console-operator-58897d9998-2ttnw\" (UID: \"16df6e82-291b-4885-86a8-50276b3e7ef2\") " pod="openshift-console-operator/console-operator-58897d9998-2ttnw" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.139409 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f09299f-24a4-4d5f-8ca9-704c678b8d23-config\") pod \"controller-manager-879f6c89f-2xfws\" (UID: \"0f09299f-24a4-4d5f-8ca9-704c678b8d23\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2xfws" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.140332 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6a140c8d-40f7-462f-8ffd-e1ea2fa4a1e2-srv-cert\") pod \"catalog-operator-68c6474976-wcnzt\" (UID: \"6a140c8d-40f7-462f-8ffd-e1ea2fa4a1e2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wcnzt" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.140351 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/99b101e0-bca4-40fe-942f-7ab5d06be284-auth-proxy-config\") pod \"machine-config-operator-74547568cd-xwd6l\" (UID: \"99b101e0-bca4-40fe-942f-7ab5d06be284\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xwd6l" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.140375 
5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2010515f-af73-42cf-a3fd-e2a76508e9e0-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-q57jq\" (UID: \"2010515f-af73-42cf-a3fd-e2a76508e9e0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q57jq" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.140395 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e63d4e4f-5fe5-43b2-a7af-8893da0ed732-serving-cert\") pod \"etcd-operator-b45778765-h4xpz\" (UID: \"e63d4e4f-5fe5-43b2-a7af-8893da0ed732\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h4xpz" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.140413 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9d71f73a-264b-459b-9d99-100a71204e60-bound-sa-token\") pod \"ingress-operator-5b745b69d9-z8xqx\" (UID: \"9d71f73a-264b-459b-9d99-100a71204e60\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z8xqx" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.140429 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6a3782b5-7407-485b-a426-58413aba5747-webhook-cert\") pod \"packageserver-d55dfcdfc-qzlm7\" (UID: \"6a3782b5-7407-485b-a426-58413aba5747\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qzlm7" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.140459 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d11df844-49f0-4c9a-9de5-701e57c69685-registry-certificates\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.140477 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fbrj\" (UniqueName: \"kubernetes.io/projected/0f09299f-24a4-4d5f-8ca9-704c678b8d23-kube-api-access-9fbrj\") pod \"controller-manager-879f6c89f-2xfws\" (UID: \"0f09299f-24a4-4d5f-8ca9-704c678b8d23\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2xfws" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.140497 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cafb7a37-dac6-45f8-9bf9-d8820d7dedaa-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-5m9cc\" (UID: \"cafb7a37-dac6-45f8-9bf9-d8820d7dedaa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5m9cc" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.140513 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/702301a4-cf98-4786-8649-d6e396c775a4-metrics-tls\") pod \"dns-default-795lz\" (UID: \"702301a4-cf98-4786-8649-d6e396c775a4\") " pod="openshift-dns/dns-default-795lz" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.140560 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d11df844-49f0-4c9a-9de5-701e57c69685-ca-trust-extracted\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.140579 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlxzq\" (UniqueName: 
\"kubernetes.io/projected/9ae035c8-618a-4a3b-afbc-e8173548cb32-kube-api-access-hlxzq\") pod \"multus-admission-controller-857f4d67dd-bfxvx\" (UID: \"9ae035c8-618a-4a3b-afbc-e8173548cb32\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-bfxvx" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.140601 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9fb37bc8-4d3f-438c-850d-3b800b786f95-serving-cert\") pod \"service-ca-operator-777779d784-b6pkw\" (UID: \"9fb37bc8-4d3f-438c-850d-3b800b786f95\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b6pkw" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.140634 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9d71f73a-264b-459b-9d99-100a71204e60-metrics-tls\") pod \"ingress-operator-5b745b69d9-z8xqx\" (UID: \"9d71f73a-264b-459b-9d99-100a71204e60\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z8xqx" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.140653 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2bf1fe9-f8a4-4adb-a141-45bd491e0268-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vkbl6\" (UID: \"c2bf1fe9-f8a4-4adb-a141-45bd491e0268\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vkbl6" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.140675 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddzh8\" (UniqueName: \"kubernetes.io/projected/e8ebca0a-96ab-452a-a93e-7041967c40d1-kube-api-access-ddzh8\") pod \"migrator-59844c95c7-x9t4l\" (UID: \"e8ebca0a-96ab-452a-a93e-7041967c40d1\") " 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-x9t4l" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.140701 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp8mj\" (UniqueName: \"kubernetes.io/projected/784ef07a-3cbd-46f4-9ce5-90c1a6d75edc-kube-api-access-zp8mj\") pod \"csi-hostpathplugin-jb77m\" (UID: \"784ef07a-3cbd-46f4-9ce5-90c1a6d75edc\") " pod="hostpath-provisioner/csi-hostpathplugin-jb77m" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.140718 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d71f73a-264b-459b-9d99-100a71204e60-trusted-ca\") pod \"ingress-operator-5b745b69d9-z8xqx\" (UID: \"9d71f73a-264b-459b-9d99-100a71204e60\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z8xqx" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.140733 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0f09299f-24a4-4d5f-8ca9-704c678b8d23-client-ca\") pod \"controller-manager-879f6c89f-2xfws\" (UID: \"0f09299f-24a4-4d5f-8ca9-704c678b8d23\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2xfws" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.140760 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2010515f-af73-42cf-a3fd-e2a76508e9e0-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-q57jq\" (UID: \"2010515f-af73-42cf-a3fd-e2a76508e9e0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q57jq" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.140787 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: 
\"kubernetes.io/secret/16dbc3a0-2245-4de3-ab11-eb599a11f5fb-stats-auth\") pod \"router-default-5444994796-pc6rv\" (UID: \"16dbc3a0-2245-4de3-ab11-eb599a11f5fb\") " pod="openshift-ingress/router-default-5444994796-pc6rv" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.140804 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/784ef07a-3cbd-46f4-9ce5-90c1a6d75edc-mountpoint-dir\") pod \"csi-hostpathplugin-jb77m\" (UID: \"784ef07a-3cbd-46f4-9ce5-90c1a6d75edc\") " pod="hostpath-provisioner/csi-hostpathplugin-jb77m" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.140822 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6846b045-d258-4dfe-8d50-1bc89b47389c-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-ltz52\" (UID: \"6846b045-d258-4dfe-8d50-1bc89b47389c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ltz52" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.140860 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/16df6e82-291b-4885-86a8-50276b3e7ef2-trusted-ca\") pod \"console-operator-58897d9998-2ttnw\" (UID: \"16df6e82-291b-4885-86a8-50276b3e7ef2\") " pod="openshift-console-operator/console-operator-58897d9998-2ttnw" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.140878 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/6a3782b5-7407-485b-a426-58413aba5747-tmpfs\") pod \"packageserver-d55dfcdfc-qzlm7\" (UID: \"6a3782b5-7407-485b-a426-58413aba5747\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qzlm7" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.140896 5025 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5smrm\" (UniqueName: \"kubernetes.io/projected/9fb37bc8-4d3f-438c-850d-3b800b786f95-kube-api-access-5smrm\") pod \"service-ca-operator-777779d784-b6pkw\" (UID: \"9fb37bc8-4d3f-438c-850d-3b800b786f95\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b6pkw" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.140910 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94b36961-ca5f-4fac-a744-118761859a72-cert\") pod \"ingress-canary-8czpq\" (UID: \"94b36961-ca5f-4fac-a744-118761859a72\") " pod="openshift-ingress-canary/ingress-canary-8czpq" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.140928 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ea4607c-59be-4461-8fa4-9e22b87036f8-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-29wzh\" (UID: \"4ea4607c-59be-4461-8fa4-9e22b87036f8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-29wzh" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.140944 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d11df844-49f0-4c9a-9de5-701e57c69685-trusted-ca\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.140960 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0f09299f-24a4-4d5f-8ca9-704c678b8d23-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-2xfws\" (UID: \"0f09299f-24a4-4d5f-8ca9-704c678b8d23\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-2xfws" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.140993 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e63d4e4f-5fe5-43b2-a7af-8893da0ed732-config\") pod \"etcd-operator-b45778765-h4xpz\" (UID: \"e63d4e4f-5fe5-43b2-a7af-8893da0ed732\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h4xpz" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.141009 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr4lp\" (UniqueName: \"kubernetes.io/projected/05686c29-75d0-4321-8136-39f8600afac1-kube-api-access-zr4lp\") pod \"olm-operator-6b444d44fb-fp4pg\" (UID: \"05686c29-75d0-4321-8136-39f8600afac1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fp4pg" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.141024 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/52a8a68e-3997-4206-b8df-3650699f2ed7-node-bootstrap-token\") pod \"machine-config-server-9qr74\" (UID: \"52a8a68e-3997-4206-b8df-3650699f2ed7\") " pod="openshift-machine-config-operator/machine-config-server-9qr74" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.141050 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhnnf\" (UniqueName: \"kubernetes.io/projected/9d71f73a-264b-459b-9d99-100a71204e60-kube-api-access-nhnnf\") pod \"ingress-operator-5b745b69d9-z8xqx\" (UID: \"9d71f73a-264b-459b-9d99-100a71204e60\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z8xqx" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.141076 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/1e02ba47-76b7-4363-8ade-a9f8c42db920-oauth-serving-cert\") pod \"console-f9d7485db-hmlpq\" (UID: \"1e02ba47-76b7-4363-8ade-a9f8c42db920\") " pod="openshift-console/console-f9d7485db-hmlpq" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.141092 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/77dd254d-69a1-4f60-9587-e59f9094626a-signing-cabundle\") pod \"service-ca-9c57cc56f-npvk6\" (UID: \"77dd254d-69a1-4f60-9587-e59f9094626a\") " pod="openshift-service-ca/service-ca-9c57cc56f-npvk6" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.141109 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b3110e75-2479-4c2b-b96b-0f41a1f10cec-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rvr8c\" (UID: \"b3110e75-2479-4c2b-b96b-0f41a1f10cec\") " pod="openshift-marketplace/marketplace-operator-79b997595-rvr8c" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.141134 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9ae035c8-618a-4a3b-afbc-e8173548cb32-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-bfxvx\" (UID: \"9ae035c8-618a-4a3b-afbc-e8173548cb32\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-bfxvx" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.141151 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1e02ba47-76b7-4363-8ade-a9f8c42db920-console-serving-cert\") pod \"console-f9d7485db-hmlpq\" (UID: \"1e02ba47-76b7-4363-8ade-a9f8c42db920\") " pod="openshift-console/console-f9d7485db-hmlpq" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.141169 5025 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f09299f-24a4-4d5f-8ca9-704c678b8d23-serving-cert\") pod \"controller-manager-879f6c89f-2xfws\" (UID: \"0f09299f-24a4-4d5f-8ca9-704c678b8d23\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2xfws" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.141197 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9z4ft\" (UniqueName: \"kubernetes.io/projected/99b101e0-bca4-40fe-942f-7ab5d06be284-kube-api-access-9z4ft\") pod \"machine-config-operator-74547568cd-xwd6l\" (UID: \"99b101e0-bca4-40fe-942f-7ab5d06be284\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xwd6l" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.141222 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxlfz\" (UniqueName: \"kubernetes.io/projected/109ab8c4-3dcd-49a9-a966-4ad68758f46a-kube-api-access-zxlfz\") pod \"control-plane-machine-set-operator-78cbb6b69f-8lwpw\" (UID: \"109ab8c4-3dcd-49a9-a966-4ad68758f46a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8lwpw" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.141239 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm4wb\" (UniqueName: \"kubernetes.io/projected/b3110e75-2479-4c2b-b96b-0f41a1f10cec-kube-api-access-gm4wb\") pod \"marketplace-operator-79b997595-rvr8c\" (UID: \"b3110e75-2479-4c2b-b96b-0f41a1f10cec\") " pod="openshift-marketplace/marketplace-operator-79b997595-rvr8c" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.141260 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/109ab8c4-3dcd-49a9-a966-4ad68758f46a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-8lwpw\" (UID: \"109ab8c4-3dcd-49a9-a966-4ad68758f46a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8lwpw" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.141279 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2010515f-af73-42cf-a3fd-e2a76508e9e0-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-q57jq\" (UID: \"2010515f-af73-42cf-a3fd-e2a76508e9e0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q57jq" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.141294 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99b101e0-bca4-40fe-942f-7ab5d06be284-proxy-tls\") pod \"machine-config-operator-74547568cd-xwd6l\" (UID: \"99b101e0-bca4-40fe-942f-7ab5d06be284\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xwd6l" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.141321 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6846b045-d258-4dfe-8d50-1bc89b47389c-config\") pod \"kube-apiserver-operator-766d6c64bb-ltz52\" (UID: \"6846b045-d258-4dfe-8d50-1bc89b47389c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ltz52" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.141339 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cafb7a37-dac6-45f8-9bf9-d8820d7dedaa-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-5m9cc\" (UID: \"cafb7a37-dac6-45f8-9bf9-d8820d7dedaa\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5m9cc" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.141346 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/385983c4-874d-4095-980c-c2d3763ce8e1-config-volume\") pod \"collect-profiles-29330415-w9ctz\" (UID: \"385983c4-874d-4095-980c-c2d3763ce8e1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330415-w9ctz" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.141651 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d11df844-49f0-4c9a-9de5-701e57c69685-ca-trust-extracted\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.141683 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2010515f-af73-42cf-a3fd-e2a76508e9e0-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-q57jq\" (UID: \"2010515f-af73-42cf-a3fd-e2a76508e9e0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q57jq" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.142333 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-hpfzj"] Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.146206 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f09299f-24a4-4d5f-8ca9-704c678b8d23-config\") pod \"controller-manager-879f6c89f-2xfws\" (UID: \"0f09299f-24a4-4d5f-8ca9-704c678b8d23\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2xfws" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.146446 
5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e63d4e4f-5fe5-43b2-a7af-8893da0ed732-config\") pod \"etcd-operator-b45778765-h4xpz\" (UID: \"e63d4e4f-5fe5-43b2-a7af-8893da0ed732\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h4xpz" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.146999 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/16dbc3a0-2245-4de3-ab11-eb599a11f5fb-metrics-certs\") pod \"router-default-5444994796-pc6rv\" (UID: \"16dbc3a0-2245-4de3-ab11-eb599a11f5fb\") " pod="openshift-ingress/router-default-5444994796-pc6rv" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.147185 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lrfqf" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.147337 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f09299f-24a4-4d5f-8ca9-704c678b8d23-serving-cert\") pod \"controller-manager-879f6c89f-2xfws\" (UID: \"0f09299f-24a4-4d5f-8ca9-704c678b8d23\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2xfws" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.147612 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16df6e82-291b-4885-86a8-50276b3e7ef2-serving-cert\") pod \"console-operator-58897d9998-2ttnw\" (UID: \"16df6e82-291b-4885-86a8-50276b3e7ef2\") " pod="openshift-console-operator/console-operator-58897d9998-2ttnw" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.147693 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/d11df844-49f0-4c9a-9de5-701e57c69685-registry-certificates\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.147810 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6846b045-d258-4dfe-8d50-1bc89b47389c-config\") pod \"kube-apiserver-operator-766d6c64bb-ltz52\" (UID: \"6846b045-d258-4dfe-8d50-1bc89b47389c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ltz52" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.148630 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d11df844-49f0-4c9a-9de5-701e57c69685-trusted-ca\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.148657 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0f09299f-24a4-4d5f-8ca9-704c678b8d23-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-2xfws\" (UID: \"0f09299f-24a4-4d5f-8ca9-704c678b8d23\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2xfws" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.148686 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/784ef07a-3cbd-46f4-9ce5-90c1a6d75edc-socket-dir\") pod \"csi-hostpathplugin-jb77m\" (UID: \"784ef07a-3cbd-46f4-9ce5-90c1a6d75edc\") " pod="hostpath-provisioner/csi-hostpathplugin-jb77m" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.148710 5025 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g764n\" (UniqueName: \"kubernetes.io/projected/702301a4-cf98-4786-8649-d6e396c775a4-kube-api-access-g764n\") pod \"dns-default-795lz\" (UID: \"702301a4-cf98-4786-8649-d6e396c775a4\") " pod="openshift-dns/dns-default-795lz" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.149101 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0f09299f-24a4-4d5f-8ca9-704c678b8d23-client-ca\") pod \"controller-manager-879f6c89f-2xfws\" (UID: \"0f09299f-24a4-4d5f-8ca9-704c678b8d23\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2xfws" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.149352 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/16dbc3a0-2245-4de3-ab11-eb599a11f5fb-default-certificate\") pod \"router-default-5444994796-pc6rv\" (UID: \"16dbc3a0-2245-4de3-ab11-eb599a11f5fb\") " pod="openshift-ingress/router-default-5444994796-pc6rv" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.149683 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1e02ba47-76b7-4363-8ade-a9f8c42db920-console-serving-cert\") pod \"console-f9d7485db-hmlpq\" (UID: \"1e02ba47-76b7-4363-8ade-a9f8c42db920\") " pod="openshift-console/console-f9d7485db-hmlpq" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.150158 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/16df6e82-291b-4885-86a8-50276b3e7ef2-trusted-ca\") pod \"console-operator-58897d9998-2ttnw\" (UID: \"16df6e82-291b-4885-86a8-50276b3e7ef2\") " pod="openshift-console-operator/console-operator-58897d9998-2ttnw" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.150303 5025 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d71f73a-264b-459b-9d99-100a71204e60-trusted-ca\") pod \"ingress-operator-5b745b69d9-z8xqx\" (UID: \"9d71f73a-264b-459b-9d99-100a71204e60\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z8xqx" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.150972 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2bf1fe9-f8a4-4adb-a141-45bd491e0268-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vkbl6\" (UID: \"c2bf1fe9-f8a4-4adb-a141-45bd491e0268\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vkbl6" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.151065 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/debaf386-9a77-4774-b7fe-b439a1406621-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-sjgwg\" (UID: \"debaf386-9a77-4774-b7fe-b439a1406621\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sjgwg" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.151168 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fb9a1e63-7cc6-48ff-b5a4-7f3e54fb626d-metrics-tls\") pod \"dns-operator-744455d44c-kpss8\" (UID: \"fb9a1e63-7cc6-48ff-b5a4-7f3e54fb626d\") " pod="openshift-dns-operator/dns-operator-744455d44c-kpss8" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.151216 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kc8vs\" (UniqueName: \"kubernetes.io/projected/e63d4e4f-5fe5-43b2-a7af-8893da0ed732-kube-api-access-kc8vs\") pod \"etcd-operator-b45778765-h4xpz\" (UID: \"e63d4e4f-5fe5-43b2-a7af-8893da0ed732\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-h4xpz" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.151405 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/e63d4e4f-5fe5-43b2-a7af-8893da0ed732-etcd-service-ca\") pod \"etcd-operator-b45778765-h4xpz\" (UID: \"e63d4e4f-5fe5-43b2-a7af-8893da0ed732\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h4xpz" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.151568 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/05686c29-75d0-4321-8136-39f8600afac1-profile-collector-cert\") pod \"olm-operator-6b444d44fb-fp4pg\" (UID: \"05686c29-75d0-4321-8136-39f8600afac1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fp4pg" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.151952 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/e63d4e4f-5fe5-43b2-a7af-8893da0ed732-etcd-service-ca\") pod \"etcd-operator-b45778765-h4xpz\" (UID: \"e63d4e4f-5fe5-43b2-a7af-8893da0ed732\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h4xpz" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.152194 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1e02ba47-76b7-4363-8ade-a9f8c42db920-oauth-serving-cert\") pod \"console-f9d7485db-hmlpq\" (UID: \"1e02ba47-76b7-4363-8ade-a9f8c42db920\") " pod="openshift-console/console-f9d7485db-hmlpq" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.152456 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/16dbc3a0-2245-4de3-ab11-eb599a11f5fb-stats-auth\") pod \"router-default-5444994796-pc6rv\" (UID: 
\"16dbc3a0-2245-4de3-ab11-eb599a11f5fb\") " pod="openshift-ingress/router-default-5444994796-pc6rv" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.152629 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9d71f73a-264b-459b-9d99-100a71204e60-metrics-tls\") pod \"ingress-operator-5b745b69d9-z8xqx\" (UID: \"9d71f73a-264b-459b-9d99-100a71204e60\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z8xqx" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.153052 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d11df844-49f0-4c9a-9de5-701e57c69685-registry-tls\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.155700 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fb9a1e63-7cc6-48ff-b5a4-7f3e54fb626d-metrics-tls\") pod \"dns-operator-744455d44c-kpss8\" (UID: \"fb9a1e63-7cc6-48ff-b5a4-7f3e54fb626d\") " pod="openshift-dns-operator/dns-operator-744455d44c-kpss8" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.156671 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3376ab1d-924e-4d39-a393-c4f7a452845b-serving-cert\") pod \"openshift-config-operator-7777fb866f-lsrzs\" (UID: \"3376ab1d-924e-4d39-a393-c4f7a452845b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lsrzs" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.156714 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e63d4e4f-5fe5-43b2-a7af-8893da0ed732-serving-cert\") pod \"etcd-operator-b45778765-h4xpz\" (UID: 
\"e63d4e4f-5fe5-43b2-a7af-8893da0ed732\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h4xpz" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.158034 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1e02ba47-76b7-4363-8ade-a9f8c42db920-console-oauth-config\") pod \"console-f9d7485db-hmlpq\" (UID: \"1e02ba47-76b7-4363-8ade-a9f8c42db920\") " pod="openshift-console/console-f9d7485db-hmlpq" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.158655 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6846b045-d258-4dfe-8d50-1bc89b47389c-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-ltz52\" (UID: \"6846b045-d258-4dfe-8d50-1bc89b47389c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ltz52" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.165559 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cafb7a37-dac6-45f8-9bf9-d8820d7dedaa-proxy-tls\") pod \"machine-config-controller-84d6567774-5m9cc\" (UID: \"cafb7a37-dac6-45f8-9bf9-d8820d7dedaa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5m9cc" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.167665 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/109ab8c4-3dcd-49a9-a966-4ad68758f46a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-8lwpw\" (UID: \"109ab8c4-3dcd-49a9-a966-4ad68758f46a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8lwpw" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.168622 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/2010515f-af73-42cf-a3fd-e2a76508e9e0-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-q57jq\" (UID: \"2010515f-af73-42cf-a3fd-e2a76508e9e0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q57jq" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.181038 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvrm2\" (UniqueName: \"kubernetes.io/projected/16dbc3a0-2245-4de3-ab11-eb599a11f5fb-kube-api-access-gvrm2\") pod \"router-default-5444994796-pc6rv\" (UID: \"16dbc3a0-2245-4de3-ab11-eb599a11f5fb\") " pod="openshift-ingress/router-default-5444994796-pc6rv" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.199447 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vk2xm"] Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.203710 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9m92\" (UniqueName: \"kubernetes.io/projected/debaf386-9a77-4774-b7fe-b439a1406621-kube-api-access-r9m92\") pod \"package-server-manager-789f6589d5-sjgwg\" (UID: \"debaf386-9a77-4774-b7fe-b439a1406621\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sjgwg" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.223021 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r89h4\" (UniqueName: \"kubernetes.io/projected/1e02ba47-76b7-4363-8ade-a9f8c42db920-kube-api-access-r89h4\") pod \"console-f9d7485db-hmlpq\" (UID: \"1e02ba47-76b7-4363-8ade-a9f8c42db920\") " pod="openshift-console/console-f9d7485db-hmlpq" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.248216 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c2bf1fe9-f8a4-4adb-a141-45bd491e0268-kube-api-access\") pod 
\"kube-controller-manager-operator-78b949d7b-vkbl6\" (UID: \"c2bf1fe9-f8a4-4adb-a141-45bd491e0268\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vkbl6" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.252529 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.252769 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kmxl\" (UniqueName: \"kubernetes.io/projected/77dd254d-69a1-4f60-9587-e59f9094626a-kube-api-access-2kmxl\") pod \"service-ca-9c57cc56f-npvk6\" (UID: \"77dd254d-69a1-4f60-9587-e59f9094626a\") " pod="openshift-service-ca/service-ca-9c57cc56f-npvk6" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.252794 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ea4607c-59be-4461-8fa4-9e22b87036f8-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-29wzh\" (UID: \"4ea4607c-59be-4461-8fa4-9e22b87036f8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-29wzh" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.252816 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/25634103-cd17-4005-8c8e-575d58e0826e-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xf6r2\" (UID: \"25634103-cd17-4005-8c8e-575d58e0826e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xf6r2" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 
08:18:56.252832 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/52a8a68e-3997-4206-b8df-3650699f2ed7-certs\") pod \"machine-config-server-9qr74\" (UID: \"52a8a68e-3997-4206-b8df-3650699f2ed7\") " pod="openshift-machine-config-operator/machine-config-server-9qr74" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.252851 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6a140c8d-40f7-462f-8ffd-e1ea2fa4a1e2-srv-cert\") pod \"catalog-operator-68c6474976-wcnzt\" (UID: \"6a140c8d-40f7-462f-8ffd-e1ea2fa4a1e2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wcnzt" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.252867 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/99b101e0-bca4-40fe-942f-7ab5d06be284-auth-proxy-config\") pod \"machine-config-operator-74547568cd-xwd6l\" (UID: \"99b101e0-bca4-40fe-942f-7ab5d06be284\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xwd6l" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.252882 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6a3782b5-7407-485b-a426-58413aba5747-webhook-cert\") pod \"packageserver-d55dfcdfc-qzlm7\" (UID: \"6a3782b5-7407-485b-a426-58413aba5747\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qzlm7" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.252910 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/702301a4-cf98-4786-8649-d6e396c775a4-metrics-tls\") pod \"dns-default-795lz\" (UID: \"702301a4-cf98-4786-8649-d6e396c775a4\") " pod="openshift-dns/dns-default-795lz" Oct 07 08:18:56 crc 
kubenswrapper[5025]: I1007 08:18:56.252949 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlxzq\" (UniqueName: \"kubernetes.io/projected/9ae035c8-618a-4a3b-afbc-e8173548cb32-kube-api-access-hlxzq\") pod \"multus-admission-controller-857f4d67dd-bfxvx\" (UID: \"9ae035c8-618a-4a3b-afbc-e8173548cb32\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-bfxvx" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.252967 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9fb37bc8-4d3f-438c-850d-3b800b786f95-serving-cert\") pod \"service-ca-operator-777779d784-b6pkw\" (UID: \"9fb37bc8-4d3f-438c-850d-3b800b786f95\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b6pkw" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.252984 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddzh8\" (UniqueName: \"kubernetes.io/projected/e8ebca0a-96ab-452a-a93e-7041967c40d1-kube-api-access-ddzh8\") pod \"migrator-59844c95c7-x9t4l\" (UID: \"e8ebca0a-96ab-452a-a93e-7041967c40d1\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-x9t4l" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.253003 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zp8mj\" (UniqueName: \"kubernetes.io/projected/784ef07a-3cbd-46f4-9ce5-90c1a6d75edc-kube-api-access-zp8mj\") pod \"csi-hostpathplugin-jb77m\" (UID: \"784ef07a-3cbd-46f4-9ce5-90c1a6d75edc\") " pod="hostpath-provisioner/csi-hostpathplugin-jb77m" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.253027 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/784ef07a-3cbd-46f4-9ce5-90c1a6d75edc-mountpoint-dir\") pod \"csi-hostpathplugin-jb77m\" (UID: 
\"784ef07a-3cbd-46f4-9ce5-90c1a6d75edc\") " pod="hostpath-provisioner/csi-hostpathplugin-jb77m" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.253043 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/6a3782b5-7407-485b-a426-58413aba5747-tmpfs\") pod \"packageserver-d55dfcdfc-qzlm7\" (UID: \"6a3782b5-7407-485b-a426-58413aba5747\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qzlm7" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.253060 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5smrm\" (UniqueName: \"kubernetes.io/projected/9fb37bc8-4d3f-438c-850d-3b800b786f95-kube-api-access-5smrm\") pod \"service-ca-operator-777779d784-b6pkw\" (UID: \"9fb37bc8-4d3f-438c-850d-3b800b786f95\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b6pkw" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.253074 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94b36961-ca5f-4fac-a744-118761859a72-cert\") pod \"ingress-canary-8czpq\" (UID: \"94b36961-ca5f-4fac-a744-118761859a72\") " pod="openshift-ingress-canary/ingress-canary-8czpq" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.253092 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ea4607c-59be-4461-8fa4-9e22b87036f8-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-29wzh\" (UID: \"4ea4607c-59be-4461-8fa4-9e22b87036f8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-29wzh" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.253135 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zr4lp\" (UniqueName: 
\"kubernetes.io/projected/05686c29-75d0-4321-8136-39f8600afac1-kube-api-access-zr4lp\") pod \"olm-operator-6b444d44fb-fp4pg\" (UID: \"05686c29-75d0-4321-8136-39f8600afac1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fp4pg" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.253150 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/52a8a68e-3997-4206-b8df-3650699f2ed7-node-bootstrap-token\") pod \"machine-config-server-9qr74\" (UID: \"52a8a68e-3997-4206-b8df-3650699f2ed7\") " pod="openshift-machine-config-operator/machine-config-server-9qr74" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.253166 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/77dd254d-69a1-4f60-9587-e59f9094626a-signing-cabundle\") pod \"service-ca-9c57cc56f-npvk6\" (UID: \"77dd254d-69a1-4f60-9587-e59f9094626a\") " pod="openshift-service-ca/service-ca-9c57cc56f-npvk6" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.253184 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b3110e75-2479-4c2b-b96b-0f41a1f10cec-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rvr8c\" (UID: \"b3110e75-2479-4c2b-b96b-0f41a1f10cec\") " pod="openshift-marketplace/marketplace-operator-79b997595-rvr8c" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.253208 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9ae035c8-618a-4a3b-afbc-e8173548cb32-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-bfxvx\" (UID: \"9ae035c8-618a-4a3b-afbc-e8173548cb32\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-bfxvx" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.253231 5025 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9z4ft\" (UniqueName: \"kubernetes.io/projected/99b101e0-bca4-40fe-942f-7ab5d06be284-kube-api-access-9z4ft\") pod \"machine-config-operator-74547568cd-xwd6l\" (UID: \"99b101e0-bca4-40fe-942f-7ab5d06be284\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xwd6l" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.253256 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gm4wb\" (UniqueName: \"kubernetes.io/projected/b3110e75-2479-4c2b-b96b-0f41a1f10cec-kube-api-access-gm4wb\") pod \"marketplace-operator-79b997595-rvr8c\" (UID: \"b3110e75-2479-4c2b-b96b-0f41a1f10cec\") " pod="openshift-marketplace/marketplace-operator-79b997595-rvr8c" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.253278 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99b101e0-bca4-40fe-942f-7ab5d06be284-proxy-tls\") pod \"machine-config-operator-74547568cd-xwd6l\" (UID: \"99b101e0-bca4-40fe-942f-7ab5d06be284\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xwd6l" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.253293 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/385983c4-874d-4095-980c-c2d3763ce8e1-config-volume\") pod \"collect-profiles-29330415-w9ctz\" (UID: \"385983c4-874d-4095-980c-c2d3763ce8e1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330415-w9ctz" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.253309 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/784ef07a-3cbd-46f4-9ce5-90c1a6d75edc-socket-dir\") pod \"csi-hostpathplugin-jb77m\" (UID: \"784ef07a-3cbd-46f4-9ce5-90c1a6d75edc\") " 
pod="hostpath-provisioner/csi-hostpathplugin-jb77m" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.253323 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g764n\" (UniqueName: \"kubernetes.io/projected/702301a4-cf98-4786-8649-d6e396c775a4-kube-api-access-g764n\") pod \"dns-default-795lz\" (UID: \"702301a4-cf98-4786-8649-d6e396c775a4\") " pod="openshift-dns/dns-default-795lz" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.253345 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/05686c29-75d0-4321-8136-39f8600afac1-profile-collector-cert\") pod \"olm-operator-6b444d44fb-fp4pg\" (UID: \"05686c29-75d0-4321-8136-39f8600afac1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fp4pg" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.253370 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6a3782b5-7407-485b-a426-58413aba5747-apiservice-cert\") pod \"packageserver-d55dfcdfc-qzlm7\" (UID: \"6a3782b5-7407-485b-a426-58413aba5747\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qzlm7" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.253402 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b3110e75-2479-4c2b-b96b-0f41a1f10cec-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rvr8c\" (UID: \"b3110e75-2479-4c2b-b96b-0f41a1f10cec\") " pod="openshift-marketplace/marketplace-operator-79b997595-rvr8c" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.253421 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25634103-cd17-4005-8c8e-575d58e0826e-serving-cert\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-xf6r2\" (UID: \"25634103-cd17-4005-8c8e-575d58e0826e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xf6r2" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.253436 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26h62\" (UniqueName: \"kubernetes.io/projected/6a140c8d-40f7-462f-8ffd-e1ea2fa4a1e2-kube-api-access-26h62\") pod \"catalog-operator-68c6474976-wcnzt\" (UID: \"6a140c8d-40f7-462f-8ffd-e1ea2fa4a1e2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wcnzt" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.253454 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8j9tr\" (UniqueName: \"kubernetes.io/projected/94b36961-ca5f-4fac-a744-118761859a72-kube-api-access-8j9tr\") pod \"ingress-canary-8czpq\" (UID: \"94b36961-ca5f-4fac-a744-118761859a72\") " pod="openshift-ingress-canary/ingress-canary-8czpq" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.253473 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/784ef07a-3cbd-46f4-9ce5-90c1a6d75edc-plugins-dir\") pod \"csi-hostpathplugin-jb77m\" (UID: \"784ef07a-3cbd-46f4-9ce5-90c1a6d75edc\") " pod="hostpath-provisioner/csi-hostpathplugin-jb77m" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.253495 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sp4ds\" (UniqueName: \"kubernetes.io/projected/4ea4607c-59be-4461-8fa4-9e22b87036f8-kube-api-access-sp4ds\") pod \"kube-storage-version-migrator-operator-b67b599dd-29wzh\" (UID: \"4ea4607c-59be-4461-8fa4-9e22b87036f8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-29wzh" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.253511 5025 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fb37bc8-4d3f-438c-850d-3b800b786f95-config\") pod \"service-ca-operator-777779d784-b6pkw\" (UID: \"9fb37bc8-4d3f-438c-850d-3b800b786f95\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b6pkw" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.253528 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/77dd254d-69a1-4f60-9587-e59f9094626a-signing-key\") pod \"service-ca-9c57cc56f-npvk6\" (UID: \"77dd254d-69a1-4f60-9587-e59f9094626a\") " pod="openshift-service-ca/service-ca-9c57cc56f-npvk6" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.253569 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl4m9\" (UniqueName: \"kubernetes.io/projected/52a8a68e-3997-4206-b8df-3650699f2ed7-kube-api-access-zl4m9\") pod \"machine-config-server-9qr74\" (UID: \"52a8a68e-3997-4206-b8df-3650699f2ed7\") " pod="openshift-machine-config-operator/machine-config-server-9qr74" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.253588 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/05686c29-75d0-4321-8136-39f8600afac1-srv-cert\") pod \"olm-operator-6b444d44fb-fp4pg\" (UID: \"05686c29-75d0-4321-8136-39f8600afac1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fp4pg" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.253609 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk54x\" (UniqueName: \"kubernetes.io/projected/6a3782b5-7407-485b-a426-58413aba5747-kube-api-access-rk54x\") pod \"packageserver-d55dfcdfc-qzlm7\" (UID: \"6a3782b5-7407-485b-a426-58413aba5747\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qzlm7" Oct 07 
08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.253654 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/784ef07a-3cbd-46f4-9ce5-90c1a6d75edc-registration-dir\") pod \"csi-hostpathplugin-jb77m\" (UID: \"784ef07a-3cbd-46f4-9ce5-90c1a6d75edc\") " pod="hostpath-provisioner/csi-hostpathplugin-jb77m" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.253675 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25634103-cd17-4005-8c8e-575d58e0826e-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xf6r2\" (UID: \"25634103-cd17-4005-8c8e-575d58e0826e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xf6r2" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.255531 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/702301a4-cf98-4786-8649-d6e396c775a4-config-volume\") pod \"dns-default-795lz\" (UID: \"702301a4-cf98-4786-8649-d6e396c775a4\") " pod="openshift-dns/dns-default-795lz" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.255671 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6a140c8d-40f7-462f-8ffd-e1ea2fa4a1e2-profile-collector-cert\") pod \"catalog-operator-68c6474976-wcnzt\" (UID: \"6a140c8d-40f7-462f-8ffd-e1ea2fa4a1e2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wcnzt" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.255734 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/99b101e0-bca4-40fe-942f-7ab5d06be284-images\") pod \"machine-config-operator-74547568cd-xwd6l\" (UID: \"99b101e0-bca4-40fe-942f-7ab5d06be284\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xwd6l" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.255768 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/784ef07a-3cbd-46f4-9ce5-90c1a6d75edc-csi-data-dir\") pod \"csi-hostpathplugin-jb77m\" (UID: \"784ef07a-3cbd-46f4-9ce5-90c1a6d75edc\") " pod="hostpath-provisioner/csi-hostpathplugin-jb77m" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.255813 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mw5xc\" (UniqueName: \"kubernetes.io/projected/385983c4-874d-4095-980c-c2d3763ce8e1-kube-api-access-mw5xc\") pod \"collect-profiles-29330415-w9ctz\" (UID: \"385983c4-874d-4095-980c-c2d3763ce8e1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330415-w9ctz" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.255834 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/385983c4-874d-4095-980c-c2d3763ce8e1-secret-volume\") pod \"collect-profiles-29330415-w9ctz\" (UID: \"385983c4-874d-4095-980c-c2d3763ce8e1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330415-w9ctz" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.257688 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b3110e75-2479-4c2b-b96b-0f41a1f10cec-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rvr8c\" (UID: \"b3110e75-2479-4c2b-b96b-0f41a1f10cec\") " pod="openshift-marketplace/marketplace-operator-79b997595-rvr8c" Oct 07 08:18:56 crc kubenswrapper[5025]: E1007 08:18:56.257859 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:18:56.757835377 +0000 UTC m=+143.567149521 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.259816 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/784ef07a-3cbd-46f4-9ce5-90c1a6d75edc-plugins-dir\") pod \"csi-hostpathplugin-jb77m\" (UID: \"784ef07a-3cbd-46f4-9ce5-90c1a6d75edc\") " pod="hostpath-provisioner/csi-hostpathplugin-jb77m" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.259936 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/784ef07a-3cbd-46f4-9ce5-90c1a6d75edc-registration-dir\") pod \"csi-hostpathplugin-jb77m\" (UID: \"784ef07a-3cbd-46f4-9ce5-90c1a6d75edc\") " pod="hostpath-provisioner/csi-hostpathplugin-jb77m" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.268867 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ea4607c-59be-4461-8fa4-9e22b87036f8-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-29wzh\" (UID: \"4ea4607c-59be-4461-8fa4-9e22b87036f8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-29wzh" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.269584 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/9fb37bc8-4d3f-438c-850d-3b800b786f95-config\") pod \"service-ca-operator-777779d784-b6pkw\" (UID: \"9fb37bc8-4d3f-438c-850d-3b800b786f95\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b6pkw" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.270994 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/77dd254d-69a1-4f60-9587-e59f9094626a-signing-cabundle\") pod \"service-ca-9c57cc56f-npvk6\" (UID: \"77dd254d-69a1-4f60-9587-e59f9094626a\") " pod="openshift-service-ca/service-ca-9c57cc56f-npvk6" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.271789 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/05686c29-75d0-4321-8136-39f8600afac1-profile-collector-cert\") pod \"olm-operator-6b444d44fb-fp4pg\" (UID: \"05686c29-75d0-4321-8136-39f8600afac1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fp4pg" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.272846 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/52a8a68e-3997-4206-b8df-3650699f2ed7-certs\") pod \"machine-config-server-9qr74\" (UID: \"52a8a68e-3997-4206-b8df-3650699f2ed7\") " pod="openshift-machine-config-operator/machine-config-server-9qr74" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.273579 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6a140c8d-40f7-462f-8ffd-e1ea2fa4a1e2-srv-cert\") pod \"catalog-operator-68c6474976-wcnzt\" (UID: \"6a140c8d-40f7-462f-8ffd-e1ea2fa4a1e2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wcnzt" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.274391 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/25634103-cd17-4005-8c8e-575d58e0826e-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xf6r2\" (UID: \"25634103-cd17-4005-8c8e-575d58e0826e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xf6r2" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.274505 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6a3782b5-7407-485b-a426-58413aba5747-webhook-cert\") pod \"packageserver-d55dfcdfc-qzlm7\" (UID: \"6a3782b5-7407-485b-a426-58413aba5747\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qzlm7" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.274578 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/99b101e0-bca4-40fe-942f-7ab5d06be284-auth-proxy-config\") pod \"machine-config-operator-74547568cd-xwd6l\" (UID: \"99b101e0-bca4-40fe-942f-7ab5d06be284\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xwd6l" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.274953 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ea4607c-59be-4461-8fa4-9e22b87036f8-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-29wzh\" (UID: \"4ea4607c-59be-4461-8fa4-9e22b87036f8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-29wzh" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.275110 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/385983c4-874d-4095-980c-c2d3763ce8e1-secret-volume\") pod \"collect-profiles-29330415-w9ctz\" (UID: \"385983c4-874d-4095-980c-c2d3763ce8e1\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29330415-w9ctz" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.275171 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/702301a4-cf98-4786-8649-d6e396c775a4-config-volume\") pod \"dns-default-795lz\" (UID: \"702301a4-cf98-4786-8649-d6e396c775a4\") " pod="openshift-dns/dns-default-795lz" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.276067 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/702301a4-cf98-4786-8649-d6e396c775a4-metrics-tls\") pod \"dns-default-795lz\" (UID: \"702301a4-cf98-4786-8649-d6e396c775a4\") " pod="openshift-dns/dns-default-795lz" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.276125 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxscm\" (UniqueName: \"kubernetes.io/projected/a31617bb-030b-48d7-b312-e7af9b052143-kube-api-access-pxscm\") pod \"downloads-7954f5f757-hd4xd\" (UID: \"a31617bb-030b-48d7-b312-e7af9b052143\") " pod="openshift-console/downloads-7954f5f757-hd4xd" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.276921 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6a3782b5-7407-485b-a426-58413aba5747-apiservice-cert\") pod \"packageserver-d55dfcdfc-qzlm7\" (UID: \"6a3782b5-7407-485b-a426-58413aba5747\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qzlm7" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.277325 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9ae035c8-618a-4a3b-afbc-e8173548cb32-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-bfxvx\" (UID: \"9ae035c8-618a-4a3b-afbc-e8173548cb32\") " 
pod="openshift-multus/multus-admission-controller-857f4d67dd-bfxvx" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.277674 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6a140c8d-40f7-462f-8ffd-e1ea2fa4a1e2-profile-collector-cert\") pod \"catalog-operator-68c6474976-wcnzt\" (UID: \"6a140c8d-40f7-462f-8ffd-e1ea2fa4a1e2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wcnzt" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.277848 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/77dd254d-69a1-4f60-9587-e59f9094626a-signing-key\") pod \"service-ca-9c57cc56f-npvk6\" (UID: \"77dd254d-69a1-4f60-9587-e59f9094626a\") " pod="openshift-service-ca/service-ca-9c57cc56f-npvk6" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.278172 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/6a3782b5-7407-485b-a426-58413aba5747-tmpfs\") pod \"packageserver-d55dfcdfc-qzlm7\" (UID: \"6a3782b5-7407-485b-a426-58413aba5747\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qzlm7" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.278212 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/99b101e0-bca4-40fe-942f-7ab5d06be284-images\") pod \"machine-config-operator-74547568cd-xwd6l\" (UID: \"99b101e0-bca4-40fe-942f-7ab5d06be284\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xwd6l" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.278318 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/784ef07a-3cbd-46f4-9ce5-90c1a6d75edc-csi-data-dir\") pod \"csi-hostpathplugin-jb77m\" (UID: 
\"784ef07a-3cbd-46f4-9ce5-90c1a6d75edc\") " pod="hostpath-provisioner/csi-hostpathplugin-jb77m" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.278573 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/52a8a68e-3997-4206-b8df-3650699f2ed7-node-bootstrap-token\") pod \"machine-config-server-9qr74\" (UID: \"52a8a68e-3997-4206-b8df-3650699f2ed7\") " pod="openshift-machine-config-operator/machine-config-server-9qr74" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.281777 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/784ef07a-3cbd-46f4-9ce5-90c1a6d75edc-socket-dir\") pod \"csi-hostpathplugin-jb77m\" (UID: \"784ef07a-3cbd-46f4-9ce5-90c1a6d75edc\") " pod="hostpath-provisioner/csi-hostpathplugin-jb77m" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.282649 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/784ef07a-3cbd-46f4-9ce5-90c1a6d75edc-mountpoint-dir\") pod \"csi-hostpathplugin-jb77m\" (UID: \"784ef07a-3cbd-46f4-9ce5-90c1a6d75edc\") " pod="hostpath-provisioner/csi-hostpathplugin-jb77m" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.283595 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/385983c4-874d-4095-980c-c2d3763ce8e1-config-volume\") pod \"collect-profiles-29330415-w9ctz\" (UID: \"385983c4-874d-4095-980c-c2d3763ce8e1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330415-w9ctz" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.285745 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94b36961-ca5f-4fac-a744-118761859a72-cert\") pod \"ingress-canary-8czpq\" (UID: \"94b36961-ca5f-4fac-a744-118761859a72\") " 
pod="openshift-ingress-canary/ingress-canary-8czpq" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.287652 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9fb37bc8-4d3f-438c-850d-3b800b786f95-serving-cert\") pod \"service-ca-operator-777779d784-b6pkw\" (UID: \"9fb37bc8-4d3f-438c-850d-3b800b786f95\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b6pkw" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.295585 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99b101e0-bca4-40fe-942f-7ab5d06be284-proxy-tls\") pod \"machine-config-operator-74547568cd-xwd6l\" (UID: \"99b101e0-bca4-40fe-942f-7ab5d06be284\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xwd6l" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.296061 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/05686c29-75d0-4321-8136-39f8600afac1-srv-cert\") pod \"olm-operator-6b444d44fb-fp4pg\" (UID: \"05686c29-75d0-4321-8136-39f8600afac1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fp4pg" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.296276 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25634103-cd17-4005-8c8e-575d58e0826e-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xf6r2\" (UID: \"25634103-cd17-4005-8c8e-575d58e0826e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xf6r2" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.301054 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7krj\" (UniqueName: \"kubernetes.io/projected/cafb7a37-dac6-45f8-9bf9-d8820d7dedaa-kube-api-access-d7krj\") pod 
\"machine-config-controller-84d6567774-5m9cc\" (UID: \"cafb7a37-dac6-45f8-9bf9-d8820d7dedaa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5m9cc" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.301988 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b3110e75-2479-4c2b-b96b-0f41a1f10cec-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rvr8c\" (UID: \"b3110e75-2479-4c2b-b96b-0f41a1f10cec\") " pod="openshift-marketplace/marketplace-operator-79b997595-rvr8c" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.310757 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89lbm\" (UniqueName: \"kubernetes.io/projected/fb9a1e63-7cc6-48ff-b5a4-7f3e54fb626d-kube-api-access-89lbm\") pod \"dns-operator-744455d44c-kpss8\" (UID: \"fb9a1e63-7cc6-48ff-b5a4-7f3e54fb626d\") " pod="openshift-dns-operator/dns-operator-744455d44c-kpss8" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.327703 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztz47\" (UniqueName: \"kubernetes.io/projected/16df6e82-291b-4885-86a8-50276b3e7ef2-kube-api-access-ztz47\") pod \"console-operator-58897d9998-2ttnw\" (UID: \"16df6e82-291b-4885-86a8-50276b3e7ef2\") " pod="openshift-console-operator/console-operator-58897d9998-2ttnw" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.339372 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6846b045-d258-4dfe-8d50-1bc89b47389c-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-ltz52\" (UID: \"6846b045-d258-4dfe-8d50-1bc89b47389c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ltz52" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.363757 5025 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.363881 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lrfqf"] Oct 07 08:18:56 crc kubenswrapper[5025]: E1007 08:18:56.364091 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 08:18:56.86407177 +0000 UTC m=+143.673385914 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2r94k" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.374312 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsmb2\" (UniqueName: \"kubernetes.io/projected/d11df844-49f0-4c9a-9de5-701e57c69685-kube-api-access-gsmb2\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.375341 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-kpss8" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.382148 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pz9g\" (UniqueName: \"kubernetes.io/projected/2010515f-af73-42cf-a3fd-e2a76508e9e0-kube-api-access-8pz9g\") pod \"cluster-image-registry-operator-dc59b4c8b-q57jq\" (UID: \"2010515f-af73-42cf-a3fd-e2a76508e9e0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q57jq" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.402416 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fk4t9\" (UniqueName: \"kubernetes.io/projected/3376ab1d-924e-4d39-a393-c4f7a452845b-kube-api-access-fk4t9\") pod \"openshift-config-operator-7777fb866f-lsrzs\" (UID: \"3376ab1d-924e-4d39-a393-c4f7a452845b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lsrzs" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.402841 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-hmlpq" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.422490 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-2ttnw" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.425310 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d11df844-49f0-4c9a-9de5-701e57c69685-bound-sa-token\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.432180 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-hd4xd" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.441586 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9d71f73a-264b-459b-9d99-100a71204e60-bound-sa-token\") pod \"ingress-operator-5b745b69d9-z8xqx\" (UID: \"9d71f73a-264b-459b-9d99-100a71204e60\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z8xqx" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.449021 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vkbl6" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.454948 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ltz52" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.462617 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-pc6rv" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.463415 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2010515f-af73-42cf-a3fd-e2a76508e9e0-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-q57jq\" (UID: \"2010515f-af73-42cf-a3fd-e2a76508e9e0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q57jq" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.464812 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:18:56 crc kubenswrapper[5025]: E1007 08:18:56.465048 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:18:56.965015432 +0000 UTC m=+143.774329566 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.465997 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:18:56 crc kubenswrapper[5025]: E1007 08:18:56.466364 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 08:18:56.966355446 +0000 UTC m=+143.775669590 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2r94k" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.480303 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5m9cc" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.481274 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxlfz\" (UniqueName: \"kubernetes.io/projected/109ab8c4-3dcd-49a9-a966-4ad68758f46a-kube-api-access-zxlfz\") pod \"control-plane-machine-set-operator-78cbb6b69f-8lwpw\" (UID: \"109ab8c4-3dcd-49a9-a966-4ad68758f46a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8lwpw" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.486521 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sjgwg" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.500222 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fbrj\" (UniqueName: \"kubernetes.io/projected/0f09299f-24a4-4d5f-8ca9-704c678b8d23-kube-api-access-9fbrj\") pod \"controller-manager-879f6c89f-2xfws\" (UID: \"0f09299f-24a4-4d5f-8ca9-704c678b8d23\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2xfws" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.510404 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8lwpw" Oct 07 08:18:56 crc kubenswrapper[5025]: W1007 08:18:56.514525 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16dbc3a0_2245_4de3_ab11_eb599a11f5fb.slice/crio-bff26475d5fedc6e551177b7b1a0fdde6c9c447543df3522fdce8206b92c9fa5 WatchSource:0}: Error finding container bff26475d5fedc6e551177b7b1a0fdde6c9c447543df3522fdce8206b92c9fa5: Status 404 returned error can't find the container with id bff26475d5fedc6e551177b7b1a0fdde6c9c447543df3522fdce8206b92c9fa5 Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.529784 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhnnf\" (UniqueName: \"kubernetes.io/projected/9d71f73a-264b-459b-9d99-100a71204e60-kube-api-access-nhnnf\") pod \"ingress-operator-5b745b69d9-z8xqx\" (UID: \"9d71f73a-264b-459b-9d99-100a71204e60\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z8xqx" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.552686 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc8vs\" (UniqueName: \"kubernetes.io/projected/e63d4e4f-5fe5-43b2-a7af-8893da0ed732-kube-api-access-kc8vs\") pod \"etcd-operator-b45778765-h4xpz\" (UID: \"e63d4e4f-5fe5-43b2-a7af-8893da0ed732\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h4xpz" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.566569 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:18:56 crc kubenswrapper[5025]: E1007 08:18:56.566828 5025 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:18:57.066809762 +0000 UTC m=+143.876123906 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.566879 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:18:56 crc kubenswrapper[5025]: E1007 08:18:56.568012 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 08:18:57.0679957 +0000 UTC m=+143.877309844 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2r94k" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.598582 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kmxl\" (UniqueName: \"kubernetes.io/projected/77dd254d-69a1-4f60-9587-e59f9094626a-kube-api-access-2kmxl\") pod \"service-ca-9c57cc56f-npvk6\" (UID: \"77dd254d-69a1-4f60-9587-e59f9094626a\") " pod="openshift-service-ca/service-ca-9c57cc56f-npvk6" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.600763 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddzh8\" (UniqueName: \"kubernetes.io/projected/e8ebca0a-96ab-452a-a93e-7041967c40d1-kube-api-access-ddzh8\") pod \"migrator-59844c95c7-x9t4l\" (UID: \"e8ebca0a-96ab-452a-a93e-7041967c40d1\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-x9t4l" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.626434 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/25634103-cd17-4005-8c8e-575d58e0826e-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xf6r2\" (UID: \"25634103-cd17-4005-8c8e-575d58e0826e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xf6r2" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.649121 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g764n\" (UniqueName: \"kubernetes.io/projected/702301a4-cf98-4786-8649-d6e396c775a4-kube-api-access-g764n\") pod 
\"dns-default-795lz\" (UID: \"702301a4-cf98-4786-8649-d6e396c775a4\") " pod="openshift-dns/dns-default-795lz" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.672579 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl4m9\" (UniqueName: \"kubernetes.io/projected/52a8a68e-3997-4206-b8df-3650699f2ed7-kube-api-access-zl4m9\") pod \"machine-config-server-9qr74\" (UID: \"52a8a68e-3997-4206-b8df-3650699f2ed7\") " pod="openshift-machine-config-operator/machine-config-server-9qr74" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.678174 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.678706 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lsrzs" Oct 07 08:18:56 crc kubenswrapper[5025]: E1007 08:18:56.680353 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:18:57.180305668 +0000 UTC m=+143.989619812 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.686028 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-h4xpz" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.694365 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q57jq" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.705070 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr4lp\" (UniqueName: \"kubernetes.io/projected/05686c29-75d0-4321-8136-39f8600afac1-kube-api-access-zr4lp\") pod \"olm-operator-6b444d44fb-fp4pg\" (UID: \"05686c29-75d0-4321-8136-39f8600afac1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fp4pg" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.713814 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlxzq\" (UniqueName: \"kubernetes.io/projected/9ae035c8-618a-4a3b-afbc-e8173548cb32-kube-api-access-hlxzq\") pod \"multus-admission-controller-857f4d67dd-bfxvx\" (UID: \"9ae035c8-618a-4a3b-afbc-e8173548cb32\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-bfxvx" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.716732 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-2xfws" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.725710 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9z4ft\" (UniqueName: \"kubernetes.io/projected/99b101e0-bca4-40fe-942f-7ab5d06be284-kube-api-access-9z4ft\") pod \"machine-config-operator-74547568cd-xwd6l\" (UID: \"99b101e0-bca4-40fe-942f-7ab5d06be284\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xwd6l" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.749528 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp4ds\" (UniqueName: \"kubernetes.io/projected/4ea4607c-59be-4461-8fa4-9e22b87036f8-kube-api-access-sp4ds\") pod \"kube-storage-version-migrator-operator-b67b599dd-29wzh\" (UID: \"4ea4607c-59be-4461-8fa4-9e22b87036f8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-29wzh" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.755448 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-kpss8"] Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.766980 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk54x\" (UniqueName: \"kubernetes.io/projected/6a3782b5-7407-485b-a426-58413aba5747-kube-api-access-rk54x\") pod \"packageserver-d55dfcdfc-qzlm7\" (UID: \"6a3782b5-7407-485b-a426-58413aba5747\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qzlm7" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.771378 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z8xqx" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.782382 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:18:56 crc kubenswrapper[5025]: E1007 08:18:56.782931 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 08:18:57.282913934 +0000 UTC m=+144.092228078 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2r94k" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.784261 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-hmvqm" event={"ID":"52da690b-1ff6-4fb3-8c49-71a1fba78754","Type":"ContainerStarted","Data":"c9bba87df67a3fca0372a96555a0f33c0c1d0baa8d752db115a4f51cb0995d48"} Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.784297 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-hmvqm" 
event={"ID":"52da690b-1ff6-4fb3-8c49-71a1fba78754","Type":"ContainerStarted","Data":"e30cd32e7a928737d59a4d3d5a71f47f9b2db357b2faaab09c51b8be63ba7dcd"} Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.786304 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26h62\" (UniqueName: \"kubernetes.io/projected/6a140c8d-40f7-462f-8ffd-e1ea2fa4a1e2-kube-api-access-26h62\") pod \"catalog-operator-68c6474976-wcnzt\" (UID: \"6a140c8d-40f7-462f-8ffd-e1ea2fa4a1e2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wcnzt" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.795797 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wcnzt" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.806691 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fp4pg" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.809205 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qzlm7" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.809413 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8j9tr\" (UniqueName: \"kubernetes.io/projected/94b36961-ca5f-4fac-a744-118761859a72-kube-api-access-8j9tr\") pod \"ingress-canary-8czpq\" (UID: \"94b36961-ca5f-4fac-a744-118761859a72\") " pod="openshift-ingress-canary/ingress-canary-8czpq" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.816470 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xwd6l" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.826025 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp8mj\" (UniqueName: \"kubernetes.io/projected/784ef07a-3cbd-46f4-9ce5-90c1a6d75edc-kube-api-access-zp8mj\") pod \"csi-hostpathplugin-jb77m\" (UID: \"784ef07a-3cbd-46f4-9ce5-90c1a6d75edc\") " pod="hostpath-provisioner/csi-hostpathplugin-jb77m" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.829811 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-bfxvx" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.830785 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n49bl" event={"ID":"cc6ec75d-41ec-4966-ab3c-03cf9f2497ce","Type":"ContainerStarted","Data":"6031f54e04649dc8ec9e1b361c1be166a1c3209069f592c3d6721bbe59e0d7f8"} Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.830832 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n49bl" event={"ID":"cc6ec75d-41ec-4966-ab3c-03cf9f2497ce","Type":"ContainerStarted","Data":"db4db9027179c86ee1fee5222948ee0a8c161c06e2919c325458fc473da3237e"} Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.835263 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-cxbbt" event={"ID":"1e23d4b6-3f5b-4288-8753-cff09258a821","Type":"ContainerStarted","Data":"9f1dcf45085f89daa3dfd29a54ee754d72b11cf2961b24b1da52854086b6d8bf"} Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.835315 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-cxbbt" 
event={"ID":"1e23d4b6-3f5b-4288-8753-cff09258a821","Type":"ContainerStarted","Data":"70ffe3cf4b4bd25efbcdfec4cef77e2027e731010c87ee70e8c13b3e040f7120"} Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.835326 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-cxbbt" event={"ID":"1e23d4b6-3f5b-4288-8753-cff09258a821","Type":"ContainerStarted","Data":"e2741dfb2b2e9e6903ec7ae8c76c7688289b0c79ce0118f78196e01917809795"} Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.842239 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-29wzh" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.843472 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mw5xc\" (UniqueName: \"kubernetes.io/projected/385983c4-874d-4095-980c-c2d3763ce8e1-kube-api-access-mw5xc\") pod \"collect-profiles-29330415-w9ctz\" (UID: \"385983c4-874d-4095-980c-c2d3763ce8e1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330415-w9ctz" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.848966 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ltz52"] Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.852323 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xf6r2" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.861329 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-x9t4l" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.868426 5025 generic.go:334] "Generic (PLEG): container finished" podID="66353c4f-6d67-4155-8b97-5f27145eabdd" containerID="cb72db964b7012c2f5f213d8ecdc98432e4c2e376e21b076a8561760d4bc90d4" exitCode=0 Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.868468 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-2z5w6" event={"ID":"66353c4f-6d67-4155-8b97-5f27145eabdd","Type":"ContainerDied","Data":"cb72db964b7012c2f5f213d8ecdc98432e4c2e376e21b076a8561760d4bc90d4"} Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.868514 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-2z5w6" event={"ID":"66353c4f-6d67-4155-8b97-5f27145eabdd","Type":"ContainerStarted","Data":"de375ec75f7dab0814d500873f363dfb8850df59605c15fa0c083aab87134811"} Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.870792 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gm4wb\" (UniqueName: \"kubernetes.io/projected/b3110e75-2479-4c2b-b96b-0f41a1f10cec-kube-api-access-gm4wb\") pod \"marketplace-operator-79b997595-rvr8c\" (UID: \"b3110e75-2479-4c2b-b96b-0f41a1f10cec\") " pod="openshift-marketplace/marketplace-operator-79b997595-rvr8c" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.872222 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-hpfzj" event={"ID":"151c8409-e9e6-4a48-8e3d-661c0498cd86","Type":"ContainerStarted","Data":"e51ef67a7cc46a6413284aff020c11694db3291d6713669580258ce5692234da"} Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.886197 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vk2xm" 
event={"ID":"6dca6d86-e6aa-455e-88e8-f61e829b7efd","Type":"ContainerStarted","Data":"e936f73aaf0e98b82a887ed5aca95a08e95ac4d1f0ac49da7719f5275323b888"} Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.886244 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vk2xm" event={"ID":"6dca6d86-e6aa-455e-88e8-f61e829b7efd","Type":"ContainerStarted","Data":"184acddeddb7db04ff8b78422aabc067026fb8a3c545381b06bdc6803c302a0e"} Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.886516 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.888162 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330415-w9ctz" Oct 07 08:18:56 crc kubenswrapper[5025]: E1007 08:18:56.888314 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:18:57.38829253 +0000 UTC m=+144.197606674 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.888516 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vk2xm" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.888924 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-npvk6" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.890265 5025 generic.go:334] "Generic (PLEG): container finished" podID="ad008aba-a6da-410f-82b5-a18f8dd3d5c7" containerID="bc5a438d25a8d9e1153069c73193062e71febba534ef6f4a4accc9a6d7f3b028" exitCode=0 Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.890317 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qpzs2" event={"ID":"ad008aba-a6da-410f-82b5-a18f8dd3d5c7","Type":"ContainerDied","Data":"bc5a438d25a8d9e1153069c73193062e71febba534ef6f4a4accc9a6d7f3b028"} Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.890340 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qpzs2" event={"ID":"ad008aba-a6da-410f-82b5-a18f8dd3d5c7","Type":"ContainerStarted","Data":"a47f53846475bf0264769b489b7ab913854463c7bc9c8bd1478e3353eb409409"} Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.892734 5025 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-vk2xm container/route-controller-manager namespace/openshift-route-controller-manager: 
Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.892838 5025 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vk2xm" podUID="6dca6d86-e6aa-455e-88e8-f61e829b7efd" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.895367 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-pc6rv" event={"ID":"16dbc3a0-2245-4de3-ab11-eb599a11f5fb","Type":"ContainerStarted","Data":"bff26475d5fedc6e551177b7b1a0fdde6c9c447543df3522fdce8206b92c9fa5"} Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.898532 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-jb77m" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.902138 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ck97s" event={"ID":"7b0a663d-08c8-4198-a334-701d58beee58","Type":"ContainerStarted","Data":"05b3efce128b56a0c902de55e3ddc6fad5f79d3d9c7e700d5e53d4b3033a6754"} Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.902211 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ck97s" event={"ID":"7b0a663d-08c8-4198-a334-701d58beee58","Type":"ContainerStarted","Data":"3d4822c8ebdca6ba7267684a129f1cc0d3e65d3a41fe7937b4bd5a6b28fc9064"} Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.902333 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5smrm\" (UniqueName: \"kubernetes.io/projected/9fb37bc8-4d3f-438c-850d-3b800b786f95-kube-api-access-5smrm\") pod \"service-ca-operator-777779d784-b6pkw\" (UID: \"9fb37bc8-4d3f-438c-850d-3b800b786f95\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b6pkw" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.904060 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zbpgk" event={"ID":"e6aa9841-3d4c-4600-9763-d32c243fa4d8","Type":"ContainerStarted","Data":"ddea584281041b32152198f51c01b02d3bcafab7c05b83a1abcebb5f97e75f0d"} Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.907994 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-9qr74" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.908038 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lrfqf" event={"ID":"49ac5730-6f35-4a6b-9c54-df81528bdb81","Type":"ContainerStarted","Data":"267c18c13c829b2681d31a51c2227e40c1449a409c5890de3d3cc74d7af141a9"} Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.908091 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lrfqf" event={"ID":"49ac5730-6f35-4a6b-9c54-df81528bdb81","Type":"ContainerStarted","Data":"fa32bdd59b20b65f4380e9781f7c1e78cfec788346b2e84e19638d0a6548199c"} Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.915046 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8czpq" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.921442 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-795lz" Oct 07 08:18:56 crc kubenswrapper[5025]: I1007 08:18:56.996356 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:18:57 crc kubenswrapper[5025]: E1007 08:18:57.005270 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-07 08:18:57.505252288 +0000 UTC m=+144.314566422 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2r94k" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:18:57 crc kubenswrapper[5025]: I1007 08:18:57.109740 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:18:57 crc kubenswrapper[5025]: E1007 08:18:57.110296 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:18:57.610278093 +0000 UTC m=+144.419592237 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:18:57 crc kubenswrapper[5025]: I1007 08:18:57.114945 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-hd4xd"] Oct 07 08:18:57 crc kubenswrapper[5025]: I1007 08:18:57.120523 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-2ttnw"] Oct 07 08:18:57 crc kubenswrapper[5025]: I1007 08:18:57.120893 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-b6pkw" Oct 07 08:18:57 crc kubenswrapper[5025]: I1007 08:18:57.146700 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-hmlpq"] Oct 07 08:18:57 crc kubenswrapper[5025]: I1007 08:18:57.166669 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rvr8c" Oct 07 08:18:57 crc kubenswrapper[5025]: I1007 08:18:57.211531 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:18:57 crc kubenswrapper[5025]: E1007 08:18:57.212321 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 08:18:57.712294069 +0000 UTC m=+144.521608213 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2r94k" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:18:57 crc kubenswrapper[5025]: W1007 08:18:57.262834 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16df6e82_291b_4885_86a8_50276b3e7ef2.slice/crio-b986c10b7031fd89ef043f16587a1a300e9902466c6e003843c22c4c2fba3517 WatchSource:0}: Error finding container b986c10b7031fd89ef043f16587a1a300e9902466c6e003843c22c4c2fba3517: Status 404 returned error can't find the container with id b986c10b7031fd89ef043f16587a1a300e9902466c6e003843c22c4c2fba3517 Oct 07 08:18:57 crc kubenswrapper[5025]: I1007 08:18:57.275120 5025 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-5m9cc"] Oct 07 08:18:57 crc kubenswrapper[5025]: I1007 08:18:57.314719 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:18:57 crc kubenswrapper[5025]: E1007 08:18:57.315103 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:18:57.815086391 +0000 UTC m=+144.624400535 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:18:57 crc kubenswrapper[5025]: I1007 08:18:57.316697 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vkbl6"] Oct 07 08:18:57 crc kubenswrapper[5025]: I1007 08:18:57.318381 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8lwpw"] Oct 07 08:18:57 crc kubenswrapper[5025]: W1007 08:18:57.364110 5025 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52a8a68e_3997_4206_b8df_3650699f2ed7.slice/crio-99a362fef444bd71e6aa84ff837729b8beea8e8fdcd69d8d73af85689ceda906 WatchSource:0}: Error finding container 99a362fef444bd71e6aa84ff837729b8beea8e8fdcd69d8d73af85689ceda906: Status 404 returned error can't find the container with id 99a362fef444bd71e6aa84ff837729b8beea8e8fdcd69d8d73af85689ceda906 Oct 07 08:18:57 crc kubenswrapper[5025]: W1007 08:18:57.390733 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod109ab8c4_3dcd_49a9_a966_4ad68758f46a.slice/crio-fc18fb1b09ca474fe31e11898b34a60671d3a7f14981c5aa0db437345a8f9baa WatchSource:0}: Error finding container fc18fb1b09ca474fe31e11898b34a60671d3a7f14981c5aa0db437345a8f9baa: Status 404 returned error can't find the container with id fc18fb1b09ca474fe31e11898b34a60671d3a7f14981c5aa0db437345a8f9baa Oct 07 08:18:57 crc kubenswrapper[5025]: W1007 08:18:57.412836 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2bf1fe9_f8a4_4adb_a141_45bd491e0268.slice/crio-410c5433f112e919f0ea484aebdf80c279cdb53669814b0ec6a247e1b19226a1 WatchSource:0}: Error finding container 410c5433f112e919f0ea484aebdf80c279cdb53669814b0ec6a247e1b19226a1: Status 404 returned error can't find the container with id 410c5433f112e919f0ea484aebdf80c279cdb53669814b0ec6a247e1b19226a1 Oct 07 08:18:57 crc kubenswrapper[5025]: I1007 08:18:57.417372 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:18:57 crc kubenswrapper[5025]: E1007 
08:18:57.417809 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 08:18:57.9177906 +0000 UTC m=+144.727104744 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2r94k" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:18:57 crc kubenswrapper[5025]: I1007 08:18:57.483147 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-h4xpz"] Oct 07 08:18:57 crc kubenswrapper[5025]: I1007 08:18:57.490472 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sjgwg"] Oct 07 08:18:57 crc kubenswrapper[5025]: I1007 08:18:57.518209 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:18:57 crc kubenswrapper[5025]: E1007 08:18:57.518616 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:18:58.018600729 +0000 UTC m=+144.827914873 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:18:57 crc kubenswrapper[5025]: I1007 08:18:57.622815 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:18:57 crc kubenswrapper[5025]: I1007 08:18:57.623484 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-lsrzs"] Oct 07 08:18:57 crc kubenswrapper[5025]: E1007 08:18:57.624494 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 08:18:58.124476599 +0000 UTC m=+144.933790743 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2r94k" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:18:57 crc kubenswrapper[5025]: I1007 08:18:57.629764 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-2xfws"] Oct 07 08:18:57 crc kubenswrapper[5025]: I1007 08:18:57.629840 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q57jq"] Oct 07 08:18:57 crc kubenswrapper[5025]: W1007 08:18:57.630315 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode63d4e4f_5fe5_43b2_a7af_8893da0ed732.slice/crio-bee4980a0b48ec89c1476761f47796b1154e0a4a4f759887aa215f9a658f5625 WatchSource:0}: Error finding container bee4980a0b48ec89c1476761f47796b1154e0a4a4f759887aa215f9a658f5625: Status 404 returned error can't find the container with id bee4980a0b48ec89c1476761f47796b1154e0a4a4f759887aa215f9a658f5625 Oct 07 08:18:57 crc kubenswrapper[5025]: I1007 08:18:57.725736 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:18:57 crc kubenswrapper[5025]: E1007 08:18:57.726326 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:18:58.22628875 +0000 UTC m=+145.035602894 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:18:57 crc kubenswrapper[5025]: I1007 08:18:57.827736 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:18:57 crc kubenswrapper[5025]: E1007 08:18:57.829432 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 08:18:58.329410063 +0000 UTC m=+145.138724207 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2r94k" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:18:57 crc kubenswrapper[5025]: I1007 08:18:57.907468 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-z8xqx"] Oct 07 08:18:57 crc kubenswrapper[5025]: I1007 08:18:57.909527 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fp4pg"] Oct 07 08:18:57 crc kubenswrapper[5025]: I1007 08:18:57.929080 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:18:57 crc kubenswrapper[5025]: E1007 08:18:57.929300 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:18:58.42926895 +0000 UTC m=+145.238583094 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:18:57 crc kubenswrapper[5025]: I1007 08:18:57.973496 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-hmlpq" event={"ID":"1e02ba47-76b7-4363-8ade-a9f8c42db920","Type":"ContainerStarted","Data":"4b692ce9d525bf8211568b36403ae73b1b9255c173388434978cc8130ca4f269"} Oct 07 08:18:57 crc kubenswrapper[5025]: I1007 08:18:57.974713 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-xwd6l"] Oct 07 08:18:57 crc kubenswrapper[5025]: I1007 08:18:57.976429 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-h4xpz" event={"ID":"e63d4e4f-5fe5-43b2-a7af-8893da0ed732","Type":"ContainerStarted","Data":"bee4980a0b48ec89c1476761f47796b1154e0a4a4f759887aa215f9a658f5625"} Oct 07 08:18:57 crc kubenswrapper[5025]: I1007 08:18:57.978904 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5m9cc" event={"ID":"cafb7a37-dac6-45f8-9bf9-d8820d7dedaa","Type":"ContainerStarted","Data":"417966ed4feaed4f4c9c8ce7dfc500b1d21ec92eeb86bdc3627eb8c6fa4bd587"} Oct 07 08:18:57 crc kubenswrapper[5025]: I1007 08:18:57.982662 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-hpfzj" event={"ID":"151c8409-e9e6-4a48-8e3d-661c0498cd86","Type":"ContainerStarted","Data":"9a86a9b8d1b079ba9dd05c4cf27c3900ebcc98703fd5e261fddd62d623cc153c"} Oct 07 08:18:57 
crc kubenswrapper[5025]: I1007 08:18:57.983798 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-hpfzj" Oct 07 08:18:57 crc kubenswrapper[5025]: I1007 08:18:57.984240 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lsrzs" event={"ID":"3376ab1d-924e-4d39-a393-c4f7a452845b","Type":"ContainerStarted","Data":"85efd1acffc3de453605eb87384ee9a120c692674502f220b1cce39e8e497c1c"} Oct 07 08:18:57 crc kubenswrapper[5025]: I1007 08:18:57.985192 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sjgwg" event={"ID":"debaf386-9a77-4774-b7fe-b439a1406621","Type":"ContainerStarted","Data":"b71500167e097a57b1bf735f097a00d87e3e260a47cc2e97b7988be2ddfd235d"} Oct 07 08:18:57 crc kubenswrapper[5025]: I1007 08:18:57.993778 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zbpgk" event={"ID":"e6aa9841-3d4c-4600-9763-d32c243fa4d8","Type":"ContainerStarted","Data":"c935fc6b8f065b26834c787148d59b16159f812ada71965edf564dcc4f1c64fe"} Oct 07 08:18:57 crc kubenswrapper[5025]: I1007 08:18:57.993847 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zbpgk" event={"ID":"e6aa9841-3d4c-4600-9763-d32c243fa4d8","Type":"ContainerStarted","Data":"1197ed173bdd85ae5d5bab5cc09575d020bb76bfd6fe06cb9af09f4ce0140fde"} Oct 07 08:18:57 crc kubenswrapper[5025]: I1007 08:18:57.993990 5025 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-hpfzj container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.10:6443/healthz\": dial tcp 10.217.0.10:6443: connect: connection refused" start-of-body= Oct 07 08:18:57 crc kubenswrapper[5025]: I1007 08:18:57.994035 
5025 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-hpfzj" podUID="151c8409-e9e6-4a48-8e3d-661c0498cd86" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.10:6443/healthz\": dial tcp 10.217.0.10:6443: connect: connection refused" Oct 07 08:18:57 crc kubenswrapper[5025]: I1007 08:18:57.996576 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q57jq" event={"ID":"2010515f-af73-42cf-a3fd-e2a76508e9e0","Type":"ContainerStarted","Data":"62c23833119432ded0e3b2572c012bdf9f071865a8144b75431b97425a9a2233"} Oct 07 08:18:58 crc kubenswrapper[5025]: I1007 08:18:58.014967 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qpzs2" event={"ID":"ad008aba-a6da-410f-82b5-a18f8dd3d5c7","Type":"ContainerStarted","Data":"460863a3dddcb64f29e86568cdd483be3ccea593e59e5d057b85c7044698983a"} Oct 07 08:18:58 crc kubenswrapper[5025]: I1007 08:18:58.022697 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8lwpw" event={"ID":"109ab8c4-3dcd-49a9-a966-4ad68758f46a","Type":"ContainerStarted","Data":"fc18fb1b09ca474fe31e11898b34a60671d3a7f14981c5aa0db437345a8f9baa"} Oct 07 08:18:58 crc kubenswrapper[5025]: I1007 08:18:58.024712 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-9qr74" event={"ID":"52a8a68e-3997-4206-b8df-3650699f2ed7","Type":"ContainerStarted","Data":"99a362fef444bd71e6aa84ff837729b8beea8e8fdcd69d8d73af85689ceda906"} Oct 07 08:18:58 crc kubenswrapper[5025]: I1007 08:18:58.032117 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:18:58 crc kubenswrapper[5025]: E1007 08:18:58.035725 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 08:18:58.535690369 +0000 UTC m=+145.345004513 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2r94k" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:18:58 crc kubenswrapper[5025]: I1007 08:18:58.058025 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-kpss8" event={"ID":"fb9a1e63-7cc6-48ff-b5a4-7f3e54fb626d","Type":"ContainerStarted","Data":"d2375f43f8ee71905f797bcf4e39583a3d0b1b6f0de31e97a88f951db36f0e2c"} Oct 07 08:18:58 crc kubenswrapper[5025]: I1007 08:18:58.068187 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-2ttnw" event={"ID":"16df6e82-291b-4885-86a8-50276b3e7ef2","Type":"ContainerStarted","Data":"b986c10b7031fd89ef043f16587a1a300e9902466c6e003843c22c4c2fba3517"} Oct 07 08:18:58 crc kubenswrapper[5025]: I1007 08:18:58.078048 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-2xfws" event={"ID":"0f09299f-24a4-4d5f-8ca9-704c678b8d23","Type":"ContainerStarted","Data":"2d1aa061bc3e8407b74c12f6d52d62e36361948c9bf5e475cd844d562888e12d"} Oct 07 08:18:58 crc 
kubenswrapper[5025]: I1007 08:18:58.106016 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ck97s" event={"ID":"7b0a663d-08c8-4198-a334-701d58beee58","Type":"ContainerStarted","Data":"6f8bffe042c8b06bac134bf4baa5b892b536fe25ec84b4cab0a92f937e4f03a6"} Oct 07 08:18:58 crc kubenswrapper[5025]: I1007 08:18:58.128876 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vkbl6" event={"ID":"c2bf1fe9-f8a4-4adb-a141-45bd491e0268","Type":"ContainerStarted","Data":"410c5433f112e919f0ea484aebdf80c279cdb53669814b0ec6a247e1b19226a1"} Oct 07 08:18:58 crc kubenswrapper[5025]: I1007 08:18:58.133984 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:18:58 crc kubenswrapper[5025]: E1007 08:18:58.136256 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:18:58.636236999 +0000 UTC m=+145.445551133 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:18:58 crc kubenswrapper[5025]: I1007 08:18:58.182087 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-pc6rv" event={"ID":"16dbc3a0-2245-4de3-ab11-eb599a11f5fb","Type":"ContainerStarted","Data":"636cece52690feef3131602cbe0e323ed668ceb7d0fadc4620a0d16da49bdd1f"} Oct 07 08:18:58 crc kubenswrapper[5025]: I1007 08:18:58.185323 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-hd4xd" event={"ID":"a31617bb-030b-48d7-b312-e7af9b052143","Type":"ContainerStarted","Data":"40dbdb19fde35a0329f4e0d3bffbeb246b23a93b5ba4fa8e892c024730928352"} Oct 07 08:18:58 crc kubenswrapper[5025]: I1007 08:18:58.193345 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ltz52" event={"ID":"6846b045-d258-4dfe-8d50-1bc89b47389c","Type":"ContainerStarted","Data":"b828e8940cf6a9717170cb5f8f80810b422bb00cacf40b46cce631b450ee4812"} Oct 07 08:18:58 crc kubenswrapper[5025]: I1007 08:18:58.220679 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vk2xm" Oct 07 08:18:58 crc kubenswrapper[5025]: I1007 08:18:58.249579 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:18:58 crc kubenswrapper[5025]: E1007 08:18:58.253053 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 08:18:58.753017201 +0000 UTC m=+145.562331335 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2r94k" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:18:58 crc kubenswrapper[5025]: I1007 08:18:58.354059 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:18:58 crc kubenswrapper[5025]: E1007 08:18:58.354791 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:18:58.85476243 +0000 UTC m=+145.664076574 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:18:58 crc kubenswrapper[5025]: I1007 08:18:58.362928 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vk2xm" podStartSLOduration=124.362895891 podStartE2EDuration="2m4.362895891s" podCreationTimestamp="2025-10-07 08:16:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:18:58.349212001 +0000 UTC m=+145.158526145" watchObservedRunningTime="2025-10-07 08:18:58.362895891 +0000 UTC m=+145.172210035" Oct 07 08:18:58 crc kubenswrapper[5025]: I1007 08:18:58.391750 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zbpgk" podStartSLOduration=124.39172436 podStartE2EDuration="2m4.39172436s" podCreationTimestamp="2025-10-07 08:16:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:18:58.390658395 +0000 UTC m=+145.199972539" watchObservedRunningTime="2025-10-07 08:18:58.39172436 +0000 UTC m=+145.201038504" Oct 07 08:18:58 crc kubenswrapper[5025]: I1007 08:18:58.407456 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-x9t4l"] Oct 07 08:18:58 crc kubenswrapper[5025]: I1007 08:18:58.457573 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:18:58 crc kubenswrapper[5025]: E1007 08:18:58.462180 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 08:18:58.96216062 +0000 UTC m=+145.771474754 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2r94k" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:18:58 crc kubenswrapper[5025]: I1007 08:18:58.467438 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-pc6rv" Oct 07 08:18:58 crc kubenswrapper[5025]: I1007 08:18:58.489777 5025 patch_prober.go:28] interesting pod/router-default-5444994796-pc6rv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 08:18:58 crc kubenswrapper[5025]: [-]has-synced failed: reason withheld Oct 07 08:18:58 crc kubenswrapper[5025]: [+]process-running ok Oct 07 08:18:58 crc kubenswrapper[5025]: healthz check failed Oct 07 08:18:58 crc kubenswrapper[5025]: I1007 08:18:58.489862 5025 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-pc6rv" podUID="16dbc3a0-2245-4de3-ab11-eb599a11f5fb" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 08:18:58 crc kubenswrapper[5025]: I1007 08:18:58.518496 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-pc6rv" podStartSLOduration=124.518466513 podStartE2EDuration="2m4.518466513s" podCreationTimestamp="2025-10-07 08:16:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:18:58.516948175 +0000 UTC m=+145.326262329" watchObservedRunningTime="2025-10-07 08:18:58.518466513 +0000 UTC m=+145.327780657" Oct 07 08:18:58 crc kubenswrapper[5025]: I1007 08:18:58.552886 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-cxbbt" podStartSLOduration=124.552869822 podStartE2EDuration="2m4.552869822s" podCreationTimestamp="2025-10-07 08:16:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:18:58.552242282 +0000 UTC m=+145.361556436" watchObservedRunningTime="2025-10-07 08:18:58.552869822 +0000 UTC m=+145.362183966" Oct 07 08:18:58 crc kubenswrapper[5025]: I1007 08:18:58.559737 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:18:58 crc kubenswrapper[5025]: E1007 08:18:58.560505 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:18:59.060392714 +0000 UTC m=+145.869706858 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:18:58 crc kubenswrapper[5025]: I1007 08:18:58.570453 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:18:58 crc kubenswrapper[5025]: E1007 08:18:58.571966 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 08:18:59.071945257 +0000 UTC m=+145.881259401 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2r94k" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:18:58 crc kubenswrapper[5025]: I1007 08:18:58.604361 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qpzs2" podStartSLOduration=124.604347221 podStartE2EDuration="2m4.604347221s" podCreationTimestamp="2025-10-07 08:16:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:18:58.597227451 +0000 UTC m=+145.406541595" watchObservedRunningTime="2025-10-07 08:18:58.604347221 +0000 UTC m=+145.413661365" Oct 07 08:18:58 crc kubenswrapper[5025]: I1007 08:18:58.647572 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-bfxvx"] Oct 07 08:18:58 crc kubenswrapper[5025]: I1007 08:18:58.657852 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-29wzh"] Oct 07 08:18:58 crc kubenswrapper[5025]: I1007 08:18:58.660380 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-hpfzj" podStartSLOduration=124.660359126 podStartE2EDuration="2m4.660359126s" podCreationTimestamp="2025-10-07 08:16:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:18:58.64063344 +0000 UTC m=+145.449947584" watchObservedRunningTime="2025-10-07 
08:18:58.660359126 +0000 UTC m=+145.469673270" Oct 07 08:18:58 crc kubenswrapper[5025]: I1007 08:18:58.672101 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:18:58 crc kubenswrapper[5025]: E1007 08:18:58.672463 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:18:59.172448465 +0000 UTC m=+145.981762609 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:18:58 crc kubenswrapper[5025]: I1007 08:18:58.684872 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-hmvqm" podStartSLOduration=124.684841245 podStartE2EDuration="2m4.684841245s" podCreationTimestamp="2025-10-07 08:16:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:18:58.682868791 +0000 UTC m=+145.492182945" watchObservedRunningTime="2025-10-07 08:18:58.684841245 +0000 UTC m=+145.494155389" Oct 07 08:18:58 crc kubenswrapper[5025]: I1007 08:18:58.699466 5025 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xf6r2"] Oct 07 08:18:58 crc kubenswrapper[5025]: I1007 08:18:58.704816 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wcnzt"] Oct 07 08:18:58 crc kubenswrapper[5025]: I1007 08:18:58.710866 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-n49bl" podStartSLOduration=124.710847642 podStartE2EDuration="2m4.710847642s" podCreationTimestamp="2025-10-07 08:16:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:18:58.710329215 +0000 UTC m=+145.519643359" watchObservedRunningTime="2025-10-07 08:18:58.710847642 +0000 UTC m=+145.520161786" Oct 07 08:18:58 crc kubenswrapper[5025]: I1007 08:18:58.724505 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qzlm7"] Oct 07 08:18:58 crc kubenswrapper[5025]: I1007 08:18:58.726137 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-npvk6"] Oct 07 08:18:58 crc kubenswrapper[5025]: I1007 08:18:58.740439 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-795lz"] Oct 07 08:18:58 crc kubenswrapper[5025]: I1007 08:18:58.742944 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8czpq"] Oct 07 08:18:58 crc kubenswrapper[5025]: I1007 08:18:58.773279 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:18:58 crc kubenswrapper[5025]: E1007 08:18:58.774039 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 08:18:59.274026088 +0000 UTC m=+146.083340232 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2r94k" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:18:58 crc kubenswrapper[5025]: I1007 08:18:58.780482 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ck97s" podStartSLOduration=124.780462585 podStartE2EDuration="2m4.780462585s" podCreationTimestamp="2025-10-07 08:16:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:18:58.751907645 +0000 UTC m=+145.561221789" watchObservedRunningTime="2025-10-07 08:18:58.780462585 +0000 UTC m=+145.589776729" Oct 07 08:18:58 crc kubenswrapper[5025]: I1007 08:18:58.783095 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rvr8c"] Oct 07 08:18:58 crc kubenswrapper[5025]: I1007 08:18:58.788918 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lrfqf" podStartSLOduration=124.788894367 podStartE2EDuration="2m4.788894367s" 
podCreationTimestamp="2025-10-07 08:16:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:18:58.772824459 +0000 UTC m=+145.582138603" watchObservedRunningTime="2025-10-07 08:18:58.788894367 +0000 UTC m=+145.598208561" Oct 07 08:18:58 crc kubenswrapper[5025]: I1007 08:18:58.835494 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-b6pkw"] Oct 07 08:18:58 crc kubenswrapper[5025]: I1007 08:18:58.850343 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-jb77m"] Oct 07 08:18:58 crc kubenswrapper[5025]: I1007 08:18:58.875038 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:18:58 crc kubenswrapper[5025]: E1007 08:18:58.875422 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:18:59.375403694 +0000 UTC m=+146.184717838 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:18:58 crc kubenswrapper[5025]: I1007 08:18:58.878560 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330415-w9ctz"] Oct 07 08:18:58 crc kubenswrapper[5025]: W1007 08:18:58.889862 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3110e75_2479_4c2b_b96b_0f41a1f10cec.slice/crio-0c5c4416c983a5db6065c48873853936c3a1a77933de02ab95e21a090c5d5c3c WatchSource:0}: Error finding container 0c5c4416c983a5db6065c48873853936c3a1a77933de02ab95e21a090c5d5c3c: Status 404 returned error can't find the container with id 0c5c4416c983a5db6065c48873853936c3a1a77933de02ab95e21a090c5d5c3c Oct 07 08:18:58 crc kubenswrapper[5025]: W1007 08:18:58.893054 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a3782b5_7407_485b_a426_58413aba5747.slice/crio-39acabc4a0203520c6929f6b3a6279fd7798be4ff9f4919ca003235251bb84ff WatchSource:0}: Error finding container 39acabc4a0203520c6929f6b3a6279fd7798be4ff9f4919ca003235251bb84ff: Status 404 returned error can't find the container with id 39acabc4a0203520c6929f6b3a6279fd7798be4ff9f4919ca003235251bb84ff Oct 07 08:18:58 crc kubenswrapper[5025]: W1007 08:18:58.931994 5025 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77dd254d_69a1_4f60_9587_e59f9094626a.slice/crio-a69b7a6e0167a890ac08e216cf07feffc10439ab4a52b1ff3056a6213b22fe8c WatchSource:0}: Error finding container a69b7a6e0167a890ac08e216cf07feffc10439ab4a52b1ff3056a6213b22fe8c: Status 404 returned error can't find the container with id a69b7a6e0167a890ac08e216cf07feffc10439ab4a52b1ff3056a6213b22fe8c Oct 07 08:18:58 crc kubenswrapper[5025]: I1007 08:18:58.977051 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:18:58 crc kubenswrapper[5025]: E1007 08:18:58.978016 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 08:18:59.477966138 +0000 UTC m=+146.287280282 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2r94k" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.078595 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:18:59 crc kubenswrapper[5025]: E1007 08:18:59.079248 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:18:59.579230971 +0000 UTC m=+146.388545115 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.189667 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:18:59 crc kubenswrapper[5025]: E1007 08:18:59.191720 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 08:18:59.691696785 +0000 UTC m=+146.501010949 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2r94k" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.204977 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-29wzh" event={"ID":"4ea4607c-59be-4461-8fa4-9e22b87036f8","Type":"ContainerStarted","Data":"2884470c5f250c254798d26fc2ff0640836f955f9ab88de3029ff47705ae0d8d"} Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.225487 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fp4pg" event={"ID":"05686c29-75d0-4321-8136-39f8600afac1","Type":"ContainerStarted","Data":"139dc78b7ffd03cd4cac601012c96093197028a4dca2f12437cdf2c40203cc91"} Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.225607 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fp4pg" event={"ID":"05686c29-75d0-4321-8136-39f8600afac1","Type":"ContainerStarted","Data":"ee9192e80adec32a3b6652327bc6cf9c6c04a9c7e2d6fc3142c1ff283729536c"} Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.226861 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fp4pg" Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.232706 5025 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-fp4pg container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get 
\"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body= Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.232787 5025 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fp4pg" podUID="05686c29-75d0-4321-8136-39f8600afac1" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: connect: connection refused" Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.240737 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-bfxvx" event={"ID":"9ae035c8-618a-4a3b-afbc-e8173548cb32","Type":"ContainerStarted","Data":"431799c64423f3c798f10f31727c5ff336b03b169f25df0af04e2d2308d170b3"} Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.254593 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fp4pg" podStartSLOduration=125.254576871 podStartE2EDuration="2m5.254576871s" podCreationTimestamp="2025-10-07 08:16:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:18:59.252904737 +0000 UTC m=+146.062218891" watchObservedRunningTime="2025-10-07 08:18:59.254576871 +0000 UTC m=+146.063891015" Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.255209 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-2xfws" event={"ID":"0f09299f-24a4-4d5f-8ca9-704c678b8d23","Type":"ContainerStarted","Data":"f69f953959d07c1296d2c237c1bac0cf2de40888f4066294a16a54ec03b7992d"} Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.255840 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-2xfws" Oct 07 
08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.260419 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ltz52" event={"ID":"6846b045-d258-4dfe-8d50-1bc89b47389c","Type":"ContainerStarted","Data":"248335410bedd5870baa41c1c5937c8e2616b9bbb318aa9e0a819943ae3edb3d"} Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.261358 5025 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-2xfws container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.261403 5025 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-2xfws" podUID="0f09299f-24a4-4d5f-8ca9-704c678b8d23" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.267115 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xf6r2" event={"ID":"25634103-cd17-4005-8c8e-575d58e0826e","Type":"ContainerStarted","Data":"811ad3eb96412596a81a80708cb9b6df4b6a2b204384d314399f89c47b982cd5"} Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.269586 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rvr8c" event={"ID":"b3110e75-2479-4c2b-b96b-0f41a1f10cec","Type":"ContainerStarted","Data":"0c5c4416c983a5db6065c48873853936c3a1a77933de02ab95e21a090c5d5c3c"} Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.280910 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vkbl6" event={"ID":"c2bf1fe9-f8a4-4adb-a141-45bd491e0268","Type":"ContainerStarted","Data":"4e18cb84057d1784f00d9794c61fe4d47edc35c0c00d4ef580699025688488f1"} Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.282099 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-2xfws" podStartSLOduration=125.282089717 podStartE2EDuration="2m5.282089717s" podCreationTimestamp="2025-10-07 08:16:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:18:59.281960694 +0000 UTC m=+146.091274828" watchObservedRunningTime="2025-10-07 08:18:59.282089717 +0000 UTC m=+146.091403861" Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.291180 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jb77m" event={"ID":"784ef07a-3cbd-46f4-9ce5-90c1a6d75edc","Type":"ContainerStarted","Data":"99450a72b1209d9d757983abad53a5c4c903df5ae3d09d439a0413f8fdf6b2aa"} Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.293219 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:18:59 crc kubenswrapper[5025]: E1007 08:18:59.301567 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:18:59.801529464 +0000 UTC m=+146.610843618 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.312090 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vkbl6" podStartSLOduration=125.312071923 podStartE2EDuration="2m5.312071923s" podCreationTimestamp="2025-10-07 08:16:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:18:59.310004117 +0000 UTC m=+146.119318261" watchObservedRunningTime="2025-10-07 08:18:59.312071923 +0000 UTC m=+146.121386067" Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.335178 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xwd6l" event={"ID":"99b101e0-bca4-40fe-942f-7ab5d06be284","Type":"ContainerStarted","Data":"032d394ae613db9fe52ff0e7e50aa9510119b970e51419bb05e25d5365062254"} Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.335247 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xwd6l" event={"ID":"99b101e0-bca4-40fe-942f-7ab5d06be284","Type":"ContainerStarted","Data":"7eb013427793f7bbb9d719aa1508ae96da082d755a50ba5e1d000be65a8f5dbd"} Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.345799 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ltz52" 
podStartSLOduration=125.34578603 podStartE2EDuration="2m5.34578603s" podCreationTimestamp="2025-10-07 08:16:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:18:59.344654373 +0000 UTC m=+146.153968527" watchObservedRunningTime="2025-10-07 08:18:59.34578603 +0000 UTC m=+146.155100174" Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.347415 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qzlm7" event={"ID":"6a3782b5-7407-485b-a426-58413aba5747","Type":"ContainerStarted","Data":"39acabc4a0203520c6929f6b3a6279fd7798be4ff9f4919ca003235251bb84ff"} Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.370075 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lsrzs" event={"ID":"3376ab1d-924e-4d39-a393-c4f7a452845b","Type":"ContainerStarted","Data":"ef79ecef069aeb4186388588a1cbd810f6fab7603c5fcf90ae6ab6a42861767e"} Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.409794 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:18:59 crc kubenswrapper[5025]: E1007 08:18:59.413403 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 08:18:59.913192012 +0000 UTC m=+146.722506336 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2r94k" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.420359 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-hd4xd" event={"ID":"a31617bb-030b-48d7-b312-e7af9b052143","Type":"ContainerStarted","Data":"c06da8cdc623da971da852989668d6e65daaeed3f45108ce8ea2d07604fad1ca"} Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.425331 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-hd4xd" Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.441936 5025 patch_prober.go:28] interesting pod/downloads-7954f5f757-hd4xd container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.441990 5025 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hd4xd" podUID="a31617bb-030b-48d7-b312-e7af9b052143" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.450831 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-hd4xd" podStartSLOduration=125.450811764 podStartE2EDuration="2m5.450811764s" podCreationTimestamp="2025-10-07 08:16:54 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:18:59.450001637 +0000 UTC m=+146.259315781" watchObservedRunningTime="2025-10-07 08:18:59.450811764 +0000 UTC m=+146.260125908" Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.451015 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-2ttnw" event={"ID":"16df6e82-291b-4885-86a8-50276b3e7ef2","Type":"ContainerStarted","Data":"8bd27576fc8934e93093a6aac381ad734537aa613293ac204a781404786e1a8b"} Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.452282 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-2ttnw" Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.474750 5025 patch_prober.go:28] interesting pod/router-default-5444994796-pc6rv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 08:18:59 crc kubenswrapper[5025]: [-]has-synced failed: reason withheld Oct 07 08:18:59 crc kubenswrapper[5025]: [+]process-running ok Oct 07 08:18:59 crc kubenswrapper[5025]: healthz check failed Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.474827 5025 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pc6rv" podUID="16dbc3a0-2245-4de3-ab11-eb599a11f5fb" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.475181 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-hmlpq" event={"ID":"1e02ba47-76b7-4363-8ade-a9f8c42db920","Type":"ContainerStarted","Data":"28a36da72efb409e3db39a0bb4f1bcdc2a8cf04c11b687b17f4c9c1564618293"} Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 
08:18:59.485594 5025 patch_prober.go:28] interesting pod/console-operator-58897d9998-2ttnw container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.485871 5025 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-2ttnw" podUID="16df6e82-291b-4885-86a8-50276b3e7ef2" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused" Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.497715 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-2ttnw" podStartSLOduration=125.497694194 podStartE2EDuration="2m5.497694194s" podCreationTimestamp="2025-10-07 08:16:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:18:59.487261038 +0000 UTC m=+146.296575182" watchObservedRunningTime="2025-10-07 08:18:59.497694194 +0000 UTC m=+146.307008338" Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.508891 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-795lz" event={"ID":"702301a4-cf98-4786-8649-d6e396c775a4","Type":"ContainerStarted","Data":"88ab3e5ae093a5a8e711c7f386f89fbb2aaac4c0b0fd960eda8b62a6ff8cfc6a"} Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.513130 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " 
Oct 07 08:18:59 crc kubenswrapper[5025]: E1007 08:18:59.514823 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:19:00.014792125 +0000 UTC m=+146.824106269 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.541087 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330415-w9ctz" event={"ID":"385983c4-874d-4095-980c-c2d3763ce8e1","Type":"ContainerStarted","Data":"25f0c35513ee88471c8e3ce7e7411ad3aed2011f8c28b430e970c411fbe8aa05"} Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.558605 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sjgwg" event={"ID":"debaf386-9a77-4774-b7fe-b439a1406621","Type":"ContainerStarted","Data":"3fbd4d4d37b4baed593b43be4fc3e83c49a7d6b32663911d2eb558df892443b1"} Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.558654 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sjgwg" event={"ID":"debaf386-9a77-4774-b7fe-b439a1406621","Type":"ContainerStarted","Data":"4c51aa013095ff4687fee9fda480a6d953be827efc869e1b3facba94f6b91f4f"} Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.561937 5025 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sjgwg" Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.602304 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-hmlpq" podStartSLOduration=125.602285084 podStartE2EDuration="2m5.602285084s" podCreationTimestamp="2025-10-07 08:16:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:18:59.517980388 +0000 UTC m=+146.327294532" watchObservedRunningTime="2025-10-07 08:18:59.602285084 +0000 UTC m=+146.411599228" Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.613863 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-2z5w6" event={"ID":"66353c4f-6d67-4155-8b97-5f27145eabdd","Type":"ContainerStarted","Data":"65e96574ca643cbb26416d5ff322c54f32aa82881f2ead72ae3c8d0bf4de0643"} Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.616103 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:18:59 crc kubenswrapper[5025]: E1007 08:18:59.617753 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 08:19:00.117732892 +0000 UTC m=+146.927047216 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2r94k" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.630400 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-h4xpz" event={"ID":"e63d4e4f-5fe5-43b2-a7af-8893da0ed732","Type":"ContainerStarted","Data":"b908ea38f502e89b3962b8d457514e7c6bfc37774adae618787b18acfb22afde"} Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.637772 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-npvk6" event={"ID":"77dd254d-69a1-4f60-9587-e59f9094626a","Type":"ContainerStarted","Data":"a69b7a6e0167a890ac08e216cf07feffc10439ab4a52b1ff3056a6213b22fe8c"} Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.651289 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-2z5w6" podStartSLOduration=125.651270112 podStartE2EDuration="2m5.651270112s" podCreationTimestamp="2025-10-07 08:16:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:18:59.650509977 +0000 UTC m=+146.459824151" watchObservedRunningTime="2025-10-07 08:18:59.651270112 +0000 UTC m=+146.460584256" Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.674014 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wcnzt" 
event={"ID":"6a140c8d-40f7-462f-8ffd-e1ea2fa4a1e2","Type":"ContainerStarted","Data":"b9460ef7bce50928049b81e98e07dc02ecfce396e990b7791d9b4c1ee92a5dbd"} Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.676249 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wcnzt" Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.678870 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sjgwg" podStartSLOduration=125.67883146 podStartE2EDuration="2m5.67883146s" podCreationTimestamp="2025-10-07 08:16:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:18:59.602873043 +0000 UTC m=+146.412187197" watchObservedRunningTime="2025-10-07 08:18:59.67883146 +0000 UTC m=+146.488145604" Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.680956 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-h4xpz" podStartSLOduration=125.680948529 podStartE2EDuration="2m5.680948529s" podCreationTimestamp="2025-10-07 08:16:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:18:59.67601525 +0000 UTC m=+146.485329394" watchObservedRunningTime="2025-10-07 08:18:59.680948529 +0000 UTC m=+146.490262673" Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.689575 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z8xqx" event={"ID":"9d71f73a-264b-459b-9d99-100a71204e60","Type":"ContainerStarted","Data":"5136d8384636c6871f4fdb702e8d16e8c8a0ec262488ebd1783113d5510dc4f9"} Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.689643 5025 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z8xqx" event={"ID":"9d71f73a-264b-459b-9d99-100a71204e60","Type":"ContainerStarted","Data":"3e059597d4cc31b02b0a0c70d50b233f153243f850368f1a34c113c4935b41d1"} Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.690595 5025 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-wcnzt container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" start-of-body= Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.690644 5025 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wcnzt" podUID="6a140c8d-40f7-462f-8ffd-e1ea2fa4a1e2" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.693894 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q57jq" event={"ID":"2010515f-af73-42cf-a3fd-e2a76508e9e0","Type":"ContainerStarted","Data":"8801c29f5d29d609120bbd8dac5e4298df8ce0f1476a7f83cfc3edf72efdd370"} Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.697834 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-kpss8" event={"ID":"fb9a1e63-7cc6-48ff-b5a4-7f3e54fb626d","Type":"ContainerStarted","Data":"f40e6d1bcbb393e4286d73b69f40b30c07b66ba35cb0ce334ed441f01872e9b4"} Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.697862 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-kpss8" 
event={"ID":"fb9a1e63-7cc6-48ff-b5a4-7f3e54fb626d","Type":"ContainerStarted","Data":"6eec625a52f8c8454dea43da259403be63bef08b92f7a411b299876bd2707458"} Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.737768 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wcnzt" podStartSLOduration=125.737696257 podStartE2EDuration="2m5.737696257s" podCreationTimestamp="2025-10-07 08:16:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:18:59.72630812 +0000 UTC m=+146.535622264" watchObservedRunningTime="2025-10-07 08:18:59.737696257 +0000 UTC m=+146.547010401" Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.739106 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:18:59 crc kubenswrapper[5025]: E1007 08:18:59.739819 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:19:00.239770514 +0000 UTC m=+147.049084648 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.740504 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:18:59 crc kubenswrapper[5025]: E1007 08:18:59.744932 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 08:19:00.244912299 +0000 UTC m=+147.054226443 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2r94k" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.759714 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-x9t4l" event={"ID":"e8ebca0a-96ab-452a-a93e-7041967c40d1","Type":"ContainerStarted","Data":"78d734e2661706add9d0727e094e5b077e73184ac12221c2f7fef3c8e031fac9"} Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.772242 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5m9cc" event={"ID":"cafb7a37-dac6-45f8-9bf9-d8820d7dedaa","Type":"ContainerStarted","Data":"64aa927a975cc6d2307842a53b76029beddf1afa7f297123483c36849ba8a69c"} Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.781682 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8czpq" event={"ID":"94b36961-ca5f-4fac-a744-118761859a72","Type":"ContainerStarted","Data":"31a412e8272fc23cd4fbf5500a71c8831022ba9d87ee3b7e197775bd4a95549e"} Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.808887 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-9qr74" event={"ID":"52a8a68e-3997-4206-b8df-3650699f2ed7","Type":"ContainerStarted","Data":"46c661df6e3f07bb9084ed8f3f8047c5f898d2e4af5a1db937494dfe6cd9de9f"} Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.827751 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-dns-operator/dns-operator-744455d44c-kpss8" podStartSLOduration=125.827732338 podStartE2EDuration="2m5.827732338s" podCreationTimestamp="2025-10-07 08:16:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:18:59.810079759 +0000 UTC m=+146.619393903" watchObservedRunningTime="2025-10-07 08:18:59.827732338 +0000 UTC m=+146.637046482" Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.847718 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:18:59 crc kubenswrapper[5025]: E1007 08:18:59.850742 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:19:00.350708268 +0000 UTC m=+147.160022412 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.858842 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-b6pkw" event={"ID":"9fb37bc8-4d3f-438c-850d-3b800b786f95","Type":"ContainerStarted","Data":"fe4c26475d526eba3d08f100aa5f3af0d6508c4aad1a37c7d9e7382c5b73f02c"} Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.892393 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8lwpw" event={"ID":"109ab8c4-3dcd-49a9-a966-4ad68758f46a","Type":"ContainerStarted","Data":"0fa2200eed7cd0d962d60f5f7f03c274f6c174d7db5eb5441debccf11a77ce75"} Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.908111 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-hpfzj" Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.923509 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q57jq" podStartSLOduration=125.923482073 podStartE2EDuration="2m5.923482073s" podCreationTimestamp="2025-10-07 08:16:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:18:59.854009634 +0000 UTC m=+146.663323778" watchObservedRunningTime="2025-10-07 08:18:59.923482073 +0000 UTC m=+146.732796217" Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 
08:18:59.923654 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z8xqx" podStartSLOduration=125.923650849 podStartE2EDuration="2m5.923650849s" podCreationTimestamp="2025-10-07 08:16:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:18:59.907767487 +0000 UTC m=+146.717081631" watchObservedRunningTime="2025-10-07 08:18:59.923650849 +0000 UTC m=+146.732964993" Oct 07 08:18:59 crc kubenswrapper[5025]: I1007 08:18:59.950745 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:18:59 crc kubenswrapper[5025]: E1007 08:18:59.951062 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 08:19:00.451046561 +0000 UTC m=+147.260360705 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2r94k" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.007269 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8lwpw" podStartSLOduration=126.007243791 podStartE2EDuration="2m6.007243791s" podCreationTimestamp="2025-10-07 08:16:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:18:59.948636263 +0000 UTC m=+146.757950407" watchObservedRunningTime="2025-10-07 08:19:00.007243791 +0000 UTC m=+146.816557935" Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.009348 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-9qr74" podStartSLOduration=7.009338209 podStartE2EDuration="7.009338209s" podCreationTimestamp="2025-10-07 08:18:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:18:59.993295592 +0000 UTC m=+146.802609736" watchObservedRunningTime="2025-10-07 08:19:00.009338209 +0000 UTC m=+146.818652353" Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.013921 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5m9cc" podStartSLOduration=126.013901266 podStartE2EDuration="2m6.013901266s" podCreationTimestamp="2025-10-07 08:16:54 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:19:00.010957031 +0000 UTC m=+146.820271175" watchObservedRunningTime="2025-10-07 08:19:00.013901266 +0000 UTC m=+146.823215410" Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.054069 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:19:00 crc kubenswrapper[5025]: E1007 08:19:00.054480 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:19:00.554460813 +0000 UTC m=+147.363774957 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.156046 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:19:00 crc kubenswrapper[5025]: E1007 08:19:00.156577 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 08:19:00.656523522 +0000 UTC m=+147.465837726 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2r94k" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.257740 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:19:00 crc kubenswrapper[5025]: E1007 08:19:00.257910 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:19:00.757884367 +0000 UTC m=+147.567198511 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.258149 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:19:00 crc kubenswrapper[5025]: E1007 08:19:00.258458 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 08:19:00.758450345 +0000 UTC m=+147.567764489 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2r94k" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.359307 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:19:00 crc kubenswrapper[5025]: E1007 08:19:00.359507 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:19:00.85947668 +0000 UTC m=+147.668790824 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.359848 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:19:00 crc kubenswrapper[5025]: E1007 08:19:00.360263 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 08:19:00.860252236 +0000 UTC m=+147.669566430 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2r94k" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.460673 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:19:00 crc kubenswrapper[5025]: E1007 08:19:00.460845 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:19:00.960823035 +0000 UTC m=+147.770137179 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.461258 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:19:00 crc kubenswrapper[5025]: E1007 08:19:00.461571 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 08:19:00.961559969 +0000 UTC m=+147.770874113 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2r94k" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.465995 5025 patch_prober.go:28] interesting pod/router-default-5444994796-pc6rv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 08:19:00 crc kubenswrapper[5025]: [-]has-synced failed: reason withheld Oct 07 08:19:00 crc kubenswrapper[5025]: [+]process-running ok Oct 07 08:19:00 crc kubenswrapper[5025]: healthz check failed Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.466056 5025 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pc6rv" podUID="16dbc3a0-2245-4de3-ab11-eb599a11f5fb" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.562981 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:19:00 crc kubenswrapper[5025]: E1007 08:19:00.563172 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-07 08:19:01.063145323 +0000 UTC m=+147.872459467 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.563219 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:19:00 crc kubenswrapper[5025]: E1007 08:19:00.563694 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 08:19:01.06368471 +0000 UTC m=+147.872998854 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2r94k" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.577727 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-2z5w6" Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.577792 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-2z5w6" Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.579820 5025 patch_prober.go:28] interesting pod/apiserver-76f77b778f-2z5w6 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.6:8443/livez\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.579868 5025 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-2z5w6" podUID="66353c4f-6d67-4155-8b97-5f27145eabdd" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.6:8443/livez\": dial tcp 10.217.0.6:8443: connect: connection refused" Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.639606 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qpzs2" Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.639665 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qpzs2" Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.650190 
5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qpzs2" Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.664701 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:19:00 crc kubenswrapper[5025]: E1007 08:19:00.665016 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:19:01.165000584 +0000 UTC m=+147.974314728 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.766637 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:19:00 crc kubenswrapper[5025]: E1007 08:19:00.766998 5025 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 08:19:01.26698309 +0000 UTC m=+148.076297234 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2r94k" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.868159 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:19:00 crc kubenswrapper[5025]: E1007 08:19:00.868316 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:19:01.368292905 +0000 UTC m=+148.177607059 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.868687 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:19:00 crc kubenswrapper[5025]: E1007 08:19:00.869079 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 08:19:01.369063709 +0000 UTC m=+148.178377853 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2r94k" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.908557 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xwd6l" event={"ID":"99b101e0-bca4-40fe-942f-7ab5d06be284","Type":"ContainerStarted","Data":"03deca579a5910385ba9f6e3f3967deaea654e507112ce36010e17db369e663c"} Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.911745 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-795lz" event={"ID":"702301a4-cf98-4786-8649-d6e396c775a4","Type":"ContainerStarted","Data":"bc4125ab96c7deb2e53bf6706f10176a5616bf7a83514b106d391eb748d04dea"} Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.912403 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-795lz" event={"ID":"702301a4-cf98-4786-8649-d6e396c775a4","Type":"ContainerStarted","Data":"256631b9e9aa76e277e057acfce03864a985ce92a6a85fec0c58d67796759023"} Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.912461 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-795lz" Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.916394 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z8xqx" event={"ID":"9d71f73a-264b-459b-9d99-100a71204e60","Type":"ContainerStarted","Data":"ed510eec02e6c247c568054134a50cdcf46136c8757398ed8713bbd000c8451b"} Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.923762 5025 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-npvk6" event={"ID":"77dd254d-69a1-4f60-9587-e59f9094626a","Type":"ContainerStarted","Data":"a8f42f219370c1da78f41ecd0746e3e33747d0e0fd959cfbb9313d6c4999f87b"} Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.926660 5025 generic.go:334] "Generic (PLEG): container finished" podID="3376ab1d-924e-4d39-a393-c4f7a452845b" containerID="ef79ecef069aeb4186388588a1cbd810f6fab7603c5fcf90ae6ab6a42861767e" exitCode=0 Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.926821 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lsrzs" event={"ID":"3376ab1d-924e-4d39-a393-c4f7a452845b","Type":"ContainerDied","Data":"ef79ecef069aeb4186388588a1cbd810f6fab7603c5fcf90ae6ab6a42861767e"} Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.926866 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lsrzs" event={"ID":"3376ab1d-924e-4d39-a393-c4f7a452845b","Type":"ContainerStarted","Data":"f3b95b4b3be20cfa1f6a3e304049a7b14431b466b9935a6c64f23f6c9cb31d90"} Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.926933 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lsrzs" Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.929202 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rvr8c" event={"ID":"b3110e75-2479-4c2b-b96b-0f41a1f10cec","Type":"ContainerStarted","Data":"71a91b22bfc8efb92eab949a6853e02f3737245d3797041ebea82d509f4b7c45"} Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.929375 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-rvr8c" Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 
08:19:00.931475 5025 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-rvr8c container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.931522 5025 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-rvr8c" podUID="b3110e75-2479-4c2b-b96b-0f41a1f10cec" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.933327 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xf6r2" event={"ID":"25634103-cd17-4005-8c8e-575d58e0826e","Type":"ContainerStarted","Data":"4a5796169848a34343401ce05ce65665edb079c510f711f0ca24eaca172608c9"} Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.934927 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-b6pkw" event={"ID":"9fb37bc8-4d3f-438c-850d-3b800b786f95","Type":"ContainerStarted","Data":"4392df6e5350630286e482c961ffb2d257180138fe1867801914e14cc044ea77"} Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.939926 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330415-w9ctz" event={"ID":"385983c4-874d-4095-980c-c2d3763ce8e1","Type":"ContainerStarted","Data":"7c67b4e23b0192671f5392e63c59ff39f04a074f5d8d975a0c03f4e5f8144751"} Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.942689 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qzlm7" 
event={"ID":"6a3782b5-7407-485b-a426-58413aba5747","Type":"ContainerStarted","Data":"5dd33ed69e6bb63fabf41dc2a806cff5cd5516841a62b85da2911730d77e1fec"} Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.942893 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qzlm7" Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.944772 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8czpq" event={"ID":"94b36961-ca5f-4fac-a744-118761859a72","Type":"ContainerStarted","Data":"f95e6ef057d18969e4aed588d3b762b6100960370739aef072e51ef7c92cc439"} Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.945469 5025 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-qzlm7 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:5443/healthz\": dial tcp 10.217.0.34:5443: connect: connection refused" start-of-body= Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.945513 5025 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qzlm7" podUID="6a3782b5-7407-485b-a426-58413aba5747" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.34:5443/healthz\": dial tcp 10.217.0.34:5443: connect: connection refused" Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.947091 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-29wzh" event={"ID":"4ea4607c-59be-4461-8fa4-9e22b87036f8","Type":"ContainerStarted","Data":"ad0e9d62dbc29cea9f8044c2f785dee8fd378988bb05244791607339c197c994"} Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.950056 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-bfxvx" 
event={"ID":"9ae035c8-618a-4a3b-afbc-e8173548cb32","Type":"ContainerStarted","Data":"10b55f1d69b68b2da2ec785848bf3d247ffd033088678e2c0e27522c2fd2e9dd"} Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.950130 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-bfxvx" event={"ID":"9ae035c8-618a-4a3b-afbc-e8173548cb32","Type":"ContainerStarted","Data":"7e196dd11b9adbc561bb03b3d6a71b48cf9fcc7871c487401cd3830a17455942"} Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.963058 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xwd6l" podStartSLOduration=126.963037537 podStartE2EDuration="2m6.963037537s" podCreationTimestamp="2025-10-07 08:16:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:19:00.962286343 +0000 UTC m=+147.771600487" watchObservedRunningTime="2025-10-07 08:19:00.963037537 +0000 UTC m=+147.772351681" Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.969776 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:19:00 crc kubenswrapper[5025]: E1007 08:19:00.969982 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:19:01.469941839 +0000 UTC m=+148.279255983 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.970125 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:19:00 crc kubenswrapper[5025]: E1007 08:19:00.970574 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 08:19:01.47055328 +0000 UTC m=+148.279867424 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2r94k" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.973770 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5m9cc" event={"ID":"cafb7a37-dac6-45f8-9bf9-d8820d7dedaa","Type":"ContainerStarted","Data":"282071226e85d78b3b279be2abba6e9e5bea0be2f38b1b20d4738cff35af54f4"} Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.986301 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wcnzt" event={"ID":"6a140c8d-40f7-462f-8ffd-e1ea2fa4a1e2","Type":"ContainerStarted","Data":"2996b0dcaf9a364a6aad2225017e5994286a4b2909d92e4f341db470ea942733"} Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.987652 5025 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-wcnzt container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" start-of-body= Oct 07 08:19:00 crc kubenswrapper[5025]: I1007 08:19:00.987691 5025 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wcnzt" podUID="6a140c8d-40f7-462f-8ffd-e1ea2fa4a1e2" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" Oct 07 08:19:01 crc kubenswrapper[5025]: I1007 08:19:01.026165 5025 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-2z5w6" event={"ID":"66353c4f-6d67-4155-8b97-5f27145eabdd","Type":"ContainerStarted","Data":"9691f08cb66eed0ea7600a7257996a3a785c35aa9cb7790823c661040b505a9b"} Oct 07 08:19:01 crc kubenswrapper[5025]: I1007 08:19:01.064223 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-x9t4l" event={"ID":"e8ebca0a-96ab-452a-a93e-7041967c40d1","Type":"ContainerStarted","Data":"f9b2d0d0ddc6d97800bd5dbfa61efe2fc093b1ad2a5bec036ecac17dce68bb4e"} Oct 07 08:19:01 crc kubenswrapper[5025]: I1007 08:19:01.064318 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-x9t4l" event={"ID":"e8ebca0a-96ab-452a-a93e-7041967c40d1","Type":"ContainerStarted","Data":"3fc1b37c2d178f25d179899a661849f6c358eb6ee85a9c15243a08686ac7f747"} Oct 07 08:19:01 crc kubenswrapper[5025]: I1007 08:19:01.067676 5025 patch_prober.go:28] interesting pod/downloads-7954f5f757-hd4xd container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Oct 07 08:19:01 crc kubenswrapper[5025]: I1007 08:19:01.067735 5025 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hd4xd" podUID="a31617bb-030b-48d7-b312-e7af9b052143" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Oct 07 08:19:01 crc kubenswrapper[5025]: I1007 08:19:01.072297 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qpzs2" Oct 07 08:19:01 crc kubenswrapper[5025]: I1007 08:19:01.073255 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:19:01 crc kubenswrapper[5025]: E1007 08:19:01.073622 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:19:01.573593549 +0000 UTC m=+148.382907703 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:01 crc kubenswrapper[5025]: I1007 08:19:01.086944 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-2xfws" Oct 07 08:19:01 crc kubenswrapper[5025]: I1007 08:19:01.090009 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-795lz" podStartSLOduration=8.089984718 podStartE2EDuration="8.089984718s" podCreationTimestamp="2025-10-07 08:18:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:19:01.067883245 +0000 UTC m=+147.877197399" watchObservedRunningTime="2025-10-07 08:19:01.089984718 +0000 UTC m=+147.899298862" Oct 07 08:19:01 crc kubenswrapper[5025]: I1007 08:19:01.100859 5025 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:19:01 crc kubenswrapper[5025]: E1007 08:19:01.101609 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 08:19:01.601593051 +0000 UTC m=+148.410907195 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2r94k" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:01 crc kubenswrapper[5025]: I1007 08:19:01.125521 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fp4pg" Oct 07 08:19:01 crc kubenswrapper[5025]: I1007 08:19:01.202846 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:19:01 crc kubenswrapper[5025]: E1007 08:19:01.202992 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:19:01.702960537 +0000 UTC m=+148.512274681 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:01 crc kubenswrapper[5025]: I1007 08:19:01.203553 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:19:01 crc kubenswrapper[5025]: E1007 08:19:01.205992 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 08:19:01.705976345 +0000 UTC m=+148.515290489 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2r94k" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:01 crc kubenswrapper[5025]: I1007 08:19:01.209434 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-rvr8c" podStartSLOduration=127.209418206 podStartE2EDuration="2m7.209418206s" podCreationTimestamp="2025-10-07 08:16:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:19:01.161898204 +0000 UTC m=+147.971212348" watchObservedRunningTime="2025-10-07 08:19:01.209418206 +0000 UTC m=+148.018732350" Oct 07 08:19:01 crc kubenswrapper[5025]: I1007 08:19:01.215221 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-2ttnw" Oct 07 08:19:01 crc kubenswrapper[5025]: I1007 08:19:01.240882 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29330415-w9ctz" podStartSLOduration=127.240860228 podStartE2EDuration="2m7.240860228s" podCreationTimestamp="2025-10-07 08:16:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:19:01.240457285 +0000 UTC m=+148.049771429" watchObservedRunningTime="2025-10-07 08:19:01.240860228 +0000 UTC m=+148.050174372" Oct 07 08:19:01 crc kubenswrapper[5025]: I1007 08:19:01.242284 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-service-ca/service-ca-9c57cc56f-npvk6" podStartSLOduration=127.242278144 podStartE2EDuration="2m7.242278144s" podCreationTimestamp="2025-10-07 08:16:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:19:01.209339942 +0000 UTC m=+148.018654086" watchObservedRunningTime="2025-10-07 08:19:01.242278144 +0000 UTC m=+148.051592288" Oct 07 08:19:01 crc kubenswrapper[5025]: I1007 08:19:01.281806 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xf6r2" podStartSLOduration=127.281785657 podStartE2EDuration="2m7.281785657s" podCreationTimestamp="2025-10-07 08:16:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:19:01.280304989 +0000 UTC m=+148.089619143" watchObservedRunningTime="2025-10-07 08:19:01.281785657 +0000 UTC m=+148.091099801" Oct 07 08:19:01 crc kubenswrapper[5025]: I1007 08:19:01.311086 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:19:01 crc kubenswrapper[5025]: E1007 08:19:01.311276 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:19:01.811244286 +0000 UTC m=+148.620558430 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:01 crc kubenswrapper[5025]: I1007 08:19:01.311368 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:19:01 crc kubenswrapper[5025]: E1007 08:19:01.311743 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 08:19:01.811730982 +0000 UTC m=+148.621045116 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2r94k" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:01 crc kubenswrapper[5025]: I1007 08:19:01.321723 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-29wzh" podStartSLOduration=127.321682572 podStartE2EDuration="2m7.321682572s" podCreationTimestamp="2025-10-07 08:16:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:19:01.320333669 +0000 UTC m=+148.129647813" watchObservedRunningTime="2025-10-07 08:19:01.321682572 +0000 UTC m=+148.130996726" Oct 07 08:19:01 crc kubenswrapper[5025]: I1007 08:19:01.413052 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:19:01 crc kubenswrapper[5025]: E1007 08:19:01.413197 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:19:01.913172671 +0000 UTC m=+148.722486815 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:01 crc kubenswrapper[5025]: I1007 08:19:01.413282 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:19:01 crc kubenswrapper[5025]: E1007 08:19:01.413578 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 08:19:01.913571084 +0000 UTC m=+148.722885228 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2r94k" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:01 crc kubenswrapper[5025]: I1007 08:19:01.427445 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qzlm7" podStartSLOduration=127.42742552 podStartE2EDuration="2m7.42742552s" podCreationTimestamp="2025-10-07 08:16:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:19:01.361165625 +0000 UTC m=+148.170479789" watchObservedRunningTime="2025-10-07 08:19:01.42742552 +0000 UTC m=+148.236739664" Oct 07 08:19:01 crc kubenswrapper[5025]: I1007 08:19:01.471753 5025 patch_prober.go:28] interesting pod/router-default-5444994796-pc6rv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 08:19:01 crc kubenswrapper[5025]: [-]has-synced failed: reason withheld Oct 07 08:19:01 crc kubenswrapper[5025]: [+]process-running ok Oct 07 08:19:01 crc kubenswrapper[5025]: healthz check failed Oct 07 08:19:01 crc kubenswrapper[5025]: I1007 08:19:01.471820 5025 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pc6rv" podUID="16dbc3a0-2245-4de3-ab11-eb599a11f5fb" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 08:19:01 crc kubenswrapper[5025]: I1007 08:19:01.500123 5025 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-b6pkw" podStartSLOduration=127.500105761 podStartE2EDuration="2m7.500105761s" podCreationTimestamp="2025-10-07 08:16:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:19:01.426982665 +0000 UTC m=+148.236296819" watchObservedRunningTime="2025-10-07 08:19:01.500105761 +0000 UTC m=+148.309419895" Oct 07 08:19:01 crc kubenswrapper[5025]: I1007 08:19:01.514934 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:19:01 crc kubenswrapper[5025]: E1007 08:19:01.515140 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:19:02.015113045 +0000 UTC m=+148.824427189 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:01 crc kubenswrapper[5025]: I1007 08:19:01.515302 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:19:01 crc kubenswrapper[5025]: E1007 08:19:01.515658 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 08:19:02.015648922 +0000 UTC m=+148.824963146 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2r94k" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:01 crc kubenswrapper[5025]: I1007 08:19:01.567077 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-8czpq" podStartSLOduration=8.567059098 podStartE2EDuration="8.567059098s" podCreationTimestamp="2025-10-07 08:18:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:19:01.565846659 +0000 UTC m=+148.375160803" watchObservedRunningTime="2025-10-07 08:19:01.567059098 +0000 UTC m=+148.376373242" Oct 07 08:19:01 crc kubenswrapper[5025]: I1007 08:19:01.567779 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-bfxvx" podStartSLOduration=127.567774972 podStartE2EDuration="2m7.567774972s" podCreationTimestamp="2025-10-07 08:16:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:19:01.501449414 +0000 UTC m=+148.310763558" watchObservedRunningTime="2025-10-07 08:19:01.567774972 +0000 UTC m=+148.377089116" Oct 07 08:19:01 crc kubenswrapper[5025]: I1007 08:19:01.598219 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lsrzs" podStartSLOduration=127.598200602 podStartE2EDuration="2m7.598200602s" podCreationTimestamp="2025-10-07 08:16:54 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:19:01.596524208 +0000 UTC m=+148.405838352" watchObservedRunningTime="2025-10-07 08:19:01.598200602 +0000 UTC m=+148.407514746" Oct 07 08:19:01 crc kubenswrapper[5025]: I1007 08:19:01.616319 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:19:01 crc kubenswrapper[5025]: E1007 08:19:01.616484 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:19:02.11645914 +0000 UTC m=+148.925773284 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:01 crc kubenswrapper[5025]: I1007 08:19:01.616701 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:19:01 crc kubenswrapper[5025]: E1007 08:19:01.617026 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 08:19:02.117013398 +0000 UTC m=+148.926327542 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2r94k" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:01 crc kubenswrapper[5025]: I1007 08:19:01.717957 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:19:01 crc kubenswrapper[5025]: E1007 08:19:01.718192 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:19:02.218158447 +0000 UTC m=+149.027472591 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:01 crc kubenswrapper[5025]: I1007 08:19:01.718300 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:19:01 crc kubenswrapper[5025]: E1007 08:19:01.718658 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 08:19:02.218643782 +0000 UTC m=+149.027957926 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2r94k" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:01 crc kubenswrapper[5025]: I1007 08:19:01.756795 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-x9t4l" podStartSLOduration=127.756777381 podStartE2EDuration="2m7.756777381s" podCreationTimestamp="2025-10-07 08:16:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:19:01.689242545 +0000 UTC m=+148.498556689" watchObservedRunningTime="2025-10-07 08:19:01.756777381 +0000 UTC m=+148.566091525" Oct 07 08:19:01 crc kubenswrapper[5025]: I1007 08:19:01.824744 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:19:01 crc kubenswrapper[5025]: E1007 08:19:01.824936 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:19:02.324908866 +0000 UTC m=+149.134223010 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:01 crc kubenswrapper[5025]: I1007 08:19:01.824978 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:19:01 crc kubenswrapper[5025]: I1007 08:19:01.825087 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:19:01 crc kubenswrapper[5025]: I1007 08:19:01.825118 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:19:01 crc kubenswrapper[5025]: E1007 08:19:01.825675 5025 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 08:19:02.32564756 +0000 UTC m=+149.134961704 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2r94k" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:01 crc kubenswrapper[5025]: I1007 08:19:01.826158 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:19:01 crc kubenswrapper[5025]: I1007 08:19:01.838896 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:19:01 crc kubenswrapper[5025]: I1007 08:19:01.931626 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:19:01 crc 
kubenswrapper[5025]: I1007 08:19:01.931843 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 08:19:01 crc kubenswrapper[5025]: I1007 08:19:01.932088 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:19:01 crc kubenswrapper[5025]: E1007 08:19:01.933876 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:19:02.433836936 +0000 UTC m=+149.243151080 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:01 crc kubenswrapper[5025]: I1007 08:19:01.942637 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:19:01 crc kubenswrapper[5025]: I1007 08:19:01.949295 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 08:19:01 crc kubenswrapper[5025]: I1007 08:19:01.957793 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:19:01 crc kubenswrapper[5025]: I1007 08:19:01.968815 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 08:19:02 crc kubenswrapper[5025]: I1007 08:19:02.042031 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:19:02 crc kubenswrapper[5025]: E1007 08:19:02.042366 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 08:19:02.542353793 +0000 UTC m=+149.351667937 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2r94k" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:02 crc kubenswrapper[5025]: I1007 08:19:02.130441 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jb77m" event={"ID":"784ef07a-3cbd-46f4-9ce5-90c1a6d75edc","Type":"ContainerStarted","Data":"b0fc219e171d60d0bd349b459af11693480fa3b08397a27095e2eac83dbcb4d8"} Oct 07 08:19:02 crc kubenswrapper[5025]: I1007 08:19:02.139962 5025 generic.go:334] "Generic (PLEG): container finished" podID="385983c4-874d-4095-980c-c2d3763ce8e1" containerID="7c67b4e23b0192671f5392e63c59ff39f04a074f5d8d975a0c03f4e5f8144751" exitCode=0 Oct 07 08:19:02 crc kubenswrapper[5025]: I1007 
08:19:02.140993 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330415-w9ctz" event={"ID":"385983c4-874d-4095-980c-c2d3763ce8e1","Type":"ContainerDied","Data":"7c67b4e23b0192671f5392e63c59ff39f04a074f5d8d975a0c03f4e5f8144751"} Oct 07 08:19:02 crc kubenswrapper[5025]: I1007 08:19:02.143663 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:19:02 crc kubenswrapper[5025]: E1007 08:19:02.143872 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:19:02.643851563 +0000 UTC m=+149.453165707 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:02 crc kubenswrapper[5025]: I1007 08:19:02.144274 5025 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-rvr8c container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Oct 07 08:19:02 crc kubenswrapper[5025]: I1007 08:19:02.144312 5025 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-rvr8c" podUID="b3110e75-2479-4c2b-b96b-0f41a1f10cec" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" Oct 07 08:19:02 crc kubenswrapper[5025]: I1007 08:19:02.144333 5025 patch_prober.go:28] interesting pod/downloads-7954f5f757-hd4xd container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Oct 07 08:19:02 crc kubenswrapper[5025]: I1007 08:19:02.144404 5025 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hd4xd" podUID="a31617bb-030b-48d7-b312-e7af9b052143" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Oct 07 08:19:02 crc kubenswrapper[5025]: I1007 08:19:02.150912 5025 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wcnzt" Oct 07 08:19:02 crc kubenswrapper[5025]: I1007 08:19:02.240363 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 08:19:02 crc kubenswrapper[5025]: I1007 08:19:02.246216 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:19:02 crc kubenswrapper[5025]: E1007 08:19:02.247518 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 08:19:02.747497942 +0000 UTC m=+149.556812186 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2r94k" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:02 crc kubenswrapper[5025]: I1007 08:19:02.347498 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:19:02 crc kubenswrapper[5025]: E1007 08:19:02.347639 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:19:02.847620627 +0000 UTC m=+149.656934771 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:02 crc kubenswrapper[5025]: I1007 08:19:02.347802 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:19:02 crc kubenswrapper[5025]: E1007 08:19:02.348070 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 08:19:02.848062832 +0000 UTC m=+149.657376976 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2r94k" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:02 crc kubenswrapper[5025]: I1007 08:19:02.448697 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:19:02 crc kubenswrapper[5025]: E1007 08:19:02.448996 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:19:02.948978844 +0000 UTC m=+149.758292988 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:02 crc kubenswrapper[5025]: I1007 08:19:02.475114 5025 patch_prober.go:28] interesting pod/router-default-5444994796-pc6rv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 08:19:02 crc kubenswrapper[5025]: [-]has-synced failed: reason withheld Oct 07 08:19:02 crc kubenswrapper[5025]: [+]process-running ok Oct 07 08:19:02 crc kubenswrapper[5025]: healthz check failed Oct 07 08:19:02 crc kubenswrapper[5025]: I1007 08:19:02.475184 5025 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pc6rv" podUID="16dbc3a0-2245-4de3-ab11-eb599a11f5fb" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 08:19:02 crc kubenswrapper[5025]: I1007 08:19:02.550160 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:19:02 crc kubenswrapper[5025]: E1007 08:19:02.551516 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-07 08:19:03.051500616 +0000 UTC m=+149.860814760 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2r94k" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:02 crc kubenswrapper[5025]: I1007 08:19:02.654071 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:19:02 crc kubenswrapper[5025]: E1007 08:19:02.654437 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:19:03.154417673 +0000 UTC m=+149.963731817 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:02 crc kubenswrapper[5025]: I1007 08:19:02.756372 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:19:02 crc kubenswrapper[5025]: E1007 08:19:02.756685 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 08:19:03.256672587 +0000 UTC m=+150.065986731 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2r94k" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:02 crc kubenswrapper[5025]: I1007 08:19:02.852994 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lp7dp"] Oct 07 08:19:02 crc kubenswrapper[5025]: I1007 08:19:02.854225 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lp7dp" Oct 07 08:19:02 crc kubenswrapper[5025]: I1007 08:19:02.857622 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:19:02 crc kubenswrapper[5025]: E1007 08:19:02.858011 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:19:03.357992431 +0000 UTC m=+150.167306575 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:02 crc kubenswrapper[5025]: I1007 08:19:02.897632 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 07 08:19:02 crc kubenswrapper[5025]: I1007 08:19:02.903308 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lp7dp"] Oct 07 08:19:02 crc kubenswrapper[5025]: I1007 08:19:02.958692 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjbf8\" (UniqueName: \"kubernetes.io/projected/77520001-a1f9-4018-bc4c-2964849da6c7-kube-api-access-qjbf8\") pod \"community-operators-lp7dp\" (UID: \"77520001-a1f9-4018-bc4c-2964849da6c7\") " pod="openshift-marketplace/community-operators-lp7dp" Oct 07 08:19:02 crc kubenswrapper[5025]: I1007 08:19:02.958749 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77520001-a1f9-4018-bc4c-2964849da6c7-catalog-content\") pod \"community-operators-lp7dp\" (UID: \"77520001-a1f9-4018-bc4c-2964849da6c7\") " pod="openshift-marketplace/community-operators-lp7dp" Oct 07 08:19:02 crc kubenswrapper[5025]: I1007 08:19:02.958786 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2r94k\" 
(UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:19:02 crc kubenswrapper[5025]: I1007 08:19:02.958806 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77520001-a1f9-4018-bc4c-2964849da6c7-utilities\") pod \"community-operators-lp7dp\" (UID: \"77520001-a1f9-4018-bc4c-2964849da6c7\") " pod="openshift-marketplace/community-operators-lp7dp" Oct 07 08:19:02 crc kubenswrapper[5025]: E1007 08:19:02.959278 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 08:19:03.459256714 +0000 UTC m=+150.268570858 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2r94k" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.050045 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wx6rt"] Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.056070 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wx6rt" Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.059315 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:19:03 crc kubenswrapper[5025]: E1007 08:19:03.059463 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:19:03.559435992 +0000 UTC m=+150.368750136 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.059668 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjbf8\" (UniqueName: \"kubernetes.io/projected/77520001-a1f9-4018-bc4c-2964849da6c7-kube-api-access-qjbf8\") pod \"community-operators-lp7dp\" (UID: \"77520001-a1f9-4018-bc4c-2964849da6c7\") " pod="openshift-marketplace/community-operators-lp7dp" Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.059720 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77520001-a1f9-4018-bc4c-2964849da6c7-catalog-content\") pod 
\"community-operators-lp7dp\" (UID: \"77520001-a1f9-4018-bc4c-2964849da6c7\") " pod="openshift-marketplace/community-operators-lp7dp" Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.059760 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.059785 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77520001-a1f9-4018-bc4c-2964849da6c7-utilities\") pod \"community-operators-lp7dp\" (UID: \"77520001-a1f9-4018-bc4c-2964849da6c7\") " pod="openshift-marketplace/community-operators-lp7dp" Oct 07 08:19:03 crc kubenswrapper[5025]: E1007 08:19:03.060100 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 08:19:03.560083463 +0000 UTC m=+150.369397607 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2r94k" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.060211 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77520001-a1f9-4018-bc4c-2964849da6c7-catalog-content\") pod \"community-operators-lp7dp\" (UID: \"77520001-a1f9-4018-bc4c-2964849da6c7\") " pod="openshift-marketplace/community-operators-lp7dp" Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.060505 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77520001-a1f9-4018-bc4c-2964849da6c7-utilities\") pod \"community-operators-lp7dp\" (UID: \"77520001-a1f9-4018-bc4c-2964849da6c7\") " pod="openshift-marketplace/community-operators-lp7dp" Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.069168 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.082183 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wx6rt"] Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.142753 5025 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-qzlm7 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 
07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.142846 5025 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qzlm7" podUID="6a3782b5-7407-485b-a426-58413aba5747" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.34:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.145342 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjbf8\" (UniqueName: \"kubernetes.io/projected/77520001-a1f9-4018-bc4c-2964849da6c7-kube-api-access-qjbf8\") pod \"community-operators-lp7dp\" (UID: \"77520001-a1f9-4018-bc4c-2964849da6c7\") " pod="openshift-marketplace/community-operators-lp7dp" Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.164699 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.165174 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d30db05d-5f5e-4586-92de-f877c5a64600-catalog-content\") pod \"certified-operators-wx6rt\" (UID: \"d30db05d-5f5e-4586-92de-f877c5a64600\") " pod="openshift-marketplace/certified-operators-wx6rt" Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.165238 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fn2n7\" (UniqueName: \"kubernetes.io/projected/d30db05d-5f5e-4586-92de-f877c5a64600-kube-api-access-fn2n7\") pod \"certified-operators-wx6rt\" (UID: 
\"d30db05d-5f5e-4586-92de-f877c5a64600\") " pod="openshift-marketplace/certified-operators-wx6rt" Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.165307 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d30db05d-5f5e-4586-92de-f877c5a64600-utilities\") pod \"certified-operators-wx6rt\" (UID: \"d30db05d-5f5e-4586-92de-f877c5a64600\") " pod="openshift-marketplace/certified-operators-wx6rt" Oct 07 08:19:03 crc kubenswrapper[5025]: E1007 08:19:03.165463 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:19:03.665443558 +0000 UTC m=+150.474757712 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.217875 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"323e471e04cf689d2ae532d10e15dcf248f49adb97649cbc0286225c8be158e1"} Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.235021 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lp7dp" Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.246644 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jb77m" event={"ID":"784ef07a-3cbd-46f4-9ce5-90c1a6d75edc","Type":"ContainerStarted","Data":"96af7d64f8e4b4630e41bfa697c984d5556cad8ecb4a05a3c6d75c58495a7752"} Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.252374 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-z7c4f"] Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.253391 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-z7c4f" Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.267873 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fn2n7\" (UniqueName: \"kubernetes.io/projected/d30db05d-5f5e-4586-92de-f877c5a64600-kube-api-access-fn2n7\") pod \"certified-operators-wx6rt\" (UID: \"d30db05d-5f5e-4586-92de-f877c5a64600\") " pod="openshift-marketplace/certified-operators-wx6rt" Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.267926 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d30db05d-5f5e-4586-92de-f877c5a64600-utilities\") pod \"certified-operators-wx6rt\" (UID: \"d30db05d-5f5e-4586-92de-f877c5a64600\") " pod="openshift-marketplace/certified-operators-wx6rt" Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.267965 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.267990 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d30db05d-5f5e-4586-92de-f877c5a64600-catalog-content\") pod \"certified-operators-wx6rt\" (UID: \"d30db05d-5f5e-4586-92de-f877c5a64600\") " pod="openshift-marketplace/certified-operators-wx6rt" Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.268392 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d30db05d-5f5e-4586-92de-f877c5a64600-catalog-content\") pod \"certified-operators-wx6rt\" (UID: \"d30db05d-5f5e-4586-92de-f877c5a64600\") " pod="openshift-marketplace/certified-operators-wx6rt" Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.268831 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d30db05d-5f5e-4586-92de-f877c5a64600-utilities\") pod \"certified-operators-wx6rt\" (UID: \"d30db05d-5f5e-4586-92de-f877c5a64600\") " pod="openshift-marketplace/certified-operators-wx6rt" Oct 07 08:19:03 crc kubenswrapper[5025]: E1007 08:19:03.269048 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 08:19:03.769036206 +0000 UTC m=+150.578350350 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2r94k" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.282152 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-z7c4f"] Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.309411 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fn2n7\" (UniqueName: \"kubernetes.io/projected/d30db05d-5f5e-4586-92de-f877c5a64600-kube-api-access-fn2n7\") pod \"certified-operators-wx6rt\" (UID: \"d30db05d-5f5e-4586-92de-f877c5a64600\") " pod="openshift-marketplace/certified-operators-wx6rt" Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.368599 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.368869 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11ec0e09-2a87-483f-bdac-73b3f3995a8a-catalog-content\") pod \"community-operators-z7c4f\" (UID: \"11ec0e09-2a87-483f-bdac-73b3f3995a8a\") " pod="openshift-marketplace/community-operators-z7c4f" Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.369047 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-7nzdk\" (UniqueName: \"kubernetes.io/projected/11ec0e09-2a87-483f-bdac-73b3f3995a8a-kube-api-access-7nzdk\") pod \"community-operators-z7c4f\" (UID: \"11ec0e09-2a87-483f-bdac-73b3f3995a8a\") " pod="openshift-marketplace/community-operators-z7c4f" Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.369075 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11ec0e09-2a87-483f-bdac-73b3f3995a8a-utilities\") pod \"community-operators-z7c4f\" (UID: \"11ec0e09-2a87-483f-bdac-73b3f3995a8a\") " pod="openshift-marketplace/community-operators-z7c4f" Oct 07 08:19:03 crc kubenswrapper[5025]: E1007 08:19:03.370143 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:19:03.870120402 +0000 UTC m=+150.679434616 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.437315 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6kl5d"] Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.442397 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6kl5d" Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.471335 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nzdk\" (UniqueName: \"kubernetes.io/projected/11ec0e09-2a87-483f-bdac-73b3f3995a8a-kube-api-access-7nzdk\") pod \"community-operators-z7c4f\" (UID: \"11ec0e09-2a87-483f-bdac-73b3f3995a8a\") " pod="openshift-marketplace/community-operators-z7c4f" Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.471381 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11ec0e09-2a87-483f-bdac-73b3f3995a8a-utilities\") pod \"community-operators-z7c4f\" (UID: \"11ec0e09-2a87-483f-bdac-73b3f3995a8a\") " pod="openshift-marketplace/community-operators-z7c4f" Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.471416 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.471437 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11ec0e09-2a87-483f-bdac-73b3f3995a8a-catalog-content\") pod \"community-operators-z7c4f\" (UID: \"11ec0e09-2a87-483f-bdac-73b3f3995a8a\") " pod="openshift-marketplace/community-operators-z7c4f" Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.471872 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6kl5d"] Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.471884 5025 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11ec0e09-2a87-483f-bdac-73b3f3995a8a-catalog-content\") pod \"community-operators-z7c4f\" (UID: \"11ec0e09-2a87-483f-bdac-73b3f3995a8a\") " pod="openshift-marketplace/community-operators-z7c4f" Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.472245 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11ec0e09-2a87-483f-bdac-73b3f3995a8a-utilities\") pod \"community-operators-z7c4f\" (UID: \"11ec0e09-2a87-483f-bdac-73b3f3995a8a\") " pod="openshift-marketplace/community-operators-z7c4f" Oct 07 08:19:03 crc kubenswrapper[5025]: E1007 08:19:03.472297 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 08:19:03.972286865 +0000 UTC m=+150.781601009 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2r94k" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.472457 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wx6rt" Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.475871 5025 patch_prober.go:28] interesting pod/router-default-5444994796-pc6rv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 08:19:03 crc kubenswrapper[5025]: [-]has-synced failed: reason withheld Oct 07 08:19:03 crc kubenswrapper[5025]: [+]process-running ok Oct 07 08:19:03 crc kubenswrapper[5025]: healthz check failed Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.475906 5025 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pc6rv" podUID="16dbc3a0-2245-4de3-ab11-eb599a11f5fb" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.508614 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nzdk\" (UniqueName: \"kubernetes.io/projected/11ec0e09-2a87-483f-bdac-73b3f3995a8a-kube-api-access-7nzdk\") pod \"community-operators-z7c4f\" (UID: \"11ec0e09-2a87-483f-bdac-73b3f3995a8a\") " pod="openshift-marketplace/community-operators-z7c4f" Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.572408 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.572605 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/109518fe-ba3b-4b94-bbf3-64c30e50f253-catalog-content\") pod 
\"certified-operators-6kl5d\" (UID: \"109518fe-ba3b-4b94-bbf3-64c30e50f253\") " pod="openshift-marketplace/certified-operators-6kl5d" Oct 07 08:19:03 crc kubenswrapper[5025]: E1007 08:19:03.572684 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:19:04.072652428 +0000 UTC m=+150.881966572 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.572772 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.572828 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbr8b\" (UniqueName: \"kubernetes.io/projected/109518fe-ba3b-4b94-bbf3-64c30e50f253-kube-api-access-rbr8b\") pod \"certified-operators-6kl5d\" (UID: \"109518fe-ba3b-4b94-bbf3-64c30e50f253\") " pod="openshift-marketplace/certified-operators-6kl5d" Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.572852 5025 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/109518fe-ba3b-4b94-bbf3-64c30e50f253-utilities\") pod \"certified-operators-6kl5d\" (UID: \"109518fe-ba3b-4b94-bbf3-64c30e50f253\") " pod="openshift-marketplace/certified-operators-6kl5d" Oct 07 08:19:03 crc kubenswrapper[5025]: E1007 08:19:03.573272 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 08:19:04.073263768 +0000 UTC m=+150.882577912 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2r94k" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.582799 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-z7c4f" Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.635468 5025 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.673860 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.674101 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/109518fe-ba3b-4b94-bbf3-64c30e50f253-catalog-content\") pod \"certified-operators-6kl5d\" (UID: \"109518fe-ba3b-4b94-bbf3-64c30e50f253\") " pod="openshift-marketplace/certified-operators-6kl5d" Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.674145 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbr8b\" (UniqueName: \"kubernetes.io/projected/109518fe-ba3b-4b94-bbf3-64c30e50f253-kube-api-access-rbr8b\") pod \"certified-operators-6kl5d\" (UID: \"109518fe-ba3b-4b94-bbf3-64c30e50f253\") " pod="openshift-marketplace/certified-operators-6kl5d" Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.674163 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/109518fe-ba3b-4b94-bbf3-64c30e50f253-utilities\") pod \"certified-operators-6kl5d\" (UID: \"109518fe-ba3b-4b94-bbf3-64c30e50f253\") " pod="openshift-marketplace/certified-operators-6kl5d" Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.674612 5025 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/109518fe-ba3b-4b94-bbf3-64c30e50f253-utilities\") pod \"certified-operators-6kl5d\" (UID: \"109518fe-ba3b-4b94-bbf3-64c30e50f253\") " pod="openshift-marketplace/certified-operators-6kl5d" Oct 07 08:19:03 crc kubenswrapper[5025]: E1007 08:19:03.674680 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:19:04.174664165 +0000 UTC m=+150.983978309 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.674871 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/109518fe-ba3b-4b94-bbf3-64c30e50f253-catalog-content\") pod \"certified-operators-6kl5d\" (UID: \"109518fe-ba3b-4b94-bbf3-64c30e50f253\") " pod="openshift-marketplace/certified-operators-6kl5d" Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.717299 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbr8b\" (UniqueName: \"kubernetes.io/projected/109518fe-ba3b-4b94-bbf3-64c30e50f253-kube-api-access-rbr8b\") pod \"certified-operators-6kl5d\" (UID: \"109518fe-ba3b-4b94-bbf3-64c30e50f253\") " pod="openshift-marketplace/certified-operators-6kl5d" Oct 07 08:19:03 crc kubenswrapper[5025]: W1007 
08:19:03.756101 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-9ad336db0a2b9a1f9ed962ac629286edd47aab897dd529ca198277ca9069612d WatchSource:0}: Error finding container 9ad336db0a2b9a1f9ed962ac629286edd47aab897dd529ca198277ca9069612d: Status 404 returned error can't find the container with id 9ad336db0a2b9a1f9ed962ac629286edd47aab897dd529ca198277ca9069612d Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.777668 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:19:03 crc kubenswrapper[5025]: E1007 08:19:03.778073 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 08:19:04.278058066 +0000 UTC m=+151.087372210 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2r94k" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.794058 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6kl5d" Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.878371 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:19:03 crc kubenswrapper[5025]: E1007 08:19:03.878893 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:19:04.378876194 +0000 UTC m=+151.188190338 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:03 crc kubenswrapper[5025]: I1007 08:19:03.986664 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:19:03 crc kubenswrapper[5025]: E1007 08:19:03.987055 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2025-10-07 08:19:04.487038 +0000 UTC m=+151.296352154 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2r94k" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:04 crc kubenswrapper[5025]: I1007 08:19:04.059578 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lp7dp"] Oct 07 08:19:04 crc kubenswrapper[5025]: I1007 08:19:04.081655 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 07 08:19:04 crc kubenswrapper[5025]: I1007 08:19:04.082755 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 08:19:04 crc kubenswrapper[5025]: I1007 08:19:04.089329 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:19:04 crc kubenswrapper[5025]: E1007 08:19:04.089730 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 08:19:04.589712107 +0000 UTC m=+151.399026251 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 08:19:04 crc kubenswrapper[5025]: I1007 08:19:04.089864 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 07 08:19:04 crc kubenswrapper[5025]: I1007 08:19:04.089958 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 07 08:19:04 crc kubenswrapper[5025]: I1007 08:19:04.096796 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 07 08:19:04 crc kubenswrapper[5025]: I1007 08:19:04.110682 5025 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-07T08:19:03.635777612Z","Handler":null,"Name":""} Oct 07 08:19:04 crc kubenswrapper[5025]: I1007 08:19:04.142422 5025 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 07 08:19:04 crc kubenswrapper[5025]: I1007 08:19:04.142453 5025 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 07 08:19:04 crc kubenswrapper[5025]: I1007 08:19:04.219892 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330415-w9ctz" Oct 07 08:19:04 crc kubenswrapper[5025]: I1007 08:19:04.266498 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:19:04 crc kubenswrapper[5025]: I1007 08:19:04.266831 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/383d53df-cd53-4fe8-86f9-28cff5f20add-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"383d53df-cd53-4fe8-86f9-28cff5f20add\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 08:19:04 crc kubenswrapper[5025]: I1007 08:19:04.267165 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/383d53df-cd53-4fe8-86f9-28cff5f20add-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"383d53df-cd53-4fe8-86f9-28cff5f20add\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 08:19:04 crc kubenswrapper[5025]: I1007 08:19:04.283857 5025 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 07 08:19:04 crc kubenswrapper[5025]: I1007 08:19:04.297626 5025 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:19:04 crc kubenswrapper[5025]: I1007 08:19:04.298478 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"9ad336db0a2b9a1f9ed962ac629286edd47aab897dd529ca198277ca9069612d"} Oct 07 08:19:04 crc kubenswrapper[5025]: I1007 08:19:04.307991 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-z7c4f"] Oct 07 08:19:04 crc kubenswrapper[5025]: I1007 08:19:04.398036 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mw5xc\" (UniqueName: \"kubernetes.io/projected/385983c4-874d-4095-980c-c2d3763ce8e1-kube-api-access-mw5xc\") pod \"385983c4-874d-4095-980c-c2d3763ce8e1\" (UID: \"385983c4-874d-4095-980c-c2d3763ce8e1\") " Oct 07 08:19:04 crc kubenswrapper[5025]: I1007 08:19:04.398667 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/385983c4-874d-4095-980c-c2d3763ce8e1-config-volume\") pod \"385983c4-874d-4095-980c-c2d3763ce8e1\" (UID: \"385983c4-874d-4095-980c-c2d3763ce8e1\") " Oct 07 08:19:04 crc kubenswrapper[5025]: I1007 08:19:04.398748 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/385983c4-874d-4095-980c-c2d3763ce8e1-secret-volume\") pod \"385983c4-874d-4095-980c-c2d3763ce8e1\" (UID: \"385983c4-874d-4095-980c-c2d3763ce8e1\") " Oct 07 08:19:04 crc kubenswrapper[5025]: I1007 08:19:04.398980 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/383d53df-cd53-4fe8-86f9-28cff5f20add-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"383d53df-cd53-4fe8-86f9-28cff5f20add\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 08:19:04 crc kubenswrapper[5025]: I1007 08:19:04.399070 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/383d53df-cd53-4fe8-86f9-28cff5f20add-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"383d53df-cd53-4fe8-86f9-28cff5f20add\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 08:19:04 crc kubenswrapper[5025]: I1007 08:19:04.400295 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/385983c4-874d-4095-980c-c2d3763ce8e1-config-volume" (OuterVolumeSpecName: "config-volume") pod "385983c4-874d-4095-980c-c2d3763ce8e1" (UID: "385983c4-874d-4095-980c-c2d3763ce8e1"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:19:04 crc kubenswrapper[5025]: I1007 08:19:04.400720 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/383d53df-cd53-4fe8-86f9-28cff5f20add-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"383d53df-cd53-4fe8-86f9-28cff5f20add\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 08:19:04 crc kubenswrapper[5025]: I1007 08:19:04.413188 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jb77m" event={"ID":"784ef07a-3cbd-46f4-9ce5-90c1a6d75edc","Type":"ContainerStarted","Data":"97433a7a685cb400aa73eb5502127598cd4aa3f41ad716696f57a4ad728360c3"} Oct 07 08:19:04 crc kubenswrapper[5025]: I1007 08:19:04.413236 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jb77m" event={"ID":"784ef07a-3cbd-46f4-9ce5-90c1a6d75edc","Type":"ContainerStarted","Data":"0e8a89011f1e465b4af7067183b97fc78201b705f4cc36f64d5cd6935794fe10"} Oct 07 08:19:04 crc kubenswrapper[5025]: I1007 08:19:04.418847 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2r94k\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:19:04 crc kubenswrapper[5025]: I1007 08:19:04.426074 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wx6rt"] Oct 07 08:19:04 crc kubenswrapper[5025]: I1007 08:19:04.450759 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"4b4e0261cd0c9685e564867a143a0ce4bce3bb70a1ad402a7a9f4e7efc5cd7ad"} Oct 07 08:19:04 crc kubenswrapper[5025]: I1007 08:19:04.450806 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"d04bf390ec887bb133205971edf3353cdc0e6cd2c1859e17134d2197e5c93066"} Oct 07 08:19:04 crc kubenswrapper[5025]: I1007 08:19:04.467353 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/383d53df-cd53-4fe8-86f9-28cff5f20add-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"383d53df-cd53-4fe8-86f9-28cff5f20add\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 08:19:04 crc kubenswrapper[5025]: I1007 08:19:04.471818 5025 patch_prober.go:28] interesting pod/router-default-5444994796-pc6rv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 08:19:04 crc kubenswrapper[5025]: [-]has-synced failed: reason withheld Oct 07 08:19:04 crc kubenswrapper[5025]: [+]process-running ok Oct 07 08:19:04 crc kubenswrapper[5025]: healthz check failed Oct 07 08:19:04 crc kubenswrapper[5025]: I1007 08:19:04.472513 5025 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pc6rv" podUID="16dbc3a0-2245-4de3-ab11-eb599a11f5fb" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 08:19:04 crc kubenswrapper[5025]: I1007 08:19:04.473099 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330415-w9ctz" Oct 07 08:19:04 crc kubenswrapper[5025]: I1007 08:19:04.473102 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330415-w9ctz" event={"ID":"385983c4-874d-4095-980c-c2d3763ce8e1","Type":"ContainerDied","Data":"25f0c35513ee88471c8e3ce7e7411ad3aed2011f8c28b430e970c411fbe8aa05"} Oct 07 08:19:04 crc kubenswrapper[5025]: I1007 08:19:04.473186 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25f0c35513ee88471c8e3ce7e7411ad3aed2011f8c28b430e970c411fbe8aa05" Oct 07 08:19:04 crc kubenswrapper[5025]: I1007 08:19:04.474089 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/385983c4-874d-4095-980c-c2d3763ce8e1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "385983c4-874d-4095-980c-c2d3763ce8e1" (UID: "385983c4-874d-4095-980c-c2d3763ce8e1"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:19:04 crc kubenswrapper[5025]: I1007 08:19:04.476233 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/385983c4-874d-4095-980c-c2d3763ce8e1-kube-api-access-mw5xc" (OuterVolumeSpecName: "kube-api-access-mw5xc") pod "385983c4-874d-4095-980c-c2d3763ce8e1" (UID: "385983c4-874d-4095-980c-c2d3763ce8e1"). InnerVolumeSpecName "kube-api-access-mw5xc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:19:04 crc kubenswrapper[5025]: I1007 08:19:04.476556 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lp7dp" event={"ID":"77520001-a1f9-4018-bc4c-2964849da6c7","Type":"ContainerStarted","Data":"398072bc39083a99c2c33079697d0ac157e44ac072c3c8c178f0d27f714341ef"} Oct 07 08:19:04 crc kubenswrapper[5025]: I1007 08:19:04.485519 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-jb77m" podStartSLOduration=11.485485719 podStartE2EDuration="11.485485719s" podCreationTimestamp="2025-10-07 08:18:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:19:04.474124833 +0000 UTC m=+151.283438977" watchObservedRunningTime="2025-10-07 08:19:04.485485719 +0000 UTC m=+151.294799863" Oct 07 08:19:04 crc kubenswrapper[5025]: I1007 08:19:04.496313 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"791c16ca5f7d617aefb45287f2c929637548b3a801deb444e667b1e412d0fc3c"} Oct 07 08:19:04 crc kubenswrapper[5025]: I1007 08:19:04.496984 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:19:04 crc kubenswrapper[5025]: I1007 08:19:04.502809 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 08:19:04 crc kubenswrapper[5025]: I1007 08:19:04.503277 5025 reconciler_common.go:293] "Volume detached for 
volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/385983c4-874d-4095-980c-c2d3763ce8e1-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 07 08:19:04 crc kubenswrapper[5025]: I1007 08:19:04.503374 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mw5xc\" (UniqueName: \"kubernetes.io/projected/385983c4-874d-4095-980c-c2d3763ce8e1-kube-api-access-mw5xc\") on node \"crc\" DevicePath \"\"" Oct 07 08:19:04 crc kubenswrapper[5025]: I1007 08:19:04.503460 5025 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/385983c4-874d-4095-980c-c2d3763ce8e1-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 08:19:04 crc kubenswrapper[5025]: I1007 08:19:04.508660 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:19:04 crc kubenswrapper[5025]: I1007 08:19:04.580751 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 08:19:04 crc kubenswrapper[5025]: I1007 08:19:04.636219 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6kl5d"] Oct 07 08:19:04 crc kubenswrapper[5025]: I1007 08:19:04.674107 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 07 08:19:04 crc kubenswrapper[5025]: I1007 08:19:04.950461 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2r94k"] Oct 07 08:19:04 crc kubenswrapper[5025]: W1007 08:19:04.981015 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd11df844_49f0_4c9a_9de5_701e57c69685.slice/crio-7943a811149f7a376febf6be333a7a353440cf97170741c4139204be5e89a6b5 WatchSource:0}: Error finding container 7943a811149f7a376febf6be333a7a353440cf97170741c4139204be5e89a6b5: Status 404 returned error can't find the container with id 7943a811149f7a376febf6be333a7a353440cf97170741c4139204be5e89a6b5 Oct 07 08:19:05 crc kubenswrapper[5025]: I1007 08:19:05.026750 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-szdzh"] Oct 07 08:19:05 crc kubenswrapper[5025]: E1007 08:19:05.026967 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="385983c4-874d-4095-980c-c2d3763ce8e1" containerName="collect-profiles" Oct 07 08:19:05 crc kubenswrapper[5025]: I1007 08:19:05.026977 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="385983c4-874d-4095-980c-c2d3763ce8e1" containerName="collect-profiles" Oct 07 08:19:05 crc kubenswrapper[5025]: I1007 08:19:05.027077 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="385983c4-874d-4095-980c-c2d3763ce8e1" containerName="collect-profiles" Oct 07 08:19:05 crc kubenswrapper[5025]: I1007 08:19:05.027800 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-szdzh" Oct 07 08:19:05 crc kubenswrapper[5025]: I1007 08:19:05.036937 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 07 08:19:05 crc kubenswrapper[5025]: I1007 08:19:05.055687 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-szdzh"] Oct 07 08:19:05 crc kubenswrapper[5025]: I1007 08:19:05.114385 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79dcab45-e91e-409e-97a7-9cbe6c4e3fe7-catalog-content\") pod \"redhat-marketplace-szdzh\" (UID: \"79dcab45-e91e-409e-97a7-9cbe6c4e3fe7\") " pod="openshift-marketplace/redhat-marketplace-szdzh" Oct 07 08:19:05 crc kubenswrapper[5025]: I1007 08:19:05.114426 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7d2lk\" (UniqueName: \"kubernetes.io/projected/79dcab45-e91e-409e-97a7-9cbe6c4e3fe7-kube-api-access-7d2lk\") pod \"redhat-marketplace-szdzh\" (UID: \"79dcab45-e91e-409e-97a7-9cbe6c4e3fe7\") " pod="openshift-marketplace/redhat-marketplace-szdzh" Oct 07 08:19:05 crc kubenswrapper[5025]: I1007 08:19:05.114567 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79dcab45-e91e-409e-97a7-9cbe6c4e3fe7-utilities\") pod \"redhat-marketplace-szdzh\" (UID: \"79dcab45-e91e-409e-97a7-9cbe6c4e3fe7\") " pod="openshift-marketplace/redhat-marketplace-szdzh" Oct 07 08:19:05 crc kubenswrapper[5025]: I1007 08:19:05.218957 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7d2lk\" (UniqueName: \"kubernetes.io/projected/79dcab45-e91e-409e-97a7-9cbe6c4e3fe7-kube-api-access-7d2lk\") pod \"redhat-marketplace-szdzh\" (UID: 
\"79dcab45-e91e-409e-97a7-9cbe6c4e3fe7\") " pod="openshift-marketplace/redhat-marketplace-szdzh" Oct 07 08:19:05 crc kubenswrapper[5025]: I1007 08:19:05.219037 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79dcab45-e91e-409e-97a7-9cbe6c4e3fe7-catalog-content\") pod \"redhat-marketplace-szdzh\" (UID: \"79dcab45-e91e-409e-97a7-9cbe6c4e3fe7\") " pod="openshift-marketplace/redhat-marketplace-szdzh" Oct 07 08:19:05 crc kubenswrapper[5025]: I1007 08:19:05.219152 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79dcab45-e91e-409e-97a7-9cbe6c4e3fe7-utilities\") pod \"redhat-marketplace-szdzh\" (UID: \"79dcab45-e91e-409e-97a7-9cbe6c4e3fe7\") " pod="openshift-marketplace/redhat-marketplace-szdzh" Oct 07 08:19:05 crc kubenswrapper[5025]: I1007 08:19:05.219821 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79dcab45-e91e-409e-97a7-9cbe6c4e3fe7-utilities\") pod \"redhat-marketplace-szdzh\" (UID: \"79dcab45-e91e-409e-97a7-9cbe6c4e3fe7\") " pod="openshift-marketplace/redhat-marketplace-szdzh" Oct 07 08:19:05 crc kubenswrapper[5025]: I1007 08:19:05.220444 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79dcab45-e91e-409e-97a7-9cbe6c4e3fe7-catalog-content\") pod \"redhat-marketplace-szdzh\" (UID: \"79dcab45-e91e-409e-97a7-9cbe6c4e3fe7\") " pod="openshift-marketplace/redhat-marketplace-szdzh" Oct 07 08:19:05 crc kubenswrapper[5025]: I1007 08:19:05.228840 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 07 08:19:05 crc kubenswrapper[5025]: I1007 08:19:05.247262 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7d2lk\" (UniqueName: 
\"kubernetes.io/projected/79dcab45-e91e-409e-97a7-9cbe6c4e3fe7-kube-api-access-7d2lk\") pod \"redhat-marketplace-szdzh\" (UID: \"79dcab45-e91e-409e-97a7-9cbe6c4e3fe7\") " pod="openshift-marketplace/redhat-marketplace-szdzh" Oct 07 08:19:05 crc kubenswrapper[5025]: I1007 08:19:05.369237 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-szdzh" Oct 07 08:19:05 crc kubenswrapper[5025]: I1007 08:19:05.418981 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-psjml"] Oct 07 08:19:05 crc kubenswrapper[5025]: I1007 08:19:05.420154 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-psjml" Oct 07 08:19:05 crc kubenswrapper[5025]: I1007 08:19:05.432136 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-psjml"] Oct 07 08:19:05 crc kubenswrapper[5025]: I1007 08:19:05.468742 5025 patch_prober.go:28] interesting pod/router-default-5444994796-pc6rv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 08:19:05 crc kubenswrapper[5025]: [-]has-synced failed: reason withheld Oct 07 08:19:05 crc kubenswrapper[5025]: [+]process-running ok Oct 07 08:19:05 crc kubenswrapper[5025]: healthz check failed Oct 07 08:19:05 crc kubenswrapper[5025]: I1007 08:19:05.468794 5025 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pc6rv" podUID="16dbc3a0-2245-4de3-ab11-eb599a11f5fb" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 08:19:05 crc kubenswrapper[5025]: I1007 08:19:05.515457 5025 generic.go:334] "Generic (PLEG): container finished" podID="11ec0e09-2a87-483f-bdac-73b3f3995a8a" 
containerID="10a5787ad42dff5c562ad72d022aea12a8033684ea8e2ae9a80ffda542069aaa" exitCode=0 Oct 07 08:19:05 crc kubenswrapper[5025]: I1007 08:19:05.515585 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z7c4f" event={"ID":"11ec0e09-2a87-483f-bdac-73b3f3995a8a","Type":"ContainerDied","Data":"10a5787ad42dff5c562ad72d022aea12a8033684ea8e2ae9a80ffda542069aaa"} Oct 07 08:19:05 crc kubenswrapper[5025]: I1007 08:19:05.515614 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z7c4f" event={"ID":"11ec0e09-2a87-483f-bdac-73b3f3995a8a","Type":"ContainerStarted","Data":"466136cc9952b6baeecc7a17a0698c69f6afc681d69c493a56b8c6d29380ce5c"} Oct 07 08:19:05 crc kubenswrapper[5025]: I1007 08:19:05.518366 5025 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 08:19:05 crc kubenswrapper[5025]: I1007 08:19:05.522320 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sz9z4\" (UniqueName: \"kubernetes.io/projected/fc8dbf8c-119b-4723-9479-ddad0026dcbd-kube-api-access-sz9z4\") pod \"redhat-marketplace-psjml\" (UID: \"fc8dbf8c-119b-4723-9479-ddad0026dcbd\") " pod="openshift-marketplace/redhat-marketplace-psjml" Oct 07 08:19:05 crc kubenswrapper[5025]: I1007 08:19:05.522434 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc8dbf8c-119b-4723-9479-ddad0026dcbd-utilities\") pod \"redhat-marketplace-psjml\" (UID: \"fc8dbf8c-119b-4723-9479-ddad0026dcbd\") " pod="openshift-marketplace/redhat-marketplace-psjml" Oct 07 08:19:05 crc kubenswrapper[5025]: I1007 08:19:05.522483 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/fc8dbf8c-119b-4723-9479-ddad0026dcbd-catalog-content\") pod \"redhat-marketplace-psjml\" (UID: \"fc8dbf8c-119b-4723-9479-ddad0026dcbd\") " pod="openshift-marketplace/redhat-marketplace-psjml" Oct 07 08:19:05 crc kubenswrapper[5025]: I1007 08:19:05.528448 5025 generic.go:334] "Generic (PLEG): container finished" podID="d30db05d-5f5e-4586-92de-f877c5a64600" containerID="4d447b20ce9ff07e37952ecb03ecf52cda11e2da443d5764543a10055d5a4ca1" exitCode=0 Oct 07 08:19:05 crc kubenswrapper[5025]: I1007 08:19:05.528519 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wx6rt" event={"ID":"d30db05d-5f5e-4586-92de-f877c5a64600","Type":"ContainerDied","Data":"4d447b20ce9ff07e37952ecb03ecf52cda11e2da443d5764543a10055d5a4ca1"} Oct 07 08:19:05 crc kubenswrapper[5025]: I1007 08:19:05.528619 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wx6rt" event={"ID":"d30db05d-5f5e-4586-92de-f877c5a64600","Type":"ContainerStarted","Data":"16c67fff3aca66eb11c72f69a15c5412a76863c60d399a89304ed2504f02a948"} Oct 07 08:19:05 crc kubenswrapper[5025]: I1007 08:19:05.535949 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"383d53df-cd53-4fe8-86f9-28cff5f20add","Type":"ContainerStarted","Data":"43fa842fc10b1683c62d582c2595949ee344a5f97e35031dce60d80c4733dbd1"} Oct 07 08:19:05 crc kubenswrapper[5025]: I1007 08:19:05.547064 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" event={"ID":"d11df844-49f0-4c9a-9de5-701e57c69685","Type":"ContainerStarted","Data":"ab6e635ad109e876d99329499bb911c8278b3a342759918efaf56a5c7ed56c46"} Oct 07 08:19:05 crc kubenswrapper[5025]: I1007 08:19:05.547117 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" 
event={"ID":"d11df844-49f0-4c9a-9de5-701e57c69685","Type":"ContainerStarted","Data":"7943a811149f7a376febf6be333a7a353440cf97170741c4139204be5e89a6b5"} Oct 07 08:19:05 crc kubenswrapper[5025]: I1007 08:19:05.547265 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:19:05 crc kubenswrapper[5025]: I1007 08:19:05.551109 5025 generic.go:334] "Generic (PLEG): container finished" podID="77520001-a1f9-4018-bc4c-2964849da6c7" containerID="c507095dfe58b2e1197766ac61495d7d31346c9967c184017f68289596542ec0" exitCode=0 Oct 07 08:19:05 crc kubenswrapper[5025]: I1007 08:19:05.551194 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lp7dp" event={"ID":"77520001-a1f9-4018-bc4c-2964849da6c7","Type":"ContainerDied","Data":"c507095dfe58b2e1197766ac61495d7d31346c9967c184017f68289596542ec0"} Oct 07 08:19:05 crc kubenswrapper[5025]: I1007 08:19:05.574567 5025 generic.go:334] "Generic (PLEG): container finished" podID="109518fe-ba3b-4b94-bbf3-64c30e50f253" containerID="269f00e7d2c23063af3bb5f6e847224e07d1743bf586b055a4885b8700433d2c" exitCode=0 Oct 07 08:19:05 crc kubenswrapper[5025]: I1007 08:19:05.574976 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6kl5d" event={"ID":"109518fe-ba3b-4b94-bbf3-64c30e50f253","Type":"ContainerDied","Data":"269f00e7d2c23063af3bb5f6e847224e07d1743bf586b055a4885b8700433d2c"} Oct 07 08:19:05 crc kubenswrapper[5025]: I1007 08:19:05.575133 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6kl5d" event={"ID":"109518fe-ba3b-4b94-bbf3-64c30e50f253","Type":"ContainerStarted","Data":"84e5a0982ed17c9c632b6dd00a40ba2e327496337e61f4d60c877852a1721edb"} Oct 07 08:19:05 crc kubenswrapper[5025]: I1007 08:19:05.594975 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"e7c47d8e8d01c740cd3e1a9dd3c4dd29d8f1d441cc3a4c6aa43e48865941fc16"} Oct 07 08:19:05 crc kubenswrapper[5025]: I1007 08:19:05.610006 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" podStartSLOduration=131.60998439 podStartE2EDuration="2m11.60998439s" podCreationTimestamp="2025-10-07 08:16:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:19:05.585040476 +0000 UTC m=+152.394354620" watchObservedRunningTime="2025-10-07 08:19:05.60998439 +0000 UTC m=+152.419298554" Oct 07 08:19:05 crc kubenswrapper[5025]: I1007 08:19:05.611232 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-2z5w6" Oct 07 08:19:05 crc kubenswrapper[5025]: I1007 08:19:05.618843 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-2z5w6" Oct 07 08:19:05 crc kubenswrapper[5025]: I1007 08:19:05.624385 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc8dbf8c-119b-4723-9479-ddad0026dcbd-utilities\") pod \"redhat-marketplace-psjml\" (UID: \"fc8dbf8c-119b-4723-9479-ddad0026dcbd\") " pod="openshift-marketplace/redhat-marketplace-psjml" Oct 07 08:19:05 crc kubenswrapper[5025]: I1007 08:19:05.624468 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc8dbf8c-119b-4723-9479-ddad0026dcbd-catalog-content\") pod \"redhat-marketplace-psjml\" (UID: \"fc8dbf8c-119b-4723-9479-ddad0026dcbd\") " pod="openshift-marketplace/redhat-marketplace-psjml" Oct 07 08:19:05 crc 
kubenswrapper[5025]: I1007 08:19:05.625317 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sz9z4\" (UniqueName: \"kubernetes.io/projected/fc8dbf8c-119b-4723-9479-ddad0026dcbd-kube-api-access-sz9z4\") pod \"redhat-marketplace-psjml\" (UID: \"fc8dbf8c-119b-4723-9479-ddad0026dcbd\") " pod="openshift-marketplace/redhat-marketplace-psjml" Oct 07 08:19:05 crc kubenswrapper[5025]: I1007 08:19:05.626571 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc8dbf8c-119b-4723-9479-ddad0026dcbd-utilities\") pod \"redhat-marketplace-psjml\" (UID: \"fc8dbf8c-119b-4723-9479-ddad0026dcbd\") " pod="openshift-marketplace/redhat-marketplace-psjml" Oct 07 08:19:05 crc kubenswrapper[5025]: I1007 08:19:05.627328 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc8dbf8c-119b-4723-9479-ddad0026dcbd-catalog-content\") pod \"redhat-marketplace-psjml\" (UID: \"fc8dbf8c-119b-4723-9479-ddad0026dcbd\") " pod="openshift-marketplace/redhat-marketplace-psjml" Oct 07 08:19:05 crc kubenswrapper[5025]: I1007 08:19:05.657994 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sz9z4\" (UniqueName: \"kubernetes.io/projected/fc8dbf8c-119b-4723-9479-ddad0026dcbd-kube-api-access-sz9z4\") pod \"redhat-marketplace-psjml\" (UID: \"fc8dbf8c-119b-4723-9479-ddad0026dcbd\") " pod="openshift-marketplace/redhat-marketplace-psjml" Oct 07 08:19:05 crc kubenswrapper[5025]: I1007 08:19:05.680714 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-szdzh"] Oct 07 08:19:05 crc kubenswrapper[5025]: I1007 08:19:05.707018 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lsrzs" Oct 07 08:19:05 crc kubenswrapper[5025]: I1007 08:19:05.764166 5025 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-psjml" Oct 07 08:19:05 crc kubenswrapper[5025]: I1007 08:19:05.927087 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 07 08:19:06 crc kubenswrapper[5025]: I1007 08:19:06.019475 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dmcm5"] Oct 07 08:19:06 crc kubenswrapper[5025]: I1007 08:19:06.020778 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dmcm5" Oct 07 08:19:06 crc kubenswrapper[5025]: I1007 08:19:06.023173 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 07 08:19:06 crc kubenswrapper[5025]: I1007 08:19:06.034933 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dmcm5"] Oct 07 08:19:06 crc kubenswrapper[5025]: I1007 08:19:06.143182 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhd5d\" (UniqueName: \"kubernetes.io/projected/5e0afdb0-5693-4263-8c52-9cccc4878003-kube-api-access-lhd5d\") pod \"redhat-operators-dmcm5\" (UID: \"5e0afdb0-5693-4263-8c52-9cccc4878003\") " pod="openshift-marketplace/redhat-operators-dmcm5" Oct 07 08:19:06 crc kubenswrapper[5025]: I1007 08:19:06.143254 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e0afdb0-5693-4263-8c52-9cccc4878003-catalog-content\") pod \"redhat-operators-dmcm5\" (UID: \"5e0afdb0-5693-4263-8c52-9cccc4878003\") " pod="openshift-marketplace/redhat-operators-dmcm5" Oct 07 08:19:06 crc kubenswrapper[5025]: I1007 08:19:06.143282 5025 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e0afdb0-5693-4263-8c52-9cccc4878003-utilities\") pod \"redhat-operators-dmcm5\" (UID: \"5e0afdb0-5693-4263-8c52-9cccc4878003\") " pod="openshift-marketplace/redhat-operators-dmcm5" Oct 07 08:19:06 crc kubenswrapper[5025]: I1007 08:19:06.244950 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhd5d\" (UniqueName: \"kubernetes.io/projected/5e0afdb0-5693-4263-8c52-9cccc4878003-kube-api-access-lhd5d\") pod \"redhat-operators-dmcm5\" (UID: \"5e0afdb0-5693-4263-8c52-9cccc4878003\") " pod="openshift-marketplace/redhat-operators-dmcm5" Oct 07 08:19:06 crc kubenswrapper[5025]: I1007 08:19:06.245007 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e0afdb0-5693-4263-8c52-9cccc4878003-catalog-content\") pod \"redhat-operators-dmcm5\" (UID: \"5e0afdb0-5693-4263-8c52-9cccc4878003\") " pod="openshift-marketplace/redhat-operators-dmcm5" Oct 07 08:19:06 crc kubenswrapper[5025]: I1007 08:19:06.245039 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e0afdb0-5693-4263-8c52-9cccc4878003-utilities\") pod \"redhat-operators-dmcm5\" (UID: \"5e0afdb0-5693-4263-8c52-9cccc4878003\") " pod="openshift-marketplace/redhat-operators-dmcm5" Oct 07 08:19:06 crc kubenswrapper[5025]: I1007 08:19:06.246077 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e0afdb0-5693-4263-8c52-9cccc4878003-utilities\") pod \"redhat-operators-dmcm5\" (UID: \"5e0afdb0-5693-4263-8c52-9cccc4878003\") " pod="openshift-marketplace/redhat-operators-dmcm5" Oct 07 08:19:06 crc kubenswrapper[5025]: I1007 08:19:06.246394 5025 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e0afdb0-5693-4263-8c52-9cccc4878003-catalog-content\") pod \"redhat-operators-dmcm5\" (UID: \"5e0afdb0-5693-4263-8c52-9cccc4878003\") " pod="openshift-marketplace/redhat-operators-dmcm5" Oct 07 08:19:06 crc kubenswrapper[5025]: I1007 08:19:06.265889 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhd5d\" (UniqueName: \"kubernetes.io/projected/5e0afdb0-5693-4263-8c52-9cccc4878003-kube-api-access-lhd5d\") pod \"redhat-operators-dmcm5\" (UID: \"5e0afdb0-5693-4263-8c52-9cccc4878003\") " pod="openshift-marketplace/redhat-operators-dmcm5" Oct 07 08:19:06 crc kubenswrapper[5025]: I1007 08:19:06.301488 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-psjml"] Oct 07 08:19:06 crc kubenswrapper[5025]: W1007 08:19:06.324889 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc8dbf8c_119b_4723_9479_ddad0026dcbd.slice/crio-d5eb2ba28e7fcb190974ab9d41bd389cf9dd1805fec9bfd4881d5f139f974531 WatchSource:0}: Error finding container d5eb2ba28e7fcb190974ab9d41bd389cf9dd1805fec9bfd4881d5f139f974531: Status 404 returned error can't find the container with id d5eb2ba28e7fcb190974ab9d41bd389cf9dd1805fec9bfd4881d5f139f974531 Oct 07 08:19:06 crc kubenswrapper[5025]: I1007 08:19:06.363928 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dmcm5" Oct 07 08:19:06 crc kubenswrapper[5025]: I1007 08:19:06.404170 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-hmlpq" Oct 07 08:19:06 crc kubenswrapper[5025]: I1007 08:19:06.404214 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-hmlpq" Oct 07 08:19:06 crc kubenswrapper[5025]: I1007 08:19:06.406639 5025 patch_prober.go:28] interesting pod/console-f9d7485db-hmlpq container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.25:8443/health\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Oct 07 08:19:06 crc kubenswrapper[5025]: I1007 08:19:06.406706 5025 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-hmlpq" podUID="1e02ba47-76b7-4363-8ade-a9f8c42db920" containerName="console" probeResult="failure" output="Get \"https://10.217.0.25:8443/health\": dial tcp 10.217.0.25:8443: connect: connection refused" Oct 07 08:19:06 crc kubenswrapper[5025]: I1007 08:19:06.434218 5025 patch_prober.go:28] interesting pod/downloads-7954f5f757-hd4xd container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Oct 07 08:19:06 crc kubenswrapper[5025]: I1007 08:19:06.434307 5025 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hd4xd" podUID="a31617bb-030b-48d7-b312-e7af9b052143" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Oct 07 08:19:06 crc kubenswrapper[5025]: I1007 08:19:06.435852 5025 patch_prober.go:28] interesting pod/downloads-7954f5f757-hd4xd container/download-server 
namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Oct 07 08:19:06 crc kubenswrapper[5025]: I1007 08:19:06.435890 5025 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-hd4xd" podUID="a31617bb-030b-48d7-b312-e7af9b052143" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Oct 07 08:19:06 crc kubenswrapper[5025]: I1007 08:19:06.461865 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rjn8v"] Oct 07 08:19:06 crc kubenswrapper[5025]: I1007 08:19:06.464715 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-pc6rv" Oct 07 08:19:06 crc kubenswrapper[5025]: I1007 08:19:06.465759 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rjn8v" Oct 07 08:19:06 crc kubenswrapper[5025]: I1007 08:19:06.467202 5025 patch_prober.go:28] interesting pod/router-default-5444994796-pc6rv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 08:19:06 crc kubenswrapper[5025]: [-]has-synced failed: reason withheld Oct 07 08:19:06 crc kubenswrapper[5025]: [+]process-running ok Oct 07 08:19:06 crc kubenswrapper[5025]: healthz check failed Oct 07 08:19:06 crc kubenswrapper[5025]: I1007 08:19:06.467282 5025 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pc6rv" podUID="16dbc3a0-2245-4de3-ab11-eb599a11f5fb" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 08:19:06 crc kubenswrapper[5025]: I1007 08:19:06.502480 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rjn8v"] Oct 07 08:19:06 crc kubenswrapper[5025]: I1007 08:19:06.552370 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab-catalog-content\") pod \"redhat-operators-rjn8v\" (UID: \"fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab\") " pod="openshift-marketplace/redhat-operators-rjn8v" Oct 07 08:19:06 crc kubenswrapper[5025]: I1007 08:19:06.552489 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pd2h6\" (UniqueName: \"kubernetes.io/projected/fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab-kube-api-access-pd2h6\") pod \"redhat-operators-rjn8v\" (UID: \"fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab\") " pod="openshift-marketplace/redhat-operators-rjn8v" Oct 07 08:19:06 crc kubenswrapper[5025]: I1007 08:19:06.552557 5025 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab-utilities\") pod \"redhat-operators-rjn8v\" (UID: \"fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab\") " pod="openshift-marketplace/redhat-operators-rjn8v" Oct 07 08:19:06 crc kubenswrapper[5025]: I1007 08:19:06.646313 5025 generic.go:334] "Generic (PLEG): container finished" podID="383d53df-cd53-4fe8-86f9-28cff5f20add" containerID="80fb5330b7e6133afa9d1052481253c3e05121bfcd2bf3ab72e0060d500d2f3c" exitCode=0 Oct 07 08:19:06 crc kubenswrapper[5025]: I1007 08:19:06.646471 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"383d53df-cd53-4fe8-86f9-28cff5f20add","Type":"ContainerDied","Data":"80fb5330b7e6133afa9d1052481253c3e05121bfcd2bf3ab72e0060d500d2f3c"} Oct 07 08:19:06 crc kubenswrapper[5025]: I1007 08:19:06.654828 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab-catalog-content\") pod \"redhat-operators-rjn8v\" (UID: \"fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab\") " pod="openshift-marketplace/redhat-operators-rjn8v" Oct 07 08:19:06 crc kubenswrapper[5025]: I1007 08:19:06.654906 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pd2h6\" (UniqueName: \"kubernetes.io/projected/fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab-kube-api-access-pd2h6\") pod \"redhat-operators-rjn8v\" (UID: \"fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab\") " pod="openshift-marketplace/redhat-operators-rjn8v" Oct 07 08:19:06 crc kubenswrapper[5025]: I1007 08:19:06.654939 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab-utilities\") pod \"redhat-operators-rjn8v\" (UID: 
\"fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab\") " pod="openshift-marketplace/redhat-operators-rjn8v" Oct 07 08:19:06 crc kubenswrapper[5025]: I1007 08:19:06.655472 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab-utilities\") pod \"redhat-operators-rjn8v\" (UID: \"fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab\") " pod="openshift-marketplace/redhat-operators-rjn8v" Oct 07 08:19:06 crc kubenswrapper[5025]: I1007 08:19:06.655756 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab-catalog-content\") pod \"redhat-operators-rjn8v\" (UID: \"fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab\") " pod="openshift-marketplace/redhat-operators-rjn8v" Oct 07 08:19:06 crc kubenswrapper[5025]: I1007 08:19:06.659014 5025 generic.go:334] "Generic (PLEG): container finished" podID="79dcab45-e91e-409e-97a7-9cbe6c4e3fe7" containerID="8cd5a9106e9e52512d80169f11698ae61a4bf91d5c465581cad51435ea79f5f0" exitCode=0 Oct 07 08:19:06 crc kubenswrapper[5025]: I1007 08:19:06.659702 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-szdzh" event={"ID":"79dcab45-e91e-409e-97a7-9cbe6c4e3fe7","Type":"ContainerDied","Data":"8cd5a9106e9e52512d80169f11698ae61a4bf91d5c465581cad51435ea79f5f0"} Oct 07 08:19:06 crc kubenswrapper[5025]: I1007 08:19:06.659757 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-szdzh" event={"ID":"79dcab45-e91e-409e-97a7-9cbe6c4e3fe7","Type":"ContainerStarted","Data":"a80d053f641d9078e87590c495599a6fc3e5bc7a87fc9963bea576c8a55b7383"} Oct 07 08:19:06 crc kubenswrapper[5025]: I1007 08:19:06.681854 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pd2h6\" (UniqueName: 
\"kubernetes.io/projected/fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab-kube-api-access-pd2h6\") pod \"redhat-operators-rjn8v\" (UID: \"fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab\") " pod="openshift-marketplace/redhat-operators-rjn8v" Oct 07 08:19:06 crc kubenswrapper[5025]: I1007 08:19:06.692925 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-psjml" event={"ID":"fc8dbf8c-119b-4723-9479-ddad0026dcbd","Type":"ContainerStarted","Data":"d5eb2ba28e7fcb190974ab9d41bd389cf9dd1805fec9bfd4881d5f139f974531"} Oct 07 08:19:06 crc kubenswrapper[5025]: I1007 08:19:06.820167 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qzlm7" Oct 07 08:19:06 crc kubenswrapper[5025]: I1007 08:19:06.821363 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rjn8v" Oct 07 08:19:07 crc kubenswrapper[5025]: I1007 08:19:07.015252 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:19:07 crc kubenswrapper[5025]: I1007 08:19:07.147584 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dmcm5"] Oct 07 08:19:07 crc kubenswrapper[5025]: I1007 08:19:07.179904 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-rvr8c" Oct 07 08:19:07 crc kubenswrapper[5025]: I1007 08:19:07.349855 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rjn8v"] Oct 07 08:19:07 crc kubenswrapper[5025]: I1007 08:19:07.472792 5025 patch_prober.go:28] interesting pod/router-default-5444994796-pc6rv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 
08:19:07 crc kubenswrapper[5025]: [-]has-synced failed: reason withheld Oct 07 08:19:07 crc kubenswrapper[5025]: [+]process-running ok Oct 07 08:19:07 crc kubenswrapper[5025]: healthz check failed Oct 07 08:19:07 crc kubenswrapper[5025]: I1007 08:19:07.472847 5025 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pc6rv" podUID="16dbc3a0-2245-4de3-ab11-eb599a11f5fb" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 08:19:07 crc kubenswrapper[5025]: I1007 08:19:07.724332 5025 generic.go:334] "Generic (PLEG): container finished" podID="5e0afdb0-5693-4263-8c52-9cccc4878003" containerID="5003fc62d9356ac26851ac7b9274a3b2668561e940d4e15335b29bd472277629" exitCode=0 Oct 07 08:19:07 crc kubenswrapper[5025]: I1007 08:19:07.724765 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dmcm5" event={"ID":"5e0afdb0-5693-4263-8c52-9cccc4878003","Type":"ContainerDied","Data":"5003fc62d9356ac26851ac7b9274a3b2668561e940d4e15335b29bd472277629"} Oct 07 08:19:07 crc kubenswrapper[5025]: I1007 08:19:07.724791 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dmcm5" event={"ID":"5e0afdb0-5693-4263-8c52-9cccc4878003","Type":"ContainerStarted","Data":"c650fb5a134c739a32a27191d95aa682dbb8664ae47d242aced6b792329a0599"} Oct 07 08:19:07 crc kubenswrapper[5025]: I1007 08:19:07.732783 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rjn8v" event={"ID":"fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab","Type":"ContainerStarted","Data":"e712665d078f4246a597fa0cbd7683d696a31ee686f204f88390c3413caf77dd"} Oct 07 08:19:07 crc kubenswrapper[5025]: I1007 08:19:07.752109 5025 generic.go:334] "Generic (PLEG): container finished" podID="fc8dbf8c-119b-4723-9479-ddad0026dcbd" containerID="dee4266aa0428d88c1adce0cbd7a285a1a0dd74a263aa3dfa3fd1c44c323fd38" exitCode=0 Oct 07 08:19:07 
crc kubenswrapper[5025]: I1007 08:19:07.752203 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-psjml" event={"ID":"fc8dbf8c-119b-4723-9479-ddad0026dcbd","Type":"ContainerDied","Data":"dee4266aa0428d88c1adce0cbd7a285a1a0dd74a263aa3dfa3fd1c44c323fd38"} Oct 07 08:19:08 crc kubenswrapper[5025]: I1007 08:19:08.109653 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 08:19:08 crc kubenswrapper[5025]: I1007 08:19:08.183494 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/383d53df-cd53-4fe8-86f9-28cff5f20add-kube-api-access\") pod \"383d53df-cd53-4fe8-86f9-28cff5f20add\" (UID: \"383d53df-cd53-4fe8-86f9-28cff5f20add\") " Oct 07 08:19:08 crc kubenswrapper[5025]: I1007 08:19:08.183665 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/383d53df-cd53-4fe8-86f9-28cff5f20add-kubelet-dir\") pod \"383d53df-cd53-4fe8-86f9-28cff5f20add\" (UID: \"383d53df-cd53-4fe8-86f9-28cff5f20add\") " Oct 07 08:19:08 crc kubenswrapper[5025]: I1007 08:19:08.183889 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/383d53df-cd53-4fe8-86f9-28cff5f20add-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "383d53df-cd53-4fe8-86f9-28cff5f20add" (UID: "383d53df-cd53-4fe8-86f9-28cff5f20add"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 08:19:08 crc kubenswrapper[5025]: I1007 08:19:08.184116 5025 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/383d53df-cd53-4fe8-86f9-28cff5f20add-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 07 08:19:08 crc kubenswrapper[5025]: I1007 08:19:08.203875 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/383d53df-cd53-4fe8-86f9-28cff5f20add-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "383d53df-cd53-4fe8-86f9-28cff5f20add" (UID: "383d53df-cd53-4fe8-86f9-28cff5f20add"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:19:08 crc kubenswrapper[5025]: I1007 08:19:08.285810 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/383d53df-cd53-4fe8-86f9-28cff5f20add-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 07 08:19:08 crc kubenswrapper[5025]: I1007 08:19:08.473935 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-pc6rv" Oct 07 08:19:08 crc kubenswrapper[5025]: I1007 08:19:08.478484 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-pc6rv" Oct 07 08:19:08 crc kubenswrapper[5025]: I1007 08:19:08.784115 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"383d53df-cd53-4fe8-86f9-28cff5f20add","Type":"ContainerDied","Data":"43fa842fc10b1683c62d582c2595949ee344a5f97e35031dce60d80c4733dbd1"} Oct 07 08:19:08 crc kubenswrapper[5025]: I1007 08:19:08.784181 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43fa842fc10b1683c62d582c2595949ee344a5f97e35031dce60d80c4733dbd1" Oct 07 08:19:08 crc 
kubenswrapper[5025]: I1007 08:19:08.784257 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 08:19:08 crc kubenswrapper[5025]: I1007 08:19:08.794985 5025 generic.go:334] "Generic (PLEG): container finished" podID="fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab" containerID="77e77e38a24e645c9bde00bdba06a79137ef7dee754f3fc3d940d40169333253" exitCode=0 Oct 07 08:19:08 crc kubenswrapper[5025]: I1007 08:19:08.795752 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rjn8v" event={"ID":"fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab","Type":"ContainerDied","Data":"77e77e38a24e645c9bde00bdba06a79137ef7dee754f3fc3d940d40169333253"} Oct 07 08:19:08 crc kubenswrapper[5025]: I1007 08:19:08.874143 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 07 08:19:08 crc kubenswrapper[5025]: E1007 08:19:08.874758 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="383d53df-cd53-4fe8-86f9-28cff5f20add" containerName="pruner" Oct 07 08:19:08 crc kubenswrapper[5025]: I1007 08:19:08.874785 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="383d53df-cd53-4fe8-86f9-28cff5f20add" containerName="pruner" Oct 07 08:19:08 crc kubenswrapper[5025]: I1007 08:19:08.875086 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="383d53df-cd53-4fe8-86f9-28cff5f20add" containerName="pruner" Oct 07 08:19:08 crc kubenswrapper[5025]: I1007 08:19:08.876080 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 08:19:08 crc kubenswrapper[5025]: I1007 08:19:08.879155 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 07 08:19:08 crc kubenswrapper[5025]: I1007 08:19:08.879809 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 07 08:19:08 crc kubenswrapper[5025]: I1007 08:19:08.879990 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 07 08:19:08 crc kubenswrapper[5025]: I1007 08:19:08.900803 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c60c5e80-ce8e-4756-bea7-17508cfd3434-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"c60c5e80-ce8e-4756-bea7-17508cfd3434\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 08:19:08 crc kubenswrapper[5025]: I1007 08:19:08.900908 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c60c5e80-ce8e-4756-bea7-17508cfd3434-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"c60c5e80-ce8e-4756-bea7-17508cfd3434\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 08:19:09 crc kubenswrapper[5025]: I1007 08:19:09.001968 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c60c5e80-ce8e-4756-bea7-17508cfd3434-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"c60c5e80-ce8e-4756-bea7-17508cfd3434\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 08:19:09 crc kubenswrapper[5025]: I1007 08:19:09.002033 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/c60c5e80-ce8e-4756-bea7-17508cfd3434-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"c60c5e80-ce8e-4756-bea7-17508cfd3434\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 08:19:09 crc kubenswrapper[5025]: I1007 08:19:09.002116 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c60c5e80-ce8e-4756-bea7-17508cfd3434-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"c60c5e80-ce8e-4756-bea7-17508cfd3434\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 08:19:09 crc kubenswrapper[5025]: I1007 08:19:09.050755 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c60c5e80-ce8e-4756-bea7-17508cfd3434-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"c60c5e80-ce8e-4756-bea7-17508cfd3434\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 08:19:09 crc kubenswrapper[5025]: I1007 08:19:09.200121 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 08:19:09 crc kubenswrapper[5025]: I1007 08:19:09.523929 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 07 08:19:09 crc kubenswrapper[5025]: I1007 08:19:09.841167 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"c60c5e80-ce8e-4756-bea7-17508cfd3434","Type":"ContainerStarted","Data":"3e70d3000ed702309933ace947ba35fbf18b29dcbfc8172bbe250658e707735e"} Oct 07 08:19:10 crc kubenswrapper[5025]: I1007 08:19:10.857874 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"c60c5e80-ce8e-4756-bea7-17508cfd3434","Type":"ContainerStarted","Data":"764565cdaf2b5535efa13ccc513117b25aabe78a09401715c369ea16b6e48fa8"} Oct 07 08:19:10 crc kubenswrapper[5025]: I1007 08:19:10.885200 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.885166485 podStartE2EDuration="2.885166485s" podCreationTimestamp="2025-10-07 08:19:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:19:10.878443668 +0000 UTC m=+157.687757812" watchObservedRunningTime="2025-10-07 08:19:10.885166485 +0000 UTC m=+157.694480639" Oct 07 08:19:11 crc kubenswrapper[5025]: I1007 08:19:11.896483 5025 generic.go:334] "Generic (PLEG): container finished" podID="c60c5e80-ce8e-4756-bea7-17508cfd3434" containerID="764565cdaf2b5535efa13ccc513117b25aabe78a09401715c369ea16b6e48fa8" exitCode=0 Oct 07 08:19:11 crc kubenswrapper[5025]: I1007 08:19:11.896589 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"c60c5e80-ce8e-4756-bea7-17508cfd3434","Type":"ContainerDied","Data":"764565cdaf2b5535efa13ccc513117b25aabe78a09401715c369ea16b6e48fa8"} Oct 07 08:19:11 crc kubenswrapper[5025]: I1007 08:19:11.954816 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-795lz" Oct 07 08:19:16 crc kubenswrapper[5025]: I1007 08:19:16.410507 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-hmlpq" Oct 07 08:19:16 crc kubenswrapper[5025]: I1007 08:19:16.419692 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-hmlpq" Oct 07 08:19:16 crc kubenswrapper[5025]: I1007 08:19:16.449523 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-hd4xd" Oct 07 08:19:17 crc kubenswrapper[5025]: I1007 08:19:17.063868 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93fdeab4-b5d2-42d8-97ca-d5d61032e19f-metrics-certs\") pod \"network-metrics-daemon-f4ls7\" (UID: \"93fdeab4-b5d2-42d8-97ca-d5d61032e19f\") " pod="openshift-multus/network-metrics-daemon-f4ls7" Oct 07 08:19:17 crc kubenswrapper[5025]: I1007 08:19:17.072247 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93fdeab4-b5d2-42d8-97ca-d5d61032e19f-metrics-certs\") pod \"network-metrics-daemon-f4ls7\" (UID: \"93fdeab4-b5d2-42d8-97ca-d5d61032e19f\") " pod="openshift-multus/network-metrics-daemon-f4ls7" Oct 07 08:19:17 crc kubenswrapper[5025]: I1007 08:19:17.335279 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4ls7" Oct 07 08:19:24 crc kubenswrapper[5025]: I1007 08:19:24.445714 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 08:19:24 crc kubenswrapper[5025]: I1007 08:19:24.495054 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c60c5e80-ce8e-4756-bea7-17508cfd3434-kubelet-dir\") pod \"c60c5e80-ce8e-4756-bea7-17508cfd3434\" (UID: \"c60c5e80-ce8e-4756-bea7-17508cfd3434\") " Oct 07 08:19:24 crc kubenswrapper[5025]: I1007 08:19:24.495236 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c60c5e80-ce8e-4756-bea7-17508cfd3434-kube-api-access\") pod \"c60c5e80-ce8e-4756-bea7-17508cfd3434\" (UID: \"c60c5e80-ce8e-4756-bea7-17508cfd3434\") " Oct 07 08:19:24 crc kubenswrapper[5025]: I1007 08:19:24.495230 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c60c5e80-ce8e-4756-bea7-17508cfd3434-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c60c5e80-ce8e-4756-bea7-17508cfd3434" (UID: "c60c5e80-ce8e-4756-bea7-17508cfd3434"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 08:19:24 crc kubenswrapper[5025]: I1007 08:19:24.495679 5025 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c60c5e80-ce8e-4756-bea7-17508cfd3434-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 07 08:19:24 crc kubenswrapper[5025]: I1007 08:19:24.501598 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c60c5e80-ce8e-4756-bea7-17508cfd3434-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c60c5e80-ce8e-4756-bea7-17508cfd3434" (UID: "c60c5e80-ce8e-4756-bea7-17508cfd3434"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:19:24 crc kubenswrapper[5025]: I1007 08:19:24.518523 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:19:24 crc kubenswrapper[5025]: I1007 08:19:24.597127 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c60c5e80-ce8e-4756-bea7-17508cfd3434-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 07 08:19:25 crc kubenswrapper[5025]: I1007 08:19:25.019912 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"c60c5e80-ce8e-4756-bea7-17508cfd3434","Type":"ContainerDied","Data":"3e70d3000ed702309933ace947ba35fbf18b29dcbfc8172bbe250658e707735e"} Oct 07 08:19:25 crc kubenswrapper[5025]: I1007 08:19:25.019954 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e70d3000ed702309933ace947ba35fbf18b29dcbfc8172bbe250658e707735e" Oct 07 08:19:25 crc kubenswrapper[5025]: I1007 08:19:25.020083 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 08:19:25 crc kubenswrapper[5025]: I1007 08:19:25.934751 5025 patch_prober.go:28] interesting pod/machine-config-daemon-2dj2t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 08:19:25 crc kubenswrapper[5025]: I1007 08:19:25.934858 5025 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 08:19:36 crc kubenswrapper[5025]: I1007 08:19:36.497969 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sjgwg" Oct 07 08:19:38 crc kubenswrapper[5025]: E1007 08:19:38.321436 5025 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 07 08:19:38 crc kubenswrapper[5025]: E1007 08:19:38.322034 5025 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pd2h6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-rjn8v_openshift-marketplace(fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 08:19:38 crc kubenswrapper[5025]: E1007 08:19:38.323516 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-rjn8v" podUID="fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab" Oct 07 08:19:40 crc 
kubenswrapper[5025]: E1007 08:19:40.097077 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-rjn8v" podUID="fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab" Oct 07 08:19:40 crc kubenswrapper[5025]: E1007 08:19:40.162764 5025 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 07 08:19:40 crc kubenswrapper[5025]: E1007 08:19:40.163670 5025 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rbr8b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-6kl5d_openshift-marketplace(109518fe-ba3b-4b94-bbf3-64c30e50f253): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 08:19:40 crc kubenswrapper[5025]: E1007 08:19:40.164904 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-6kl5d" podUID="109518fe-ba3b-4b94-bbf3-64c30e50f253" Oct 07 08:19:40 crc 
kubenswrapper[5025]: E1007 08:19:40.196983 5025 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 07 08:19:40 crc kubenswrapper[5025]: E1007 08:19:40.197194 5025 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lhd5d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-dmcm5_openshift-marketplace(5e0afdb0-5693-4263-8c52-9cccc4878003): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 08:19:40 crc kubenswrapper[5025]: E1007 08:19:40.199156 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-dmcm5" podUID="5e0afdb0-5693-4263-8c52-9cccc4878003" Oct 07 08:19:41 crc kubenswrapper[5025]: E1007 08:19:41.387288 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-6kl5d" podUID="109518fe-ba3b-4b94-bbf3-64c30e50f253" Oct 07 08:19:41 crc kubenswrapper[5025]: E1007 08:19:41.387425 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-dmcm5" podUID="5e0afdb0-5693-4263-8c52-9cccc4878003" Oct 07 08:19:41 crc kubenswrapper[5025]: E1007 08:19:41.473222 5025 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 07 08:19:41 crc kubenswrapper[5025]: E1007 08:19:41.473410 5025 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7nzdk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-z7c4f_openshift-marketplace(11ec0e09-2a87-483f-bdac-73b3f3995a8a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 08:19:41 crc kubenswrapper[5025]: E1007 08:19:41.474671 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: 
code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-z7c4f" podUID="11ec0e09-2a87-483f-bdac-73b3f3995a8a" Oct 07 08:19:41 crc kubenswrapper[5025]: E1007 08:19:41.517504 5025 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 07 08:19:41 crc kubenswrapper[5025]: E1007 08:19:41.517760 5025 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qjbf8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFro
mSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-lp7dp_openshift-marketplace(77520001-a1f9-4018-bc4c-2964849da6c7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 08:19:41 crc kubenswrapper[5025]: E1007 08:19:41.518926 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-lp7dp" podUID="77520001-a1f9-4018-bc4c-2964849da6c7" Oct 07 08:19:41 crc kubenswrapper[5025]: I1007 08:19:41.963384 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 08:19:44 crc kubenswrapper[5025]: E1007 08:19:44.325232 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-lp7dp" podUID="77520001-a1f9-4018-bc4c-2964849da6c7" Oct 07 08:19:44 crc kubenswrapper[5025]: E1007 08:19:44.325274 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-z7c4f" podUID="11ec0e09-2a87-483f-bdac-73b3f3995a8a" Oct 07 08:19:44 crc kubenswrapper[5025]: I1007 08:19:44.758285 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-f4ls7"] Oct 07 08:19:45 
crc kubenswrapper[5025]: E1007 08:19:45.258533 5025 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 07 08:19:45 crc kubenswrapper[5025]: E1007 08:19:45.259234 5025 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7d2lk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-szdzh_openshift-marketplace(79dcab45-e91e-409e-97a7-9cbe6c4e3fe7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 08:19:45 crc kubenswrapper[5025]: E1007 08:19:45.260672 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-szdzh" podUID="79dcab45-e91e-409e-97a7-9cbe6c4e3fe7" Oct 07 08:19:45 crc kubenswrapper[5025]: E1007 08:19:45.272394 5025 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 07 08:19:45 crc kubenswrapper[5025]: E1007 08:19:45.272725 5025 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sz9z4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-psjml_openshift-marketplace(fc8dbf8c-119b-4723-9479-ddad0026dcbd): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 08:19:45 crc kubenswrapper[5025]: E1007 08:19:45.274011 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-psjml" podUID="fc8dbf8c-119b-4723-9479-ddad0026dcbd" Oct 07 08:19:46 crc 
kubenswrapper[5025]: I1007 08:19:46.157525 5025 generic.go:334] "Generic (PLEG): container finished" podID="d30db05d-5f5e-4586-92de-f877c5a64600" containerID="f1ca9f53db187e20d84423cafe4920b0c9de1c83ea0a8ea5b22e5caa1e970e63" exitCode=0 Oct 07 08:19:46 crc kubenswrapper[5025]: I1007 08:19:46.157700 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wx6rt" event={"ID":"d30db05d-5f5e-4586-92de-f877c5a64600","Type":"ContainerDied","Data":"f1ca9f53db187e20d84423cafe4920b0c9de1c83ea0a8ea5b22e5caa1e970e63"} Oct 07 08:19:46 crc kubenswrapper[5025]: I1007 08:19:46.162125 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-f4ls7" event={"ID":"93fdeab4-b5d2-42d8-97ca-d5d61032e19f","Type":"ContainerStarted","Data":"380f34e16028ec21b15639ad10cbeb3d6c8f5f9bfecff969b55b4774cda7328c"} Oct 07 08:19:46 crc kubenswrapper[5025]: I1007 08:19:46.162227 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-f4ls7" event={"ID":"93fdeab4-b5d2-42d8-97ca-d5d61032e19f","Type":"ContainerStarted","Data":"d089dd5c0df672dbc3a4fb745f30e7e99ef61767e334bf4ec9a0119175ef7d0d"} Oct 07 08:19:46 crc kubenswrapper[5025]: I1007 08:19:46.162253 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-f4ls7" event={"ID":"93fdeab4-b5d2-42d8-97ca-d5d61032e19f","Type":"ContainerStarted","Data":"0ae7d1e730a77c75a63d4c19b6d0c1e17d91be4e79b3d1e5015334851443c8c5"} Oct 07 08:19:46 crc kubenswrapper[5025]: E1007 08:19:46.165266 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-szdzh" podUID="79dcab45-e91e-409e-97a7-9cbe6c4e3fe7" Oct 07 08:19:46 crc kubenswrapper[5025]: E1007 08:19:46.165284 5025 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-psjml" podUID="fc8dbf8c-119b-4723-9479-ddad0026dcbd" Oct 07 08:19:46 crc kubenswrapper[5025]: I1007 08:19:46.287080 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-f4ls7" podStartSLOduration=172.287051163 podStartE2EDuration="2m52.287051163s" podCreationTimestamp="2025-10-07 08:16:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:19:46.278588511 +0000 UTC m=+193.087902655" watchObservedRunningTime="2025-10-07 08:19:46.287051163 +0000 UTC m=+193.096365307" Oct 07 08:19:47 crc kubenswrapper[5025]: I1007 08:19:47.174567 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wx6rt" event={"ID":"d30db05d-5f5e-4586-92de-f877c5a64600","Type":"ContainerStarted","Data":"09060a5fe7d26093e99ed6c54f73bc29e81d5b20322e249ab7e38ba89243c0ef"} Oct 07 08:19:47 crc kubenswrapper[5025]: I1007 08:19:47.214840 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wx6rt" podStartSLOduration=3.096336681 podStartE2EDuration="44.214809065s" podCreationTimestamp="2025-10-07 08:19:03 +0000 UTC" firstStartedPulling="2025-10-07 08:19:05.536151371 +0000 UTC m=+152.345465535" lastFinishedPulling="2025-10-07 08:19:46.654623775 +0000 UTC m=+193.463937919" observedRunningTime="2025-10-07 08:19:47.211065344 +0000 UTC m=+194.020379488" watchObservedRunningTime="2025-10-07 08:19:47.214809065 +0000 UTC m=+194.024123249" Oct 07 08:19:53 crc kubenswrapper[5025]: I1007 08:19:53.473380 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-wx6rt" Oct 07 08:19:53 crc kubenswrapper[5025]: I1007 08:19:53.474202 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wx6rt" Oct 07 08:19:54 crc kubenswrapper[5025]: I1007 08:19:54.483689 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wx6rt" Oct 07 08:19:54 crc kubenswrapper[5025]: I1007 08:19:54.533695 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wx6rt" Oct 07 08:19:55 crc kubenswrapper[5025]: I1007 08:19:55.934655 5025 patch_prober.go:28] interesting pod/machine-config-daemon-2dj2t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 08:19:55 crc kubenswrapper[5025]: I1007 08:19:55.935341 5025 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 08:19:55 crc kubenswrapper[5025]: I1007 08:19:55.935422 5025 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" Oct 07 08:19:55 crc kubenswrapper[5025]: I1007 08:19:55.936572 5025 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"54b2a654066f14b0433198403ba42b83c61922de1055c94ba1d38aad9e0c2821"} pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" containerMessage="Container machine-config-daemon failed liveness probe, 
will be restarted" Oct 07 08:19:55 crc kubenswrapper[5025]: I1007 08:19:55.936730 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" containerID="cri-o://54b2a654066f14b0433198403ba42b83c61922de1055c94ba1d38aad9e0c2821" gracePeriod=600 Oct 07 08:19:56 crc kubenswrapper[5025]: I1007 08:19:56.246956 5025 generic.go:334] "Generic (PLEG): container finished" podID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerID="54b2a654066f14b0433198403ba42b83c61922de1055c94ba1d38aad9e0c2821" exitCode=0 Oct 07 08:19:56 crc kubenswrapper[5025]: I1007 08:19:56.247006 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" event={"ID":"b4849c41-22e1-400e-8e11-096da49ef1b2","Type":"ContainerDied","Data":"54b2a654066f14b0433198403ba42b83c61922de1055c94ba1d38aad9e0c2821"} Oct 07 08:19:57 crc kubenswrapper[5025]: I1007 08:19:57.256660 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6kl5d" event={"ID":"109518fe-ba3b-4b94-bbf3-64c30e50f253","Type":"ContainerStarted","Data":"813398c4acd6ed56436eb8346de27977ac0b96ea9d4a32d1074fb8b77e1c0e42"} Oct 07 08:19:57 crc kubenswrapper[5025]: I1007 08:19:57.260881 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" event={"ID":"b4849c41-22e1-400e-8e11-096da49ef1b2","Type":"ContainerStarted","Data":"b2ddfe2fe8e8b2bfab09bd0fa5fc060581707876ce97a518502d1ad38c21cc8d"} Oct 07 08:19:57 crc kubenswrapper[5025]: I1007 08:19:57.265311 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dmcm5" event={"ID":"5e0afdb0-5693-4263-8c52-9cccc4878003","Type":"ContainerStarted","Data":"3c5292ace7ebc0aaf2061218233e667a58b84d8a12dd605e9523f1cab9b0c67e"} Oct 07 
08:19:57 crc kubenswrapper[5025]: I1007 08:19:57.267842 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rjn8v" event={"ID":"fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab","Type":"ContainerStarted","Data":"87c7c7a6eea9eb1545ee578977764b4c835af6cde0c0d6ca270ae4b2bd105d40"} Oct 07 08:19:58 crc kubenswrapper[5025]: I1007 08:19:58.277881 5025 generic.go:334] "Generic (PLEG): container finished" podID="fc8dbf8c-119b-4723-9479-ddad0026dcbd" containerID="74803f8b4ada0eed0fba54ca5e13d93664fd27100d5af177def181c2ae63f202" exitCode=0 Oct 07 08:19:58 crc kubenswrapper[5025]: I1007 08:19:58.278351 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-psjml" event={"ID":"fc8dbf8c-119b-4723-9479-ddad0026dcbd","Type":"ContainerDied","Data":"74803f8b4ada0eed0fba54ca5e13d93664fd27100d5af177def181c2ae63f202"} Oct 07 08:19:58 crc kubenswrapper[5025]: I1007 08:19:58.282481 5025 generic.go:334] "Generic (PLEG): container finished" podID="109518fe-ba3b-4b94-bbf3-64c30e50f253" containerID="813398c4acd6ed56436eb8346de27977ac0b96ea9d4a32d1074fb8b77e1c0e42" exitCode=0 Oct 07 08:19:58 crc kubenswrapper[5025]: I1007 08:19:58.282621 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6kl5d" event={"ID":"109518fe-ba3b-4b94-bbf3-64c30e50f253","Type":"ContainerDied","Data":"813398c4acd6ed56436eb8346de27977ac0b96ea9d4a32d1074fb8b77e1c0e42"} Oct 07 08:19:58 crc kubenswrapper[5025]: I1007 08:19:58.285273 5025 generic.go:334] "Generic (PLEG): container finished" podID="5e0afdb0-5693-4263-8c52-9cccc4878003" containerID="3c5292ace7ebc0aaf2061218233e667a58b84d8a12dd605e9523f1cab9b0c67e" exitCode=0 Oct 07 08:19:58 crc kubenswrapper[5025]: I1007 08:19:58.285356 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dmcm5" 
event={"ID":"5e0afdb0-5693-4263-8c52-9cccc4878003","Type":"ContainerDied","Data":"3c5292ace7ebc0aaf2061218233e667a58b84d8a12dd605e9523f1cab9b0c67e"} Oct 07 08:19:58 crc kubenswrapper[5025]: I1007 08:19:58.293978 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lp7dp" event={"ID":"77520001-a1f9-4018-bc4c-2964849da6c7","Type":"ContainerStarted","Data":"2ab5a472881e02e79700a12b53ed4bb3543c297e8b8ba3348b59523facf5eaf7"} Oct 07 08:19:58 crc kubenswrapper[5025]: I1007 08:19:58.297921 5025 generic.go:334] "Generic (PLEG): container finished" podID="fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab" containerID="87c7c7a6eea9eb1545ee578977764b4c835af6cde0c0d6ca270ae4b2bd105d40" exitCode=0 Oct 07 08:19:58 crc kubenswrapper[5025]: I1007 08:19:58.299248 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rjn8v" event={"ID":"fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab","Type":"ContainerDied","Data":"87c7c7a6eea9eb1545ee578977764b4c835af6cde0c0d6ca270ae4b2bd105d40"} Oct 07 08:19:59 crc kubenswrapper[5025]: I1007 08:19:59.329325 5025 generic.go:334] "Generic (PLEG): container finished" podID="11ec0e09-2a87-483f-bdac-73b3f3995a8a" containerID="229700c9b4b788d7bcbafb5b240c08a2e9481b2d541c992ebb133d29afe990a9" exitCode=0 Oct 07 08:19:59 crc kubenswrapper[5025]: I1007 08:19:59.329447 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z7c4f" event={"ID":"11ec0e09-2a87-483f-bdac-73b3f3995a8a","Type":"ContainerDied","Data":"229700c9b4b788d7bcbafb5b240c08a2e9481b2d541c992ebb133d29afe990a9"} Oct 07 08:19:59 crc kubenswrapper[5025]: I1007 08:19:59.337589 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dmcm5" event={"ID":"5e0afdb0-5693-4263-8c52-9cccc4878003","Type":"ContainerStarted","Data":"6f90777253ef80e24d5bd4bcf6bbb99872d316a8983be60f39b710a3a9558443"} Oct 07 08:19:59 crc kubenswrapper[5025]: I1007 
08:19:59.340393 5025 generic.go:334] "Generic (PLEG): container finished" podID="77520001-a1f9-4018-bc4c-2964849da6c7" containerID="2ab5a472881e02e79700a12b53ed4bb3543c297e8b8ba3348b59523facf5eaf7" exitCode=0 Oct 07 08:19:59 crc kubenswrapper[5025]: I1007 08:19:59.340444 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lp7dp" event={"ID":"77520001-a1f9-4018-bc4c-2964849da6c7","Type":"ContainerDied","Data":"2ab5a472881e02e79700a12b53ed4bb3543c297e8b8ba3348b59523facf5eaf7"} Oct 07 08:19:59 crc kubenswrapper[5025]: I1007 08:19:59.345349 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-psjml" event={"ID":"fc8dbf8c-119b-4723-9479-ddad0026dcbd","Type":"ContainerStarted","Data":"6631d7c142923026b32b7e2c192ee7696b607848f9841364218559237c32a47f"} Oct 07 08:19:59 crc kubenswrapper[5025]: I1007 08:19:59.393437 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-psjml" podStartSLOduration=3.4464974 podStartE2EDuration="54.393419331s" podCreationTimestamp="2025-10-07 08:19:05 +0000 UTC" firstStartedPulling="2025-10-07 08:19:07.75494556 +0000 UTC m=+154.564259704" lastFinishedPulling="2025-10-07 08:19:58.701867461 +0000 UTC m=+205.511181635" observedRunningTime="2025-10-07 08:19:59.391651519 +0000 UTC m=+206.200965663" watchObservedRunningTime="2025-10-07 08:19:59.393419331 +0000 UTC m=+206.202733475" Oct 07 08:19:59 crc kubenswrapper[5025]: I1007 08:19:59.410792 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dmcm5" podStartSLOduration=2.422453833 podStartE2EDuration="53.410768048s" podCreationTimestamp="2025-10-07 08:19:06 +0000 UTC" firstStartedPulling="2025-10-07 08:19:07.726720951 +0000 UTC m=+154.536035095" lastFinishedPulling="2025-10-07 08:19:58.715035136 +0000 UTC m=+205.524349310" observedRunningTime="2025-10-07 08:19:59.409605654 
+0000 UTC m=+206.218919798" watchObservedRunningTime="2025-10-07 08:19:59.410768048 +0000 UTC m=+206.220082182" Oct 07 08:20:00 crc kubenswrapper[5025]: I1007 08:20:00.354223 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rjn8v" event={"ID":"fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab","Type":"ContainerStarted","Data":"0d376c90fe6afa24f786d87bfa854dbaff3ff7d7ff410c5b71ca22152640d7f9"} Oct 07 08:20:00 crc kubenswrapper[5025]: I1007 08:20:00.356909 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6kl5d" event={"ID":"109518fe-ba3b-4b94-bbf3-64c30e50f253","Type":"ContainerStarted","Data":"6185383735bf37cf6924278a9afe975873fd5b6cacc80db63d016512c6258234"} Oct 07 08:20:00 crc kubenswrapper[5025]: I1007 08:20:00.359871 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z7c4f" event={"ID":"11ec0e09-2a87-483f-bdac-73b3f3995a8a","Type":"ContainerStarted","Data":"d8ca55e8414d00dfb59899e7c9d67450777bf9c99112226f0d72698c1e50e287"} Oct 07 08:20:00 crc kubenswrapper[5025]: I1007 08:20:00.362349 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lp7dp" event={"ID":"77520001-a1f9-4018-bc4c-2964849da6c7","Type":"ContainerStarted","Data":"77dc24d564332e2b13196ea8aa063fec0d7401bd2bd892370631df760cb89e7e"} Oct 07 08:20:00 crc kubenswrapper[5025]: I1007 08:20:00.380835 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rjn8v" podStartSLOduration=3.881250621 podStartE2EDuration="54.380810614s" podCreationTimestamp="2025-10-07 08:19:06 +0000 UTC" firstStartedPulling="2025-10-07 08:19:08.814027913 +0000 UTC m=+155.623342057" lastFinishedPulling="2025-10-07 08:19:59.313587906 +0000 UTC m=+206.122902050" observedRunningTime="2025-10-07 08:20:00.378839716 +0000 UTC m=+207.188153860" watchObservedRunningTime="2025-10-07 
08:20:00.380810614 +0000 UTC m=+207.190124758" Oct 07 08:20:00 crc kubenswrapper[5025]: I1007 08:20:00.418799 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6kl5d" podStartSLOduration=3.75889352 podStartE2EDuration="57.418778405s" podCreationTimestamp="2025-10-07 08:19:03 +0000 UTC" firstStartedPulling="2025-10-07 08:19:05.583448265 +0000 UTC m=+152.392762409" lastFinishedPulling="2025-10-07 08:19:59.24333312 +0000 UTC m=+206.052647294" observedRunningTime="2025-10-07 08:20:00.418693653 +0000 UTC m=+207.228007827" watchObservedRunningTime="2025-10-07 08:20:00.418778405 +0000 UTC m=+207.228092549" Oct 07 08:20:00 crc kubenswrapper[5025]: I1007 08:20:00.420413 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-z7c4f" podStartSLOduration=2.971667098 podStartE2EDuration="57.420403233s" podCreationTimestamp="2025-10-07 08:19:03 +0000 UTC" firstStartedPulling="2025-10-07 08:19:05.518048728 +0000 UTC m=+152.327362872" lastFinishedPulling="2025-10-07 08:19:59.966784863 +0000 UTC m=+206.776099007" observedRunningTime="2025-10-07 08:20:00.398841832 +0000 UTC m=+207.208155986" watchObservedRunningTime="2025-10-07 08:20:00.420403233 +0000 UTC m=+207.229717377" Oct 07 08:20:00 crc kubenswrapper[5025]: I1007 08:20:00.442619 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lp7dp" podStartSLOduration=4.139366495 podStartE2EDuration="58.442599212s" podCreationTimestamp="2025-10-07 08:19:02 +0000 UTC" firstStartedPulling="2025-10-07 08:19:05.552631372 +0000 UTC m=+152.361945516" lastFinishedPulling="2025-10-07 08:19:59.855864089 +0000 UTC m=+206.665178233" observedRunningTime="2025-10-07 08:20:00.437149063 +0000 UTC m=+207.246463217" watchObservedRunningTime="2025-10-07 08:20:00.442599212 +0000 UTC m=+207.251913356" Oct 07 08:20:01 crc kubenswrapper[5025]: I1007 
08:20:01.372826 5025 generic.go:334] "Generic (PLEG): container finished" podID="79dcab45-e91e-409e-97a7-9cbe6c4e3fe7" containerID="ce935b63ea833cae4a9fe9ab0132e0c6febaf0a643d3502e0920f27e576b6c37" exitCode=0 Oct 07 08:20:01 crc kubenswrapper[5025]: I1007 08:20:01.373000 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-szdzh" event={"ID":"79dcab45-e91e-409e-97a7-9cbe6c4e3fe7","Type":"ContainerDied","Data":"ce935b63ea833cae4a9fe9ab0132e0c6febaf0a643d3502e0920f27e576b6c37"} Oct 07 08:20:02 crc kubenswrapper[5025]: I1007 08:20:02.383514 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-szdzh" event={"ID":"79dcab45-e91e-409e-97a7-9cbe6c4e3fe7","Type":"ContainerStarted","Data":"0cfd87fd11d88961a1837ffe88d4c7726d618dab58ff9eea21266a17382572f2"} Oct 07 08:20:03 crc kubenswrapper[5025]: I1007 08:20:03.237021 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lp7dp" Oct 07 08:20:03 crc kubenswrapper[5025]: I1007 08:20:03.237453 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lp7dp" Oct 07 08:20:03 crc kubenswrapper[5025]: I1007 08:20:03.296683 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lp7dp" Oct 07 08:20:03 crc kubenswrapper[5025]: I1007 08:20:03.325273 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-szdzh" podStartSLOduration=3.111876968 podStartE2EDuration="58.325243736s" podCreationTimestamp="2025-10-07 08:19:05 +0000 UTC" firstStartedPulling="2025-10-07 08:19:06.682440874 +0000 UTC m=+153.491755018" lastFinishedPulling="2025-10-07 08:20:01.895807632 +0000 UTC m=+208.705121786" observedRunningTime="2025-10-07 08:20:02.406267864 +0000 UTC m=+209.215582008" 
watchObservedRunningTime="2025-10-07 08:20:03.325243736 +0000 UTC m=+210.134557900" Oct 07 08:20:03 crc kubenswrapper[5025]: I1007 08:20:03.583332 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-z7c4f" Oct 07 08:20:03 crc kubenswrapper[5025]: I1007 08:20:03.583704 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-z7c4f" Oct 07 08:20:03 crc kubenswrapper[5025]: I1007 08:20:03.625528 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-z7c4f" Oct 07 08:20:03 crc kubenswrapper[5025]: I1007 08:20:03.796073 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6kl5d" Oct 07 08:20:03 crc kubenswrapper[5025]: I1007 08:20:03.796154 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6kl5d" Oct 07 08:20:03 crc kubenswrapper[5025]: I1007 08:20:03.841245 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6kl5d" Oct 07 08:20:04 crc kubenswrapper[5025]: I1007 08:20:04.442197 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6kl5d" Oct 07 08:20:05 crc kubenswrapper[5025]: I1007 08:20:05.371338 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-szdzh" Oct 07 08:20:05 crc kubenswrapper[5025]: I1007 08:20:05.371436 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-szdzh" Oct 07 08:20:05 crc kubenswrapper[5025]: I1007 08:20:05.432807 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-szdzh" Oct 07 08:20:05 
crc kubenswrapper[5025]: I1007 08:20:05.448985 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-z7c4f" Oct 07 08:20:05 crc kubenswrapper[5025]: I1007 08:20:05.765635 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-psjml" Oct 07 08:20:05 crc kubenswrapper[5025]: I1007 08:20:05.765706 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-psjml" Oct 07 08:20:05 crc kubenswrapper[5025]: I1007 08:20:05.809948 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-psjml" Oct 07 08:20:06 crc kubenswrapper[5025]: I1007 08:20:06.365469 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dmcm5" Oct 07 08:20:06 crc kubenswrapper[5025]: I1007 08:20:06.365556 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dmcm5" Oct 07 08:20:06 crc kubenswrapper[5025]: I1007 08:20:06.415622 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dmcm5" Oct 07 08:20:06 crc kubenswrapper[5025]: I1007 08:20:06.453740 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-psjml" Oct 07 08:20:06 crc kubenswrapper[5025]: I1007 08:20:06.455819 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dmcm5" Oct 07 08:20:06 crc kubenswrapper[5025]: I1007 08:20:06.821734 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rjn8v" Oct 07 08:20:06 crc kubenswrapper[5025]: I1007 08:20:06.822629 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/redhat-operators-rjn8v" Oct 07 08:20:06 crc kubenswrapper[5025]: I1007 08:20:06.891384 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rjn8v" Oct 07 08:20:07 crc kubenswrapper[5025]: I1007 08:20:07.458651 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rjn8v" Oct 07 08:20:07 crc kubenswrapper[5025]: I1007 08:20:07.531285 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6kl5d"] Oct 07 08:20:07 crc kubenswrapper[5025]: I1007 08:20:07.531551 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6kl5d" podUID="109518fe-ba3b-4b94-bbf3-64c30e50f253" containerName="registry-server" containerID="cri-o://6185383735bf37cf6924278a9afe975873fd5b6cacc80db63d016512c6258234" gracePeriod=2 Oct 07 08:20:08 crc kubenswrapper[5025]: I1007 08:20:08.126906 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-z7c4f"] Oct 07 08:20:08 crc kubenswrapper[5025]: I1007 08:20:08.127533 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-z7c4f" podUID="11ec0e09-2a87-483f-bdac-73b3f3995a8a" containerName="registry-server" containerID="cri-o://d8ca55e8414d00dfb59899e7c9d67450777bf9c99112226f0d72698c1e50e287" gracePeriod=2 Oct 07 08:20:08 crc kubenswrapper[5025]: I1007 08:20:08.413048 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6kl5d" Oct 07 08:20:08 crc kubenswrapper[5025]: I1007 08:20:08.421412 5025 generic.go:334] "Generic (PLEG): container finished" podID="109518fe-ba3b-4b94-bbf3-64c30e50f253" containerID="6185383735bf37cf6924278a9afe975873fd5b6cacc80db63d016512c6258234" exitCode=0 Oct 07 08:20:08 crc kubenswrapper[5025]: I1007 08:20:08.421479 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6kl5d" event={"ID":"109518fe-ba3b-4b94-bbf3-64c30e50f253","Type":"ContainerDied","Data":"6185383735bf37cf6924278a9afe975873fd5b6cacc80db63d016512c6258234"} Oct 07 08:20:08 crc kubenswrapper[5025]: I1007 08:20:08.421512 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6kl5d" event={"ID":"109518fe-ba3b-4b94-bbf3-64c30e50f253","Type":"ContainerDied","Data":"84e5a0982ed17c9c632b6dd00a40ba2e327496337e61f4d60c877852a1721edb"} Oct 07 08:20:08 crc kubenswrapper[5025]: I1007 08:20:08.421529 5025 scope.go:117] "RemoveContainer" containerID="6185383735bf37cf6924278a9afe975873fd5b6cacc80db63d016512c6258234" Oct 07 08:20:08 crc kubenswrapper[5025]: I1007 08:20:08.421680 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6kl5d" Oct 07 08:20:08 crc kubenswrapper[5025]: I1007 08:20:08.443897 5025 generic.go:334] "Generic (PLEG): container finished" podID="11ec0e09-2a87-483f-bdac-73b3f3995a8a" containerID="d8ca55e8414d00dfb59899e7c9d67450777bf9c99112226f0d72698c1e50e287" exitCode=0 Oct 07 08:20:08 crc kubenswrapper[5025]: I1007 08:20:08.444656 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z7c4f" event={"ID":"11ec0e09-2a87-483f-bdac-73b3f3995a8a","Type":"ContainerDied","Data":"d8ca55e8414d00dfb59899e7c9d67450777bf9c99112226f0d72698c1e50e287"} Oct 07 08:20:08 crc kubenswrapper[5025]: I1007 08:20:08.446586 5025 scope.go:117] "RemoveContainer" containerID="813398c4acd6ed56436eb8346de27977ac0b96ea9d4a32d1074fb8b77e1c0e42" Oct 07 08:20:08 crc kubenswrapper[5025]: I1007 08:20:08.468288 5025 scope.go:117] "RemoveContainer" containerID="269f00e7d2c23063af3bb5f6e847224e07d1743bf586b055a4885b8700433d2c" Oct 07 08:20:08 crc kubenswrapper[5025]: I1007 08:20:08.480586 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-z7c4f" Oct 07 08:20:08 crc kubenswrapper[5025]: I1007 08:20:08.490796 5025 scope.go:117] "RemoveContainer" containerID="6185383735bf37cf6924278a9afe975873fd5b6cacc80db63d016512c6258234" Oct 07 08:20:08 crc kubenswrapper[5025]: E1007 08:20:08.491222 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6185383735bf37cf6924278a9afe975873fd5b6cacc80db63d016512c6258234\": container with ID starting with 6185383735bf37cf6924278a9afe975873fd5b6cacc80db63d016512c6258234 not found: ID does not exist" containerID="6185383735bf37cf6924278a9afe975873fd5b6cacc80db63d016512c6258234" Oct 07 08:20:08 crc kubenswrapper[5025]: I1007 08:20:08.491265 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6185383735bf37cf6924278a9afe975873fd5b6cacc80db63d016512c6258234"} err="failed to get container status \"6185383735bf37cf6924278a9afe975873fd5b6cacc80db63d016512c6258234\": rpc error: code = NotFound desc = could not find container \"6185383735bf37cf6924278a9afe975873fd5b6cacc80db63d016512c6258234\": container with ID starting with 6185383735bf37cf6924278a9afe975873fd5b6cacc80db63d016512c6258234 not found: ID does not exist" Oct 07 08:20:08 crc kubenswrapper[5025]: I1007 08:20:08.491292 5025 scope.go:117] "RemoveContainer" containerID="813398c4acd6ed56436eb8346de27977ac0b96ea9d4a32d1074fb8b77e1c0e42" Oct 07 08:20:08 crc kubenswrapper[5025]: E1007 08:20:08.491735 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"813398c4acd6ed56436eb8346de27977ac0b96ea9d4a32d1074fb8b77e1c0e42\": container with ID starting with 813398c4acd6ed56436eb8346de27977ac0b96ea9d4a32d1074fb8b77e1c0e42 not found: ID does not exist" containerID="813398c4acd6ed56436eb8346de27977ac0b96ea9d4a32d1074fb8b77e1c0e42" Oct 07 08:20:08 crc 
kubenswrapper[5025]: I1007 08:20:08.491760 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"813398c4acd6ed56436eb8346de27977ac0b96ea9d4a32d1074fb8b77e1c0e42"} err="failed to get container status \"813398c4acd6ed56436eb8346de27977ac0b96ea9d4a32d1074fb8b77e1c0e42\": rpc error: code = NotFound desc = could not find container \"813398c4acd6ed56436eb8346de27977ac0b96ea9d4a32d1074fb8b77e1c0e42\": container with ID starting with 813398c4acd6ed56436eb8346de27977ac0b96ea9d4a32d1074fb8b77e1c0e42 not found: ID does not exist" Oct 07 08:20:08 crc kubenswrapper[5025]: I1007 08:20:08.491774 5025 scope.go:117] "RemoveContainer" containerID="269f00e7d2c23063af3bb5f6e847224e07d1743bf586b055a4885b8700433d2c" Oct 07 08:20:08 crc kubenswrapper[5025]: E1007 08:20:08.492118 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"269f00e7d2c23063af3bb5f6e847224e07d1743bf586b055a4885b8700433d2c\": container with ID starting with 269f00e7d2c23063af3bb5f6e847224e07d1743bf586b055a4885b8700433d2c not found: ID does not exist" containerID="269f00e7d2c23063af3bb5f6e847224e07d1743bf586b055a4885b8700433d2c" Oct 07 08:20:08 crc kubenswrapper[5025]: I1007 08:20:08.492143 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"269f00e7d2c23063af3bb5f6e847224e07d1743bf586b055a4885b8700433d2c"} err="failed to get container status \"269f00e7d2c23063af3bb5f6e847224e07d1743bf586b055a4885b8700433d2c\": rpc error: code = NotFound desc = could not find container \"269f00e7d2c23063af3bb5f6e847224e07d1743bf586b055a4885b8700433d2c\": container with ID starting with 269f00e7d2c23063af3bb5f6e847224e07d1743bf586b055a4885b8700433d2c not found: ID does not exist" Oct 07 08:20:08 crc kubenswrapper[5025]: I1007 08:20:08.532839 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/109518fe-ba3b-4b94-bbf3-64c30e50f253-utilities\") pod \"109518fe-ba3b-4b94-bbf3-64c30e50f253\" (UID: \"109518fe-ba3b-4b94-bbf3-64c30e50f253\") " Oct 07 08:20:08 crc kubenswrapper[5025]: I1007 08:20:08.532954 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbr8b\" (UniqueName: \"kubernetes.io/projected/109518fe-ba3b-4b94-bbf3-64c30e50f253-kube-api-access-rbr8b\") pod \"109518fe-ba3b-4b94-bbf3-64c30e50f253\" (UID: \"109518fe-ba3b-4b94-bbf3-64c30e50f253\") " Oct 07 08:20:08 crc kubenswrapper[5025]: I1007 08:20:08.533012 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11ec0e09-2a87-483f-bdac-73b3f3995a8a-catalog-content\") pod \"11ec0e09-2a87-483f-bdac-73b3f3995a8a\" (UID: \"11ec0e09-2a87-483f-bdac-73b3f3995a8a\") " Oct 07 08:20:08 crc kubenswrapper[5025]: I1007 08:20:08.533061 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/109518fe-ba3b-4b94-bbf3-64c30e50f253-catalog-content\") pod \"109518fe-ba3b-4b94-bbf3-64c30e50f253\" (UID: \"109518fe-ba3b-4b94-bbf3-64c30e50f253\") " Oct 07 08:20:08 crc kubenswrapper[5025]: I1007 08:20:08.533103 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11ec0e09-2a87-483f-bdac-73b3f3995a8a-utilities\") pod \"11ec0e09-2a87-483f-bdac-73b3f3995a8a\" (UID: \"11ec0e09-2a87-483f-bdac-73b3f3995a8a\") " Oct 07 08:20:08 crc kubenswrapper[5025]: I1007 08:20:08.533143 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nzdk\" (UniqueName: \"kubernetes.io/projected/11ec0e09-2a87-483f-bdac-73b3f3995a8a-kube-api-access-7nzdk\") pod \"11ec0e09-2a87-483f-bdac-73b3f3995a8a\" (UID: \"11ec0e09-2a87-483f-bdac-73b3f3995a8a\") " Oct 07 08:20:08 crc 
kubenswrapper[5025]: I1007 08:20:08.533796 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/109518fe-ba3b-4b94-bbf3-64c30e50f253-utilities" (OuterVolumeSpecName: "utilities") pod "109518fe-ba3b-4b94-bbf3-64c30e50f253" (UID: "109518fe-ba3b-4b94-bbf3-64c30e50f253"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:20:08 crc kubenswrapper[5025]: I1007 08:20:08.533892 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11ec0e09-2a87-483f-bdac-73b3f3995a8a-utilities" (OuterVolumeSpecName: "utilities") pod "11ec0e09-2a87-483f-bdac-73b3f3995a8a" (UID: "11ec0e09-2a87-483f-bdac-73b3f3995a8a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:20:08 crc kubenswrapper[5025]: I1007 08:20:08.541151 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11ec0e09-2a87-483f-bdac-73b3f3995a8a-kube-api-access-7nzdk" (OuterVolumeSpecName: "kube-api-access-7nzdk") pod "11ec0e09-2a87-483f-bdac-73b3f3995a8a" (UID: "11ec0e09-2a87-483f-bdac-73b3f3995a8a"). InnerVolumeSpecName "kube-api-access-7nzdk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:20:08 crc kubenswrapper[5025]: I1007 08:20:08.541223 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/109518fe-ba3b-4b94-bbf3-64c30e50f253-kube-api-access-rbr8b" (OuterVolumeSpecName: "kube-api-access-rbr8b") pod "109518fe-ba3b-4b94-bbf3-64c30e50f253" (UID: "109518fe-ba3b-4b94-bbf3-64c30e50f253"). InnerVolumeSpecName "kube-api-access-rbr8b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:20:08 crc kubenswrapper[5025]: I1007 08:20:08.585208 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/109518fe-ba3b-4b94-bbf3-64c30e50f253-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "109518fe-ba3b-4b94-bbf3-64c30e50f253" (UID: "109518fe-ba3b-4b94-bbf3-64c30e50f253"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:20:08 crc kubenswrapper[5025]: I1007 08:20:08.588856 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11ec0e09-2a87-483f-bdac-73b3f3995a8a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "11ec0e09-2a87-483f-bdac-73b3f3995a8a" (UID: "11ec0e09-2a87-483f-bdac-73b3f3995a8a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:20:08 crc kubenswrapper[5025]: I1007 08:20:08.634665 5025 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/109518fe-ba3b-4b94-bbf3-64c30e50f253-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 08:20:08 crc kubenswrapper[5025]: I1007 08:20:08.634716 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbr8b\" (UniqueName: \"kubernetes.io/projected/109518fe-ba3b-4b94-bbf3-64c30e50f253-kube-api-access-rbr8b\") on node \"crc\" DevicePath \"\"" Oct 07 08:20:08 crc kubenswrapper[5025]: I1007 08:20:08.634733 5025 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11ec0e09-2a87-483f-bdac-73b3f3995a8a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 08:20:08 crc kubenswrapper[5025]: I1007 08:20:08.634745 5025 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/109518fe-ba3b-4b94-bbf3-64c30e50f253-catalog-content\") on 
node \"crc\" DevicePath \"\"" Oct 07 08:20:08 crc kubenswrapper[5025]: I1007 08:20:08.634756 5025 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11ec0e09-2a87-483f-bdac-73b3f3995a8a-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 08:20:08 crc kubenswrapper[5025]: I1007 08:20:08.634767 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7nzdk\" (UniqueName: \"kubernetes.io/projected/11ec0e09-2a87-483f-bdac-73b3f3995a8a-kube-api-access-7nzdk\") on node \"crc\" DevicePath \"\"" Oct 07 08:20:08 crc kubenswrapper[5025]: I1007 08:20:08.750311 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6kl5d"] Oct 07 08:20:08 crc kubenswrapper[5025]: I1007 08:20:08.757197 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6kl5d"] Oct 07 08:20:09 crc kubenswrapper[5025]: I1007 08:20:09.454792 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z7c4f" event={"ID":"11ec0e09-2a87-483f-bdac-73b3f3995a8a","Type":"ContainerDied","Data":"466136cc9952b6baeecc7a17a0698c69f6afc681d69c493a56b8c6d29380ce5c"} Oct 07 08:20:09 crc kubenswrapper[5025]: I1007 08:20:09.454856 5025 scope.go:117] "RemoveContainer" containerID="d8ca55e8414d00dfb59899e7c9d67450777bf9c99112226f0d72698c1e50e287" Oct 07 08:20:09 crc kubenswrapper[5025]: I1007 08:20:09.454863 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-z7c4f" Oct 07 08:20:09 crc kubenswrapper[5025]: I1007 08:20:09.477191 5025 scope.go:117] "RemoveContainer" containerID="229700c9b4b788d7bcbafb5b240c08a2e9481b2d541c992ebb133d29afe990a9" Oct 07 08:20:09 crc kubenswrapper[5025]: I1007 08:20:09.491739 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-z7c4f"] Oct 07 08:20:09 crc kubenswrapper[5025]: I1007 08:20:09.496429 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-z7c4f"] Oct 07 08:20:09 crc kubenswrapper[5025]: I1007 08:20:09.512437 5025 scope.go:117] "RemoveContainer" containerID="10a5787ad42dff5c562ad72d022aea12a8033684ea8e2ae9a80ffda542069aaa" Oct 07 08:20:09 crc kubenswrapper[5025]: I1007 08:20:09.927324 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="109518fe-ba3b-4b94-bbf3-64c30e50f253" path="/var/lib/kubelet/pods/109518fe-ba3b-4b94-bbf3-64c30e50f253/volumes" Oct 07 08:20:09 crc kubenswrapper[5025]: I1007 08:20:09.928111 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11ec0e09-2a87-483f-bdac-73b3f3995a8a" path="/var/lib/kubelet/pods/11ec0e09-2a87-483f-bdac-73b3f3995a8a/volumes" Oct 07 08:20:09 crc kubenswrapper[5025]: I1007 08:20:09.929634 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-psjml"] Oct 07 08:20:09 crc kubenswrapper[5025]: I1007 08:20:09.930740 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-psjml" podUID="fc8dbf8c-119b-4723-9479-ddad0026dcbd" containerName="registry-server" containerID="cri-o://6631d7c142923026b32b7e2c192ee7696b607848f9841364218559237c32a47f" gracePeriod=2 Oct 07 08:20:10 crc kubenswrapper[5025]: I1007 08:20:10.291500 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-psjml" Oct 07 08:20:10 crc kubenswrapper[5025]: I1007 08:20:10.365534 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc8dbf8c-119b-4723-9479-ddad0026dcbd-catalog-content\") pod \"fc8dbf8c-119b-4723-9479-ddad0026dcbd\" (UID: \"fc8dbf8c-119b-4723-9479-ddad0026dcbd\") " Oct 07 08:20:10 crc kubenswrapper[5025]: I1007 08:20:10.365616 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sz9z4\" (UniqueName: \"kubernetes.io/projected/fc8dbf8c-119b-4723-9479-ddad0026dcbd-kube-api-access-sz9z4\") pod \"fc8dbf8c-119b-4723-9479-ddad0026dcbd\" (UID: \"fc8dbf8c-119b-4723-9479-ddad0026dcbd\") " Oct 07 08:20:10 crc kubenswrapper[5025]: I1007 08:20:10.365696 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc8dbf8c-119b-4723-9479-ddad0026dcbd-utilities\") pod \"fc8dbf8c-119b-4723-9479-ddad0026dcbd\" (UID: \"fc8dbf8c-119b-4723-9479-ddad0026dcbd\") " Oct 07 08:20:10 crc kubenswrapper[5025]: I1007 08:20:10.366704 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc8dbf8c-119b-4723-9479-ddad0026dcbd-utilities" (OuterVolumeSpecName: "utilities") pod "fc8dbf8c-119b-4723-9479-ddad0026dcbd" (UID: "fc8dbf8c-119b-4723-9479-ddad0026dcbd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:20:10 crc kubenswrapper[5025]: I1007 08:20:10.384655 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc8dbf8c-119b-4723-9479-ddad0026dcbd-kube-api-access-sz9z4" (OuterVolumeSpecName: "kube-api-access-sz9z4") pod "fc8dbf8c-119b-4723-9479-ddad0026dcbd" (UID: "fc8dbf8c-119b-4723-9479-ddad0026dcbd"). InnerVolumeSpecName "kube-api-access-sz9z4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:20:10 crc kubenswrapper[5025]: I1007 08:20:10.392737 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc8dbf8c-119b-4723-9479-ddad0026dcbd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fc8dbf8c-119b-4723-9479-ddad0026dcbd" (UID: "fc8dbf8c-119b-4723-9479-ddad0026dcbd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:20:10 crc kubenswrapper[5025]: I1007 08:20:10.463569 5025 generic.go:334] "Generic (PLEG): container finished" podID="fc8dbf8c-119b-4723-9479-ddad0026dcbd" containerID="6631d7c142923026b32b7e2c192ee7696b607848f9841364218559237c32a47f" exitCode=0 Oct 07 08:20:10 crc kubenswrapper[5025]: I1007 08:20:10.463688 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-psjml" Oct 07 08:20:10 crc kubenswrapper[5025]: I1007 08:20:10.463686 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-psjml" event={"ID":"fc8dbf8c-119b-4723-9479-ddad0026dcbd","Type":"ContainerDied","Data":"6631d7c142923026b32b7e2c192ee7696b607848f9841364218559237c32a47f"} Oct 07 08:20:10 crc kubenswrapper[5025]: I1007 08:20:10.463810 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-psjml" event={"ID":"fc8dbf8c-119b-4723-9479-ddad0026dcbd","Type":"ContainerDied","Data":"d5eb2ba28e7fcb190974ab9d41bd389cf9dd1805fec9bfd4881d5f139f974531"} Oct 07 08:20:10 crc kubenswrapper[5025]: I1007 08:20:10.463872 5025 scope.go:117] "RemoveContainer" containerID="6631d7c142923026b32b7e2c192ee7696b607848f9841364218559237c32a47f" Oct 07 08:20:10 crc kubenswrapper[5025]: I1007 08:20:10.467122 5025 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc8dbf8c-119b-4723-9479-ddad0026dcbd-utilities\") 
on node \"crc\" DevicePath \"\"" Oct 07 08:20:10 crc kubenswrapper[5025]: I1007 08:20:10.467161 5025 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc8dbf8c-119b-4723-9479-ddad0026dcbd-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 08:20:10 crc kubenswrapper[5025]: I1007 08:20:10.467180 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sz9z4\" (UniqueName: \"kubernetes.io/projected/fc8dbf8c-119b-4723-9479-ddad0026dcbd-kube-api-access-sz9z4\") on node \"crc\" DevicePath \"\"" Oct 07 08:20:10 crc kubenswrapper[5025]: I1007 08:20:10.481593 5025 scope.go:117] "RemoveContainer" containerID="74803f8b4ada0eed0fba54ca5e13d93664fd27100d5af177def181c2ae63f202" Oct 07 08:20:10 crc kubenswrapper[5025]: I1007 08:20:10.501185 5025 scope.go:117] "RemoveContainer" containerID="dee4266aa0428d88c1adce0cbd7a285a1a0dd74a263aa3dfa3fd1c44c323fd38" Oct 07 08:20:10 crc kubenswrapper[5025]: I1007 08:20:10.534144 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rjn8v"] Oct 07 08:20:10 crc kubenswrapper[5025]: I1007 08:20:10.534518 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rjn8v" podUID="fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab" containerName="registry-server" containerID="cri-o://0d376c90fe6afa24f786d87bfa854dbaff3ff7d7ff410c5b71ca22152640d7f9" gracePeriod=2 Oct 07 08:20:10 crc kubenswrapper[5025]: I1007 08:20:10.540432 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-psjml"] Oct 07 08:20:10 crc kubenswrapper[5025]: I1007 08:20:10.544754 5025 scope.go:117] "RemoveContainer" containerID="6631d7c142923026b32b7e2c192ee7696b607848f9841364218559237c32a47f" Oct 07 08:20:10 crc kubenswrapper[5025]: I1007 08:20:10.544799 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-psjml"] Oct 07 08:20:10 crc kubenswrapper[5025]: E1007 08:20:10.546368 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6631d7c142923026b32b7e2c192ee7696b607848f9841364218559237c32a47f\": container with ID starting with 6631d7c142923026b32b7e2c192ee7696b607848f9841364218559237c32a47f not found: ID does not exist" containerID="6631d7c142923026b32b7e2c192ee7696b607848f9841364218559237c32a47f" Oct 07 08:20:10 crc kubenswrapper[5025]: I1007 08:20:10.546424 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6631d7c142923026b32b7e2c192ee7696b607848f9841364218559237c32a47f"} err="failed to get container status \"6631d7c142923026b32b7e2c192ee7696b607848f9841364218559237c32a47f\": rpc error: code = NotFound desc = could not find container \"6631d7c142923026b32b7e2c192ee7696b607848f9841364218559237c32a47f\": container with ID starting with 6631d7c142923026b32b7e2c192ee7696b607848f9841364218559237c32a47f not found: ID does not exist" Oct 07 08:20:10 crc kubenswrapper[5025]: I1007 08:20:10.546462 5025 scope.go:117] "RemoveContainer" containerID="74803f8b4ada0eed0fba54ca5e13d93664fd27100d5af177def181c2ae63f202" Oct 07 08:20:10 crc kubenswrapper[5025]: E1007 08:20:10.546956 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74803f8b4ada0eed0fba54ca5e13d93664fd27100d5af177def181c2ae63f202\": container with ID starting with 74803f8b4ada0eed0fba54ca5e13d93664fd27100d5af177def181c2ae63f202 not found: ID does not exist" containerID="74803f8b4ada0eed0fba54ca5e13d93664fd27100d5af177def181c2ae63f202" Oct 07 08:20:10 crc kubenswrapper[5025]: I1007 08:20:10.546984 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74803f8b4ada0eed0fba54ca5e13d93664fd27100d5af177def181c2ae63f202"} 
err="failed to get container status \"74803f8b4ada0eed0fba54ca5e13d93664fd27100d5af177def181c2ae63f202\": rpc error: code = NotFound desc = could not find container \"74803f8b4ada0eed0fba54ca5e13d93664fd27100d5af177def181c2ae63f202\": container with ID starting with 74803f8b4ada0eed0fba54ca5e13d93664fd27100d5af177def181c2ae63f202 not found: ID does not exist" Oct 07 08:20:10 crc kubenswrapper[5025]: I1007 08:20:10.547002 5025 scope.go:117] "RemoveContainer" containerID="dee4266aa0428d88c1adce0cbd7a285a1a0dd74a263aa3dfa3fd1c44c323fd38" Oct 07 08:20:10 crc kubenswrapper[5025]: E1007 08:20:10.547604 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dee4266aa0428d88c1adce0cbd7a285a1a0dd74a263aa3dfa3fd1c44c323fd38\": container with ID starting with dee4266aa0428d88c1adce0cbd7a285a1a0dd74a263aa3dfa3fd1c44c323fd38 not found: ID does not exist" containerID="dee4266aa0428d88c1adce0cbd7a285a1a0dd74a263aa3dfa3fd1c44c323fd38" Oct 07 08:20:10 crc kubenswrapper[5025]: I1007 08:20:10.547655 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dee4266aa0428d88c1adce0cbd7a285a1a0dd74a263aa3dfa3fd1c44c323fd38"} err="failed to get container status \"dee4266aa0428d88c1adce0cbd7a285a1a0dd74a263aa3dfa3fd1c44c323fd38\": rpc error: code = NotFound desc = could not find container \"dee4266aa0428d88c1adce0cbd7a285a1a0dd74a263aa3dfa3fd1c44c323fd38\": container with ID starting with dee4266aa0428d88c1adce0cbd7a285a1a0dd74a263aa3dfa3fd1c44c323fd38 not found: ID does not exist" Oct 07 08:20:11 crc kubenswrapper[5025]: I1007 08:20:11.922588 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc8dbf8c-119b-4723-9479-ddad0026dcbd" path="/var/lib/kubelet/pods/fc8dbf8c-119b-4723-9479-ddad0026dcbd/volumes" Oct 07 08:20:12 crc kubenswrapper[5025]: I1007 08:20:12.486438 5025 generic.go:334] "Generic (PLEG): container finished" 
podID="fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab" containerID="0d376c90fe6afa24f786d87bfa854dbaff3ff7d7ff410c5b71ca22152640d7f9" exitCode=0 Oct 07 08:20:12 crc kubenswrapper[5025]: I1007 08:20:12.486508 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rjn8v" event={"ID":"fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab","Type":"ContainerDied","Data":"0d376c90fe6afa24f786d87bfa854dbaff3ff7d7ff410c5b71ca22152640d7f9"} Oct 07 08:20:12 crc kubenswrapper[5025]: I1007 08:20:12.849259 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rjn8v" Oct 07 08:20:12 crc kubenswrapper[5025]: I1007 08:20:12.897965 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pd2h6\" (UniqueName: \"kubernetes.io/projected/fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab-kube-api-access-pd2h6\") pod \"fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab\" (UID: \"fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab\") " Oct 07 08:20:12 crc kubenswrapper[5025]: I1007 08:20:12.898039 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab-catalog-content\") pod \"fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab\" (UID: \"fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab\") " Oct 07 08:20:12 crc kubenswrapper[5025]: I1007 08:20:12.898104 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab-utilities\") pod \"fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab\" (UID: \"fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab\") " Oct 07 08:20:12 crc kubenswrapper[5025]: I1007 08:20:12.899379 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab-utilities" (OuterVolumeSpecName: "utilities") pod 
"fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab" (UID: "fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:20:12 crc kubenswrapper[5025]: I1007 08:20:12.905478 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab-kube-api-access-pd2h6" (OuterVolumeSpecName: "kube-api-access-pd2h6") pod "fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab" (UID: "fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab"). InnerVolumeSpecName "kube-api-access-pd2h6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:20:12 crc kubenswrapper[5025]: I1007 08:20:12.999648 5025 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 08:20:13 crc kubenswrapper[5025]: I1007 08:20:12.999794 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pd2h6\" (UniqueName: \"kubernetes.io/projected/fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab-kube-api-access-pd2h6\") on node \"crc\" DevicePath \"\"" Oct 07 08:20:13 crc kubenswrapper[5025]: I1007 08:20:13.284758 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lp7dp" Oct 07 08:20:13 crc kubenswrapper[5025]: I1007 08:20:13.494343 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rjn8v" event={"ID":"fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab","Type":"ContainerDied","Data":"e712665d078f4246a597fa0cbd7683d696a31ee686f204f88390c3413caf77dd"} Oct 07 08:20:13 crc kubenswrapper[5025]: I1007 08:20:13.494400 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rjn8v" Oct 07 08:20:13 crc kubenswrapper[5025]: I1007 08:20:13.494408 5025 scope.go:117] "RemoveContainer" containerID="0d376c90fe6afa24f786d87bfa854dbaff3ff7d7ff410c5b71ca22152640d7f9" Oct 07 08:20:13 crc kubenswrapper[5025]: I1007 08:20:13.515085 5025 scope.go:117] "RemoveContainer" containerID="87c7c7a6eea9eb1545ee578977764b4c835af6cde0c0d6ca270ae4b2bd105d40" Oct 07 08:20:13 crc kubenswrapper[5025]: I1007 08:20:13.529218 5025 scope.go:117] "RemoveContainer" containerID="77e77e38a24e645c9bde00bdba06a79137ef7dee754f3fc3d940d40169333253" Oct 07 08:20:13 crc kubenswrapper[5025]: I1007 08:20:13.894908 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab" (UID: "fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:20:13 crc kubenswrapper[5025]: I1007 08:20:13.910509 5025 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 08:20:14 crc kubenswrapper[5025]: I1007 08:20:14.119847 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rjn8v"] Oct 07 08:20:14 crc kubenswrapper[5025]: I1007 08:20:14.123926 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rjn8v"] Oct 07 08:20:15 crc kubenswrapper[5025]: I1007 08:20:15.426076 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-szdzh" Oct 07 08:20:15 crc kubenswrapper[5025]: I1007 08:20:15.923269 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab" path="/var/lib/kubelet/pods/fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab/volumes" Oct 07 08:20:16 crc kubenswrapper[5025]: I1007 08:20:16.010161 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-hpfzj"] Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.053635 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-hpfzj" podUID="151c8409-e9e6-4a48-8e3d-661c0498cd86" containerName="oauth-openshift" containerID="cri-o://9a86a9b8d1b079ba9dd05c4cf27c3900ebcc98703fd5e261fddd62d623cc153c" gracePeriod=15 Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.508034 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-hpfzj" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.546351 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5477954dc8-nwrz7"] Oct 07 08:20:41 crc kubenswrapper[5025]: E1007 08:20:41.546703 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="109518fe-ba3b-4b94-bbf3-64c30e50f253" containerName="extract-utilities" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.546723 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="109518fe-ba3b-4b94-bbf3-64c30e50f253" containerName="extract-utilities" Oct 07 08:20:41 crc kubenswrapper[5025]: E1007 08:20:41.546737 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc8dbf8c-119b-4723-9479-ddad0026dcbd" containerName="registry-server" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.546746 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc8dbf8c-119b-4723-9479-ddad0026dcbd" containerName="registry-server" Oct 07 08:20:41 crc kubenswrapper[5025]: E1007 08:20:41.546761 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="151c8409-e9e6-4a48-8e3d-661c0498cd86" containerName="oauth-openshift" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.546771 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="151c8409-e9e6-4a48-8e3d-661c0498cd86" containerName="oauth-openshift" Oct 07 08:20:41 crc kubenswrapper[5025]: E1007 08:20:41.546787 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc8dbf8c-119b-4723-9479-ddad0026dcbd" containerName="extract-utilities" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.546797 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc8dbf8c-119b-4723-9479-ddad0026dcbd" containerName="extract-utilities" Oct 07 08:20:41 crc kubenswrapper[5025]: E1007 08:20:41.546807 5025 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="fc8dbf8c-119b-4723-9479-ddad0026dcbd" containerName="extract-content" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.546814 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc8dbf8c-119b-4723-9479-ddad0026dcbd" containerName="extract-content" Oct 07 08:20:41 crc kubenswrapper[5025]: E1007 08:20:41.546825 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab" containerName="registry-server" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.546833 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab" containerName="registry-server" Oct 07 08:20:41 crc kubenswrapper[5025]: E1007 08:20:41.546849 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11ec0e09-2a87-483f-bdac-73b3f3995a8a" containerName="extract-content" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.546856 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="11ec0e09-2a87-483f-bdac-73b3f3995a8a" containerName="extract-content" Oct 07 08:20:41 crc kubenswrapper[5025]: E1007 08:20:41.546869 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c60c5e80-ce8e-4756-bea7-17508cfd3434" containerName="pruner" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.546877 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="c60c5e80-ce8e-4756-bea7-17508cfd3434" containerName="pruner" Oct 07 08:20:41 crc kubenswrapper[5025]: E1007 08:20:41.546890 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11ec0e09-2a87-483f-bdac-73b3f3995a8a" containerName="extract-utilities" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.546904 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="11ec0e09-2a87-483f-bdac-73b3f3995a8a" containerName="extract-utilities" Oct 07 08:20:41 crc kubenswrapper[5025]: E1007 08:20:41.546935 5025 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="109518fe-ba3b-4b94-bbf3-64c30e50f253" containerName="extract-content" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.546944 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="109518fe-ba3b-4b94-bbf3-64c30e50f253" containerName="extract-content" Oct 07 08:20:41 crc kubenswrapper[5025]: E1007 08:20:41.546961 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab" containerName="extract-content" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.546969 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab" containerName="extract-content" Oct 07 08:20:41 crc kubenswrapper[5025]: E1007 08:20:41.546983 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab" containerName="extract-utilities" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.546992 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab" containerName="extract-utilities" Oct 07 08:20:41 crc kubenswrapper[5025]: E1007 08:20:41.547001 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11ec0e09-2a87-483f-bdac-73b3f3995a8a" containerName="registry-server" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.547009 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="11ec0e09-2a87-483f-bdac-73b3f3995a8a" containerName="registry-server" Oct 07 08:20:41 crc kubenswrapper[5025]: E1007 08:20:41.547020 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="109518fe-ba3b-4b94-bbf3-64c30e50f253" containerName="registry-server" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.547028 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="109518fe-ba3b-4b94-bbf3-64c30e50f253" containerName="registry-server" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.547171 5025 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="11ec0e09-2a87-483f-bdac-73b3f3995a8a" containerName="registry-server" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.547190 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="109518fe-ba3b-4b94-bbf3-64c30e50f253" containerName="registry-server" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.547202 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcf6dc54-0667-43c9-aa9b-214fd6f3c1ab" containerName="registry-server" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.547216 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc8dbf8c-119b-4723-9479-ddad0026dcbd" containerName="registry-server" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.547228 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="c60c5e80-ce8e-4756-bea7-17508cfd3434" containerName="pruner" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.547240 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="151c8409-e9e6-4a48-8e3d-661c0498cd86" containerName="oauth-openshift" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.547828 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-5477954dc8-nwrz7" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.554911 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5477954dc8-nwrz7"] Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.623219 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-system-cliconfig\") pod \"151c8409-e9e6-4a48-8e3d-661c0498cd86\" (UID: \"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.623336 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/151c8409-e9e6-4a48-8e3d-661c0498cd86-audit-policies\") pod \"151c8409-e9e6-4a48-8e3d-661c0498cd86\" (UID: \"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.623466 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-user-template-error\") pod \"151c8409-e9e6-4a48-8e3d-661c0498cd86\" (UID: \"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.623503 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-user-idp-0-file-data\") pod \"151c8409-e9e6-4a48-8e3d-661c0498cd86\" (UID: \"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.623525 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/151c8409-e9e6-4a48-8e3d-661c0498cd86-audit-dir\") pod \"151c8409-e9e6-4a48-8e3d-661c0498cd86\" (UID: \"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.623561 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-system-service-ca\") pod \"151c8409-e9e6-4a48-8e3d-661c0498cd86\" (UID: \"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.623593 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-system-router-certs\") pod \"151c8409-e9e6-4a48-8e3d-661c0498cd86\" (UID: \"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.623627 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-system-ocp-branding-template\") pod \"151c8409-e9e6-4a48-8e3d-661c0498cd86\" (UID: \"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.623710 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rsr4\" (UniqueName: \"kubernetes.io/projected/151c8409-e9e6-4a48-8e3d-661c0498cd86-kube-api-access-4rsr4\") pod \"151c8409-e9e6-4a48-8e3d-661c0498cd86\" (UID: \"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.623752 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-user-template-login\") pod \"151c8409-e9e6-4a48-8e3d-661c0498cd86\" (UID: \"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.623786 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-user-template-provider-selection\") pod \"151c8409-e9e6-4a48-8e3d-661c0498cd86\" (UID: \"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.623819 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-system-serving-cert\") pod \"151c8409-e9e6-4a48-8e3d-661c0498cd86\" (UID: \"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.623868 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-system-trusted-ca-bundle\") pod \"151c8409-e9e6-4a48-8e3d-661c0498cd86\" (UID: \"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.623943 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-system-session\") pod \"151c8409-e9e6-4a48-8e3d-661c0498cd86\" (UID: \"151c8409-e9e6-4a48-8e3d-661c0498cd86\") " Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.624171 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/0c3907d9-a9a2-4906-b88e-63b75ea485bf-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5477954dc8-nwrz7\" (UID: \"0c3907d9-a9a2-4906-b88e-63b75ea485bf\") " pod="openshift-authentication/oauth-openshift-5477954dc8-nwrz7" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.624217 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0c3907d9-a9a2-4906-b88e-63b75ea485bf-v4-0-config-system-session\") pod \"oauth-openshift-5477954dc8-nwrz7\" (UID: \"0c3907d9-a9a2-4906-b88e-63b75ea485bf\") " pod="openshift-authentication/oauth-openshift-5477954dc8-nwrz7" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.624250 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0c3907d9-a9a2-4906-b88e-63b75ea485bf-v4-0-config-user-template-login\") pod \"oauth-openshift-5477954dc8-nwrz7\" (UID: \"0c3907d9-a9a2-4906-b88e-63b75ea485bf\") " pod="openshift-authentication/oauth-openshift-5477954dc8-nwrz7" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.624269 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0c3907d9-a9a2-4906-b88e-63b75ea485bf-v4-0-config-user-template-error\") pod \"oauth-openshift-5477954dc8-nwrz7\" (UID: \"0c3907d9-a9a2-4906-b88e-63b75ea485bf\") " pod="openshift-authentication/oauth-openshift-5477954dc8-nwrz7" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.624434 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/151c8409-e9e6-4a48-8e3d-661c0498cd86-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "151c8409-e9e6-4a48-8e3d-661c0498cd86" (UID: "151c8409-e9e6-4a48-8e3d-661c0498cd86"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.624289 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0c3907d9-a9a2-4906-b88e-63b75ea485bf-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5477954dc8-nwrz7\" (UID: \"0c3907d9-a9a2-4906-b88e-63b75ea485bf\") " pod="openshift-authentication/oauth-openshift-5477954dc8-nwrz7" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.625103 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0c3907d9-a9a2-4906-b88e-63b75ea485bf-audit-policies\") pod \"oauth-openshift-5477954dc8-nwrz7\" (UID: \"0c3907d9-a9a2-4906-b88e-63b75ea485bf\") " pod="openshift-authentication/oauth-openshift-5477954dc8-nwrz7" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.625140 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0c3907d9-a9a2-4906-b88e-63b75ea485bf-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5477954dc8-nwrz7\" (UID: \"0c3907d9-a9a2-4906-b88e-63b75ea485bf\") " pod="openshift-authentication/oauth-openshift-5477954dc8-nwrz7" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.625128 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "151c8409-e9e6-4a48-8e3d-661c0498cd86" (UID: "151c8409-e9e6-4a48-8e3d-661c0498cd86"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.625170 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0c3907d9-a9a2-4906-b88e-63b75ea485bf-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5477954dc8-nwrz7\" (UID: \"0c3907d9-a9a2-4906-b88e-63b75ea485bf\") " pod="openshift-authentication/oauth-openshift-5477954dc8-nwrz7" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.625209 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0c3907d9-a9a2-4906-b88e-63b75ea485bf-audit-dir\") pod \"oauth-openshift-5477954dc8-nwrz7\" (UID: \"0c3907d9-a9a2-4906-b88e-63b75ea485bf\") " pod="openshift-authentication/oauth-openshift-5477954dc8-nwrz7" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.625233 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c3907d9-a9a2-4906-b88e-63b75ea485bf-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5477954dc8-nwrz7\" (UID: \"0c3907d9-a9a2-4906-b88e-63b75ea485bf\") " pod="openshift-authentication/oauth-openshift-5477954dc8-nwrz7" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.625263 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0c3907d9-a9a2-4906-b88e-63b75ea485bf-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5477954dc8-nwrz7\" (UID: \"0c3907d9-a9a2-4906-b88e-63b75ea485bf\") " pod="openshift-authentication/oauth-openshift-5477954dc8-nwrz7" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.625293 5025 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtr5k\" (UniqueName: \"kubernetes.io/projected/0c3907d9-a9a2-4906-b88e-63b75ea485bf-kube-api-access-gtr5k\") pod \"oauth-openshift-5477954dc8-nwrz7\" (UID: \"0c3907d9-a9a2-4906-b88e-63b75ea485bf\") " pod="openshift-authentication/oauth-openshift-5477954dc8-nwrz7" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.625323 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0c3907d9-a9a2-4906-b88e-63b75ea485bf-v4-0-config-system-service-ca\") pod \"oauth-openshift-5477954dc8-nwrz7\" (UID: \"0c3907d9-a9a2-4906-b88e-63b75ea485bf\") " pod="openshift-authentication/oauth-openshift-5477954dc8-nwrz7" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.625350 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0c3907d9-a9a2-4906-b88e-63b75ea485bf-v4-0-config-system-router-certs\") pod \"oauth-openshift-5477954dc8-nwrz7\" (UID: \"0c3907d9-a9a2-4906-b88e-63b75ea485bf\") " pod="openshift-authentication/oauth-openshift-5477954dc8-nwrz7" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.625400 5025 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.625414 5025 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/151c8409-e9e6-4a48-8e3d-661c0498cd86-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.625406 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "151c8409-e9e6-4a48-8e3d-661c0498cd86" (UID: "151c8409-e9e6-4a48-8e3d-661c0498cd86"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.625716 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/151c8409-e9e6-4a48-8e3d-661c0498cd86-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "151c8409-e9e6-4a48-8e3d-661c0498cd86" (UID: "151c8409-e9e6-4a48-8e3d-661c0498cd86"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.626070 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "151c8409-e9e6-4a48-8e3d-661c0498cd86" (UID: "151c8409-e9e6-4a48-8e3d-661c0498cd86"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.630488 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "151c8409-e9e6-4a48-8e3d-661c0498cd86" (UID: "151c8409-e9e6-4a48-8e3d-661c0498cd86"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.631881 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/151c8409-e9e6-4a48-8e3d-661c0498cd86-kube-api-access-4rsr4" (OuterVolumeSpecName: "kube-api-access-4rsr4") pod "151c8409-e9e6-4a48-8e3d-661c0498cd86" (UID: "151c8409-e9e6-4a48-8e3d-661c0498cd86"). InnerVolumeSpecName "kube-api-access-4rsr4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.631951 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "151c8409-e9e6-4a48-8e3d-661c0498cd86" (UID: "151c8409-e9e6-4a48-8e3d-661c0498cd86"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.632566 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "151c8409-e9e6-4a48-8e3d-661c0498cd86" (UID: "151c8409-e9e6-4a48-8e3d-661c0498cd86"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.633014 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "151c8409-e9e6-4a48-8e3d-661c0498cd86" (UID: "151c8409-e9e6-4a48-8e3d-661c0498cd86"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.633363 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "151c8409-e9e6-4a48-8e3d-661c0498cd86" (UID: "151c8409-e9e6-4a48-8e3d-661c0498cd86"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.633503 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "151c8409-e9e6-4a48-8e3d-661c0498cd86" (UID: "151c8409-e9e6-4a48-8e3d-661c0498cd86"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.633754 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "151c8409-e9e6-4a48-8e3d-661c0498cd86" (UID: "151c8409-e9e6-4a48-8e3d-661c0498cd86"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.634071 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "151c8409-e9e6-4a48-8e3d-661c0498cd86" (UID: "151c8409-e9e6-4a48-8e3d-661c0498cd86"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.694778 5025 generic.go:334] "Generic (PLEG): container finished" podID="151c8409-e9e6-4a48-8e3d-661c0498cd86" containerID="9a86a9b8d1b079ba9dd05c4cf27c3900ebcc98703fd5e261fddd62d623cc153c" exitCode=0 Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.694849 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-hpfzj" event={"ID":"151c8409-e9e6-4a48-8e3d-661c0498cd86","Type":"ContainerDied","Data":"9a86a9b8d1b079ba9dd05c4cf27c3900ebcc98703fd5e261fddd62d623cc153c"} Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.694878 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-hpfzj" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.694924 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-hpfzj" event={"ID":"151c8409-e9e6-4a48-8e3d-661c0498cd86","Type":"ContainerDied","Data":"e51ef67a7cc46a6413284aff020c11694db3291d6713669580258ce5692234da"} Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.694955 5025 scope.go:117] "RemoveContainer" containerID="9a86a9b8d1b079ba9dd05c4cf27c3900ebcc98703fd5e261fddd62d623cc153c" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.728022 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0c3907d9-a9a2-4906-b88e-63b75ea485bf-v4-0-config-system-session\") pod \"oauth-openshift-5477954dc8-nwrz7\" (UID: \"0c3907d9-a9a2-4906-b88e-63b75ea485bf\") " pod="openshift-authentication/oauth-openshift-5477954dc8-nwrz7" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.728624 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/0c3907d9-a9a2-4906-b88e-63b75ea485bf-v4-0-config-user-template-login\") pod \"oauth-openshift-5477954dc8-nwrz7\" (UID: \"0c3907d9-a9a2-4906-b88e-63b75ea485bf\") " pod="openshift-authentication/oauth-openshift-5477954dc8-nwrz7" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.728648 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0c3907d9-a9a2-4906-b88e-63b75ea485bf-v4-0-config-user-template-error\") pod \"oauth-openshift-5477954dc8-nwrz7\" (UID: \"0c3907d9-a9a2-4906-b88e-63b75ea485bf\") " pod="openshift-authentication/oauth-openshift-5477954dc8-nwrz7" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.728681 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0c3907d9-a9a2-4906-b88e-63b75ea485bf-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5477954dc8-nwrz7\" (UID: \"0c3907d9-a9a2-4906-b88e-63b75ea485bf\") " pod="openshift-authentication/oauth-openshift-5477954dc8-nwrz7" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.728708 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0c3907d9-a9a2-4906-b88e-63b75ea485bf-audit-policies\") pod \"oauth-openshift-5477954dc8-nwrz7\" (UID: \"0c3907d9-a9a2-4906-b88e-63b75ea485bf\") " pod="openshift-authentication/oauth-openshift-5477954dc8-nwrz7" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.728739 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0c3907d9-a9a2-4906-b88e-63b75ea485bf-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5477954dc8-nwrz7\" (UID: \"0c3907d9-a9a2-4906-b88e-63b75ea485bf\") " 
pod="openshift-authentication/oauth-openshift-5477954dc8-nwrz7" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.728763 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0c3907d9-a9a2-4906-b88e-63b75ea485bf-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5477954dc8-nwrz7\" (UID: \"0c3907d9-a9a2-4906-b88e-63b75ea485bf\") " pod="openshift-authentication/oauth-openshift-5477954dc8-nwrz7" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.728798 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0c3907d9-a9a2-4906-b88e-63b75ea485bf-audit-dir\") pod \"oauth-openshift-5477954dc8-nwrz7\" (UID: \"0c3907d9-a9a2-4906-b88e-63b75ea485bf\") " pod="openshift-authentication/oauth-openshift-5477954dc8-nwrz7" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.728822 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c3907d9-a9a2-4906-b88e-63b75ea485bf-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5477954dc8-nwrz7\" (UID: \"0c3907d9-a9a2-4906-b88e-63b75ea485bf\") " pod="openshift-authentication/oauth-openshift-5477954dc8-nwrz7" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.728846 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0c3907d9-a9a2-4906-b88e-63b75ea485bf-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5477954dc8-nwrz7\" (UID: \"0c3907d9-a9a2-4906-b88e-63b75ea485bf\") " pod="openshift-authentication/oauth-openshift-5477954dc8-nwrz7" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.728872 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtr5k\" 
(UniqueName: \"kubernetes.io/projected/0c3907d9-a9a2-4906-b88e-63b75ea485bf-kube-api-access-gtr5k\") pod \"oauth-openshift-5477954dc8-nwrz7\" (UID: \"0c3907d9-a9a2-4906-b88e-63b75ea485bf\") " pod="openshift-authentication/oauth-openshift-5477954dc8-nwrz7" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.728892 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0c3907d9-a9a2-4906-b88e-63b75ea485bf-v4-0-config-system-service-ca\") pod \"oauth-openshift-5477954dc8-nwrz7\" (UID: \"0c3907d9-a9a2-4906-b88e-63b75ea485bf\") " pod="openshift-authentication/oauth-openshift-5477954dc8-nwrz7" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.728923 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0c3907d9-a9a2-4906-b88e-63b75ea485bf-v4-0-config-system-router-certs\") pod \"oauth-openshift-5477954dc8-nwrz7\" (UID: \"0c3907d9-a9a2-4906-b88e-63b75ea485bf\") " pod="openshift-authentication/oauth-openshift-5477954dc8-nwrz7" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.728971 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0c3907d9-a9a2-4906-b88e-63b75ea485bf-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5477954dc8-nwrz7\" (UID: \"0c3907d9-a9a2-4906-b88e-63b75ea485bf\") " pod="openshift-authentication/oauth-openshift-5477954dc8-nwrz7" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.729026 5025 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.729042 5025 reconciler_common.go:293] "Volume detached for 
volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.729055 5025 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.729069 5025 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.729084 5025 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.729097 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rsr4\" (UniqueName: \"kubernetes.io/projected/151c8409-e9e6-4a48-8e3d-661c0498cd86-kube-api-access-4rsr4\") on node \"crc\" DevicePath \"\"" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.729110 5025 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.729126 5025 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath 
\"\"" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.729140 5025 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.729153 5025 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.729169 5025 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/151c8409-e9e6-4a48-8e3d-661c0498cd86-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.729183 5025 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/151c8409-e9e6-4a48-8e3d-661c0498cd86-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.730358 5025 scope.go:117] "RemoveContainer" containerID="9a86a9b8d1b079ba9dd05c4cf27c3900ebcc98703fd5e261fddd62d623cc153c" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.733499 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0c3907d9-a9a2-4906-b88e-63b75ea485bf-audit-dir\") pod \"oauth-openshift-5477954dc8-nwrz7\" (UID: \"0c3907d9-a9a2-4906-b88e-63b75ea485bf\") " pod="openshift-authentication/oauth-openshift-5477954dc8-nwrz7" Oct 07 08:20:41 crc kubenswrapper[5025]: E1007 08:20:41.733501 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"9a86a9b8d1b079ba9dd05c4cf27c3900ebcc98703fd5e261fddd62d623cc153c\": container with ID starting with 9a86a9b8d1b079ba9dd05c4cf27c3900ebcc98703fd5e261fddd62d623cc153c not found: ID does not exist" containerID="9a86a9b8d1b079ba9dd05c4cf27c3900ebcc98703fd5e261fddd62d623cc153c" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.733644 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a86a9b8d1b079ba9dd05c4cf27c3900ebcc98703fd5e261fddd62d623cc153c"} err="failed to get container status \"9a86a9b8d1b079ba9dd05c4cf27c3900ebcc98703fd5e261fddd62d623cc153c\": rpc error: code = NotFound desc = could not find container \"9a86a9b8d1b079ba9dd05c4cf27c3900ebcc98703fd5e261fddd62d623cc153c\": container with ID starting with 9a86a9b8d1b079ba9dd05c4cf27c3900ebcc98703fd5e261fddd62d623cc153c not found: ID does not exist" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.734202 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0c3907d9-a9a2-4906-b88e-63b75ea485bf-audit-policies\") pod \"oauth-openshift-5477954dc8-nwrz7\" (UID: \"0c3907d9-a9a2-4906-b88e-63b75ea485bf\") " pod="openshift-authentication/oauth-openshift-5477954dc8-nwrz7" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.734659 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0c3907d9-a9a2-4906-b88e-63b75ea485bf-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5477954dc8-nwrz7\" (UID: \"0c3907d9-a9a2-4906-b88e-63b75ea485bf\") " pod="openshift-authentication/oauth-openshift-5477954dc8-nwrz7" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.734959 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c3907d9-a9a2-4906-b88e-63b75ea485bf-v4-0-config-system-trusted-ca-bundle\") 
pod \"oauth-openshift-5477954dc8-nwrz7\" (UID: \"0c3907d9-a9a2-4906-b88e-63b75ea485bf\") " pod="openshift-authentication/oauth-openshift-5477954dc8-nwrz7" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.735447 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0c3907d9-a9a2-4906-b88e-63b75ea485bf-v4-0-config-system-service-ca\") pod \"oauth-openshift-5477954dc8-nwrz7\" (UID: \"0c3907d9-a9a2-4906-b88e-63b75ea485bf\") " pod="openshift-authentication/oauth-openshift-5477954dc8-nwrz7" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.735992 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0c3907d9-a9a2-4906-b88e-63b75ea485bf-v4-0-config-system-session\") pod \"oauth-openshift-5477954dc8-nwrz7\" (UID: \"0c3907d9-a9a2-4906-b88e-63b75ea485bf\") " pod="openshift-authentication/oauth-openshift-5477954dc8-nwrz7" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.736096 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0c3907d9-a9a2-4906-b88e-63b75ea485bf-v4-0-config-user-template-error\") pod \"oauth-openshift-5477954dc8-nwrz7\" (UID: \"0c3907d9-a9a2-4906-b88e-63b75ea485bf\") " pod="openshift-authentication/oauth-openshift-5477954dc8-nwrz7" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.736114 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0c3907d9-a9a2-4906-b88e-63b75ea485bf-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5477954dc8-nwrz7\" (UID: \"0c3907d9-a9a2-4906-b88e-63b75ea485bf\") " pod="openshift-authentication/oauth-openshift-5477954dc8-nwrz7" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.736153 5025 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0c3907d9-a9a2-4906-b88e-63b75ea485bf-v4-0-config-user-template-login\") pod \"oauth-openshift-5477954dc8-nwrz7\" (UID: \"0c3907d9-a9a2-4906-b88e-63b75ea485bf\") " pod="openshift-authentication/oauth-openshift-5477954dc8-nwrz7" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.737415 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0c3907d9-a9a2-4906-b88e-63b75ea485bf-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5477954dc8-nwrz7\" (UID: \"0c3907d9-a9a2-4906-b88e-63b75ea485bf\") " pod="openshift-authentication/oauth-openshift-5477954dc8-nwrz7" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.737593 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-hpfzj"] Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.738028 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0c3907d9-a9a2-4906-b88e-63b75ea485bf-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5477954dc8-nwrz7\" (UID: \"0c3907d9-a9a2-4906-b88e-63b75ea485bf\") " pod="openshift-authentication/oauth-openshift-5477954dc8-nwrz7" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.738827 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0c3907d9-a9a2-4906-b88e-63b75ea485bf-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5477954dc8-nwrz7\" (UID: \"0c3907d9-a9a2-4906-b88e-63b75ea485bf\") " pod="openshift-authentication/oauth-openshift-5477954dc8-nwrz7" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.740255 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0c3907d9-a9a2-4906-b88e-63b75ea485bf-v4-0-config-system-router-certs\") pod \"oauth-openshift-5477954dc8-nwrz7\" (UID: \"0c3907d9-a9a2-4906-b88e-63b75ea485bf\") " pod="openshift-authentication/oauth-openshift-5477954dc8-nwrz7" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.741130 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-hpfzj"] Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.750593 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtr5k\" (UniqueName: \"kubernetes.io/projected/0c3907d9-a9a2-4906-b88e-63b75ea485bf-kube-api-access-gtr5k\") pod \"oauth-openshift-5477954dc8-nwrz7\" (UID: \"0c3907d9-a9a2-4906-b88e-63b75ea485bf\") " pod="openshift-authentication/oauth-openshift-5477954dc8-nwrz7" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.872993 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-5477954dc8-nwrz7" Oct 07 08:20:41 crc kubenswrapper[5025]: I1007 08:20:41.926299 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="151c8409-e9e6-4a48-8e3d-661c0498cd86" path="/var/lib/kubelet/pods/151c8409-e9e6-4a48-8e3d-661c0498cd86/volumes" Oct 07 08:20:42 crc kubenswrapper[5025]: I1007 08:20:42.173511 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5477954dc8-nwrz7"] Oct 07 08:20:42 crc kubenswrapper[5025]: I1007 08:20:42.703681 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5477954dc8-nwrz7" event={"ID":"0c3907d9-a9a2-4906-b88e-63b75ea485bf","Type":"ContainerStarted","Data":"d9991fc539eb627c883618973d4ffd734bb483064802eb58db09802b8128fa95"} Oct 07 08:20:42 crc kubenswrapper[5025]: I1007 08:20:42.703772 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5477954dc8-nwrz7" event={"ID":"0c3907d9-a9a2-4906-b88e-63b75ea485bf","Type":"ContainerStarted","Data":"f9bceccc258114133e3f402e100a0b87571cf2099171517f3d96f7722354dd83"} Oct 07 08:20:42 crc kubenswrapper[5025]: I1007 08:20:42.703893 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5477954dc8-nwrz7" Oct 07 08:20:42 crc kubenswrapper[5025]: I1007 08:20:42.728954 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5477954dc8-nwrz7" podStartSLOduration=26.728927272 podStartE2EDuration="26.728927272s" podCreationTimestamp="2025-10-07 08:20:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:20:42.727605674 +0000 UTC m=+249.536919828" watchObservedRunningTime="2025-10-07 08:20:42.728927272 +0000 UTC m=+249.538241416" Oct 07 08:20:42 crc 
kubenswrapper[5025]: I1007 08:20:42.861189 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5477954dc8-nwrz7" Oct 07 08:21:03 crc kubenswrapper[5025]: I1007 08:21:03.603690 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wx6rt"] Oct 07 08:21:03 crc kubenswrapper[5025]: I1007 08:21:03.604588 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wx6rt" podUID="d30db05d-5f5e-4586-92de-f877c5a64600" containerName="registry-server" containerID="cri-o://09060a5fe7d26093e99ed6c54f73bc29e81d5b20322e249ab7e38ba89243c0ef" gracePeriod=30 Oct 07 08:21:03 crc kubenswrapper[5025]: I1007 08:21:03.617244 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lp7dp"] Oct 07 08:21:03 crc kubenswrapper[5025]: I1007 08:21:03.623817 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rvr8c"] Oct 07 08:21:03 crc kubenswrapper[5025]: I1007 08:21:03.624286 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-rvr8c" podUID="b3110e75-2479-4c2b-b96b-0f41a1f10cec" containerName="marketplace-operator" containerID="cri-o://71a91b22bfc8efb92eab949a6853e02f3737245d3797041ebea82d509f4b7c45" gracePeriod=30 Oct 07 08:21:03 crc kubenswrapper[5025]: I1007 08:21:03.627059 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lp7dp" podUID="77520001-a1f9-4018-bc4c-2964849da6c7" containerName="registry-server" containerID="cri-o://77dc24d564332e2b13196ea8aa063fec0d7401bd2bd892370631df760cb89e7e" gracePeriod=30 Oct 07 08:21:03 crc kubenswrapper[5025]: I1007 08:21:03.639245 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-szdzh"] Oct 07 08:21:03 crc kubenswrapper[5025]: I1007 08:21:03.639561 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-szdzh" podUID="79dcab45-e91e-409e-97a7-9cbe6c4e3fe7" containerName="registry-server" containerID="cri-o://0cfd87fd11d88961a1837ffe88d4c7726d618dab58ff9eea21266a17382572f2" gracePeriod=30 Oct 07 08:21:03 crc kubenswrapper[5025]: I1007 08:21:03.644720 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dmcm5"] Oct 07 08:21:03 crc kubenswrapper[5025]: I1007 08:21:03.645080 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dmcm5" podUID="5e0afdb0-5693-4263-8c52-9cccc4878003" containerName="registry-server" containerID="cri-o://6f90777253ef80e24d5bd4bcf6bbb99872d316a8983be60f39b710a3a9558443" gracePeriod=30 Oct 07 08:21:03 crc kubenswrapper[5025]: I1007 08:21:03.656399 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-d4bw5"] Oct 07 08:21:03 crc kubenswrapper[5025]: I1007 08:21:03.657336 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-d4bw5" Oct 07 08:21:03 crc kubenswrapper[5025]: I1007 08:21:03.673718 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-d4bw5"] Oct 07 08:21:03 crc kubenswrapper[5025]: I1007 08:21:03.779668 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxrj2\" (UniqueName: \"kubernetes.io/projected/d57da756-f579-4d68-b775-8788fad75582-kube-api-access-pxrj2\") pod \"marketplace-operator-79b997595-d4bw5\" (UID: \"d57da756-f579-4d68-b775-8788fad75582\") " pod="openshift-marketplace/marketplace-operator-79b997595-d4bw5" Oct 07 08:21:03 crc kubenswrapper[5025]: I1007 08:21:03.779857 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d57da756-f579-4d68-b775-8788fad75582-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-d4bw5\" (UID: \"d57da756-f579-4d68-b775-8788fad75582\") " pod="openshift-marketplace/marketplace-operator-79b997595-d4bw5" Oct 07 08:21:03 crc kubenswrapper[5025]: I1007 08:21:03.780431 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d57da756-f579-4d68-b775-8788fad75582-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-d4bw5\" (UID: \"d57da756-f579-4d68-b775-8788fad75582\") " pod="openshift-marketplace/marketplace-operator-79b997595-d4bw5" Oct 07 08:21:03 crc kubenswrapper[5025]: I1007 08:21:03.862996 5025 generic.go:334] "Generic (PLEG): container finished" podID="d30db05d-5f5e-4586-92de-f877c5a64600" containerID="09060a5fe7d26093e99ed6c54f73bc29e81d5b20322e249ab7e38ba89243c0ef" exitCode=0 Oct 07 08:21:03 crc kubenswrapper[5025]: I1007 08:21:03.863158 5025 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-wx6rt" event={"ID":"d30db05d-5f5e-4586-92de-f877c5a64600","Type":"ContainerDied","Data":"09060a5fe7d26093e99ed6c54f73bc29e81d5b20322e249ab7e38ba89243c0ef"} Oct 07 08:21:03 crc kubenswrapper[5025]: I1007 08:21:03.866412 5025 generic.go:334] "Generic (PLEG): container finished" podID="5e0afdb0-5693-4263-8c52-9cccc4878003" containerID="6f90777253ef80e24d5bd4bcf6bbb99872d316a8983be60f39b710a3a9558443" exitCode=0 Oct 07 08:21:03 crc kubenswrapper[5025]: I1007 08:21:03.866460 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dmcm5" event={"ID":"5e0afdb0-5693-4263-8c52-9cccc4878003","Type":"ContainerDied","Data":"6f90777253ef80e24d5bd4bcf6bbb99872d316a8983be60f39b710a3a9558443"} Oct 07 08:21:03 crc kubenswrapper[5025]: I1007 08:21:03.868924 5025 generic.go:334] "Generic (PLEG): container finished" podID="77520001-a1f9-4018-bc4c-2964849da6c7" containerID="77dc24d564332e2b13196ea8aa063fec0d7401bd2bd892370631df760cb89e7e" exitCode=0 Oct 07 08:21:03 crc kubenswrapper[5025]: I1007 08:21:03.868993 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lp7dp" event={"ID":"77520001-a1f9-4018-bc4c-2964849da6c7","Type":"ContainerDied","Data":"77dc24d564332e2b13196ea8aa063fec0d7401bd2bd892370631df760cb89e7e"} Oct 07 08:21:03 crc kubenswrapper[5025]: I1007 08:21:03.872316 5025 generic.go:334] "Generic (PLEG): container finished" podID="79dcab45-e91e-409e-97a7-9cbe6c4e3fe7" containerID="0cfd87fd11d88961a1837ffe88d4c7726d618dab58ff9eea21266a17382572f2" exitCode=0 Oct 07 08:21:03 crc kubenswrapper[5025]: I1007 08:21:03.872381 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-szdzh" event={"ID":"79dcab45-e91e-409e-97a7-9cbe6c4e3fe7","Type":"ContainerDied","Data":"0cfd87fd11d88961a1837ffe88d4c7726d618dab58ff9eea21266a17382572f2"} Oct 07 08:21:03 crc kubenswrapper[5025]: I1007 
08:21:03.873765 5025 generic.go:334] "Generic (PLEG): container finished" podID="b3110e75-2479-4c2b-b96b-0f41a1f10cec" containerID="71a91b22bfc8efb92eab949a6853e02f3737245d3797041ebea82d509f4b7c45" exitCode=0 Oct 07 08:21:03 crc kubenswrapper[5025]: I1007 08:21:03.873793 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rvr8c" event={"ID":"b3110e75-2479-4c2b-b96b-0f41a1f10cec","Type":"ContainerDied","Data":"71a91b22bfc8efb92eab949a6853e02f3737245d3797041ebea82d509f4b7c45"} Oct 07 08:21:03 crc kubenswrapper[5025]: I1007 08:21:03.883593 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d57da756-f579-4d68-b775-8788fad75582-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-d4bw5\" (UID: \"d57da756-f579-4d68-b775-8788fad75582\") " pod="openshift-marketplace/marketplace-operator-79b997595-d4bw5" Oct 07 08:21:03 crc kubenswrapper[5025]: I1007 08:21:03.883639 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d57da756-f579-4d68-b775-8788fad75582-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-d4bw5\" (UID: \"d57da756-f579-4d68-b775-8788fad75582\") " pod="openshift-marketplace/marketplace-operator-79b997595-d4bw5" Oct 07 08:21:03 crc kubenswrapper[5025]: I1007 08:21:03.883703 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxrj2\" (UniqueName: \"kubernetes.io/projected/d57da756-f579-4d68-b775-8788fad75582-kube-api-access-pxrj2\") pod \"marketplace-operator-79b997595-d4bw5\" (UID: \"d57da756-f579-4d68-b775-8788fad75582\") " pod="openshift-marketplace/marketplace-operator-79b997595-d4bw5" Oct 07 08:21:03 crc kubenswrapper[5025]: I1007 08:21:03.885730 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d57da756-f579-4d68-b775-8788fad75582-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-d4bw5\" (UID: \"d57da756-f579-4d68-b775-8788fad75582\") " pod="openshift-marketplace/marketplace-operator-79b997595-d4bw5" Oct 07 08:21:03 crc kubenswrapper[5025]: I1007 08:21:03.891381 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d57da756-f579-4d68-b775-8788fad75582-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-d4bw5\" (UID: \"d57da756-f579-4d68-b775-8788fad75582\") " pod="openshift-marketplace/marketplace-operator-79b997595-d4bw5" Oct 07 08:21:03 crc kubenswrapper[5025]: I1007 08:21:03.903978 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxrj2\" (UniqueName: \"kubernetes.io/projected/d57da756-f579-4d68-b775-8788fad75582-kube-api-access-pxrj2\") pod \"marketplace-operator-79b997595-d4bw5\" (UID: \"d57da756-f579-4d68-b775-8788fad75582\") " pod="openshift-marketplace/marketplace-operator-79b997595-d4bw5" Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.045109 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-d4bw5" Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.056435 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rvr8c" Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.056692 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wx6rt" Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.156133 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dmcm5" Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.160150 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-szdzh" Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.176612 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lp7dp" Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.188979 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d30db05d-5f5e-4586-92de-f877c5a64600-utilities\") pod \"d30db05d-5f5e-4586-92de-f877c5a64600\" (UID: \"d30db05d-5f5e-4586-92de-f877c5a64600\") " Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.189039 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gm4wb\" (UniqueName: \"kubernetes.io/projected/b3110e75-2479-4c2b-b96b-0f41a1f10cec-kube-api-access-gm4wb\") pod \"b3110e75-2479-4c2b-b96b-0f41a1f10cec\" (UID: \"b3110e75-2479-4c2b-b96b-0f41a1f10cec\") " Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.189057 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d30db05d-5f5e-4586-92de-f877c5a64600-catalog-content\") pod \"d30db05d-5f5e-4586-92de-f877c5a64600\" (UID: \"d30db05d-5f5e-4586-92de-f877c5a64600\") " Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.189080 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b3110e75-2479-4c2b-b96b-0f41a1f10cec-marketplace-trusted-ca\") pod \"b3110e75-2479-4c2b-b96b-0f41a1f10cec\" (UID: \"b3110e75-2479-4c2b-b96b-0f41a1f10cec\") " Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.189126 5025 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fn2n7\" (UniqueName: \"kubernetes.io/projected/d30db05d-5f5e-4586-92de-f877c5a64600-kube-api-access-fn2n7\") pod \"d30db05d-5f5e-4586-92de-f877c5a64600\" (UID: \"d30db05d-5f5e-4586-92de-f877c5a64600\") " Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.189175 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b3110e75-2479-4c2b-b96b-0f41a1f10cec-marketplace-operator-metrics\") pod \"b3110e75-2479-4c2b-b96b-0f41a1f10cec\" (UID: \"b3110e75-2479-4c2b-b96b-0f41a1f10cec\") " Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.190390 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3110e75-2479-4c2b-b96b-0f41a1f10cec-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b3110e75-2479-4c2b-b96b-0f41a1f10cec" (UID: "b3110e75-2479-4c2b-b96b-0f41a1f10cec"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.191390 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d30db05d-5f5e-4586-92de-f877c5a64600-utilities" (OuterVolumeSpecName: "utilities") pod "d30db05d-5f5e-4586-92de-f877c5a64600" (UID: "d30db05d-5f5e-4586-92de-f877c5a64600"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.196745 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3110e75-2479-4c2b-b96b-0f41a1f10cec-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b3110e75-2479-4c2b-b96b-0f41a1f10cec" (UID: "b3110e75-2479-4c2b-b96b-0f41a1f10cec"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.201623 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3110e75-2479-4c2b-b96b-0f41a1f10cec-kube-api-access-gm4wb" (OuterVolumeSpecName: "kube-api-access-gm4wb") pod "b3110e75-2479-4c2b-b96b-0f41a1f10cec" (UID: "b3110e75-2479-4c2b-b96b-0f41a1f10cec"). InnerVolumeSpecName "kube-api-access-gm4wb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.204728 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d30db05d-5f5e-4586-92de-f877c5a64600-kube-api-access-fn2n7" (OuterVolumeSpecName: "kube-api-access-fn2n7") pod "d30db05d-5f5e-4586-92de-f877c5a64600" (UID: "d30db05d-5f5e-4586-92de-f877c5a64600"). InnerVolumeSpecName "kube-api-access-fn2n7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.290624 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7d2lk\" (UniqueName: \"kubernetes.io/projected/79dcab45-e91e-409e-97a7-9cbe6c4e3fe7-kube-api-access-7d2lk\") pod \"79dcab45-e91e-409e-97a7-9cbe6c4e3fe7\" (UID: \"79dcab45-e91e-409e-97a7-9cbe6c4e3fe7\") " Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.291297 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhd5d\" (UniqueName: \"kubernetes.io/projected/5e0afdb0-5693-4263-8c52-9cccc4878003-kube-api-access-lhd5d\") pod \"5e0afdb0-5693-4263-8c52-9cccc4878003\" (UID: \"5e0afdb0-5693-4263-8c52-9cccc4878003\") " Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.291443 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79dcab45-e91e-409e-97a7-9cbe6c4e3fe7-catalog-content\") pod 
\"79dcab45-e91e-409e-97a7-9cbe6c4e3fe7\" (UID: \"79dcab45-e91e-409e-97a7-9cbe6c4e3fe7\") " Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.291762 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77520001-a1f9-4018-bc4c-2964849da6c7-utilities\") pod \"77520001-a1f9-4018-bc4c-2964849da6c7\" (UID: \"77520001-a1f9-4018-bc4c-2964849da6c7\") " Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.291961 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e0afdb0-5693-4263-8c52-9cccc4878003-utilities\") pod \"5e0afdb0-5693-4263-8c52-9cccc4878003\" (UID: \"5e0afdb0-5693-4263-8c52-9cccc4878003\") " Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.292047 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e0afdb0-5693-4263-8c52-9cccc4878003-catalog-content\") pod \"5e0afdb0-5693-4263-8c52-9cccc4878003\" (UID: \"5e0afdb0-5693-4263-8c52-9cccc4878003\") " Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.292129 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjbf8\" (UniqueName: \"kubernetes.io/projected/77520001-a1f9-4018-bc4c-2964849da6c7-kube-api-access-qjbf8\") pod \"77520001-a1f9-4018-bc4c-2964849da6c7\" (UID: \"77520001-a1f9-4018-bc4c-2964849da6c7\") " Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.292241 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79dcab45-e91e-409e-97a7-9cbe6c4e3fe7-utilities\") pod \"79dcab45-e91e-409e-97a7-9cbe6c4e3fe7\" (UID: \"79dcab45-e91e-409e-97a7-9cbe6c4e3fe7\") " Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.292331 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77520001-a1f9-4018-bc4c-2964849da6c7-catalog-content\") pod \"77520001-a1f9-4018-bc4c-2964849da6c7\" (UID: \"77520001-a1f9-4018-bc4c-2964849da6c7\") " Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.292671 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gm4wb\" (UniqueName: \"kubernetes.io/projected/b3110e75-2479-4c2b-b96b-0f41a1f10cec-kube-api-access-gm4wb\") on node \"crc\" DevicePath \"\"" Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.292754 5025 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b3110e75-2479-4c2b-b96b-0f41a1f10cec-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.292838 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fn2n7\" (UniqueName: \"kubernetes.io/projected/d30db05d-5f5e-4586-92de-f877c5a64600-kube-api-access-fn2n7\") on node \"crc\" DevicePath \"\"" Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.292897 5025 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b3110e75-2479-4c2b-b96b-0f41a1f10cec-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.292961 5025 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d30db05d-5f5e-4586-92de-f877c5a64600-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.294183 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d30db05d-5f5e-4586-92de-f877c5a64600-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d30db05d-5f5e-4586-92de-f877c5a64600" (UID: "d30db05d-5f5e-4586-92de-f877c5a64600"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.299349 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79dcab45-e91e-409e-97a7-9cbe6c4e3fe7-kube-api-access-7d2lk" (OuterVolumeSpecName: "kube-api-access-7d2lk") pod "79dcab45-e91e-409e-97a7-9cbe6c4e3fe7" (UID: "79dcab45-e91e-409e-97a7-9cbe6c4e3fe7"). InnerVolumeSpecName "kube-api-access-7d2lk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.300276 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79dcab45-e91e-409e-97a7-9cbe6c4e3fe7-utilities" (OuterVolumeSpecName: "utilities") pod "79dcab45-e91e-409e-97a7-9cbe6c4e3fe7" (UID: "79dcab45-e91e-409e-97a7-9cbe6c4e3fe7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.301112 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77520001-a1f9-4018-bc4c-2964849da6c7-utilities" (OuterVolumeSpecName: "utilities") pod "77520001-a1f9-4018-bc4c-2964849da6c7" (UID: "77520001-a1f9-4018-bc4c-2964849da6c7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.318379 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e0afdb0-5693-4263-8c52-9cccc4878003-utilities" (OuterVolumeSpecName: "utilities") pod "5e0afdb0-5693-4263-8c52-9cccc4878003" (UID: "5e0afdb0-5693-4263-8c52-9cccc4878003"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.320567 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77520001-a1f9-4018-bc4c-2964849da6c7-kube-api-access-qjbf8" (OuterVolumeSpecName: "kube-api-access-qjbf8") pod "77520001-a1f9-4018-bc4c-2964849da6c7" (UID: "77520001-a1f9-4018-bc4c-2964849da6c7"). InnerVolumeSpecName "kube-api-access-qjbf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.322588 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e0afdb0-5693-4263-8c52-9cccc4878003-kube-api-access-lhd5d" (OuterVolumeSpecName: "kube-api-access-lhd5d") pod "5e0afdb0-5693-4263-8c52-9cccc4878003" (UID: "5e0afdb0-5693-4263-8c52-9cccc4878003"). InnerVolumeSpecName "kube-api-access-lhd5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.324756 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79dcab45-e91e-409e-97a7-9cbe6c4e3fe7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "79dcab45-e91e-409e-97a7-9cbe6c4e3fe7" (UID: "79dcab45-e91e-409e-97a7-9cbe6c4e3fe7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.363714 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77520001-a1f9-4018-bc4c-2964849da6c7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "77520001-a1f9-4018-bc4c-2964849da6c7" (UID: "77520001-a1f9-4018-bc4c-2964849da6c7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.396235 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7d2lk\" (UniqueName: \"kubernetes.io/projected/79dcab45-e91e-409e-97a7-9cbe6c4e3fe7-kube-api-access-7d2lk\") on node \"crc\" DevicePath \"\"" Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.396556 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhd5d\" (UniqueName: \"kubernetes.io/projected/5e0afdb0-5693-4263-8c52-9cccc4878003-kube-api-access-lhd5d\") on node \"crc\" DevicePath \"\"" Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.396622 5025 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79dcab45-e91e-409e-97a7-9cbe6c4e3fe7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.396693 5025 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77520001-a1f9-4018-bc4c-2964849da6c7-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.396750 5025 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d30db05d-5f5e-4586-92de-f877c5a64600-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.396813 5025 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e0afdb0-5693-4263-8c52-9cccc4878003-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.396872 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjbf8\" (UniqueName: \"kubernetes.io/projected/77520001-a1f9-4018-bc4c-2964849da6c7-kube-api-access-qjbf8\") on node \"crc\" DevicePath \"\"" Oct 07 08:21:04 crc kubenswrapper[5025]: 
I1007 08:21:04.396926 5025 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79dcab45-e91e-409e-97a7-9cbe6c4e3fe7-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.396977 5025 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77520001-a1f9-4018-bc4c-2964849da6c7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.411082 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e0afdb0-5693-4263-8c52-9cccc4878003-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5e0afdb0-5693-4263-8c52-9cccc4878003" (UID: "5e0afdb0-5693-4263-8c52-9cccc4878003"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.498485 5025 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e0afdb0-5693-4263-8c52-9cccc4878003-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.685716 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-d4bw5"] Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.881129 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lp7dp" event={"ID":"77520001-a1f9-4018-bc4c-2964849da6c7","Type":"ContainerDied","Data":"398072bc39083a99c2c33079697d0ac157e44ac072c3c8c178f0d27f714341ef"} Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.881178 5025 scope.go:117] "RemoveContainer" containerID="77dc24d564332e2b13196ea8aa063fec0d7401bd2bd892370631df760cb89e7e" Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.881263 5025 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/community-operators-lp7dp" Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.885091 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-szdzh" event={"ID":"79dcab45-e91e-409e-97a7-9cbe6c4e3fe7","Type":"ContainerDied","Data":"a80d053f641d9078e87590c495599a6fc3e5bc7a87fc9963bea576c8a55b7383"} Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.885161 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-szdzh" Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.887523 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rvr8c" event={"ID":"b3110e75-2479-4c2b-b96b-0f41a1f10cec","Type":"ContainerDied","Data":"0c5c4416c983a5db6065c48873853936c3a1a77933de02ab95e21a090c5d5c3c"} Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.887640 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rvr8c" Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.893708 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-d4bw5" event={"ID":"d57da756-f579-4d68-b775-8788fad75582","Type":"ContainerStarted","Data":"3618b77177edc838f8272e96f277c9d64bb9700dba1242a9dbf3d60cf4d0dae7"} Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.897391 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wx6rt" event={"ID":"d30db05d-5f5e-4586-92de-f877c5a64600","Type":"ContainerDied","Data":"16c67fff3aca66eb11c72f69a15c5412a76863c60d399a89304ed2504f02a948"} Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.897451 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wx6rt" Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.899698 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dmcm5" event={"ID":"5e0afdb0-5693-4263-8c52-9cccc4878003","Type":"ContainerDied","Data":"c650fb5a134c739a32a27191d95aa682dbb8664ae47d242aced6b792329a0599"} Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.899847 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dmcm5" Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.901462 5025 scope.go:117] "RemoveContainer" containerID="2ab5a472881e02e79700a12b53ed4bb3543c297e8b8ba3348b59523facf5eaf7" Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.921515 5025 scope.go:117] "RemoveContainer" containerID="c507095dfe58b2e1197766ac61495d7d31346c9967c184017f68289596542ec0" Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.950573 5025 scope.go:117] "RemoveContainer" containerID="0cfd87fd11d88961a1837ffe88d4c7726d618dab58ff9eea21266a17382572f2" Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.954041 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rvr8c"] Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.961574 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rvr8c"] Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.976871 5025 scope.go:117] "RemoveContainer" containerID="ce935b63ea833cae4a9fe9ab0132e0c6febaf0a643d3502e0920f27e576b6c37" Oct 07 08:21:04 crc kubenswrapper[5025]: I1007 08:21:04.997326 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-szdzh"] Oct 07 08:21:05 crc kubenswrapper[5025]: I1007 08:21:05.000967 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-szdzh"] Oct 07 08:21:05 crc kubenswrapper[5025]: I1007 08:21:05.004580 5025 scope.go:117] "RemoveContainer" containerID="8cd5a9106e9e52512d80169f11698ae61a4bf91d5c465581cad51435ea79f5f0" Oct 07 08:21:05 crc kubenswrapper[5025]: I1007 08:21:05.007865 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wx6rt"] Oct 07 08:21:05 crc kubenswrapper[5025]: I1007 08:21:05.019781 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wx6rt"] Oct 07 08:21:05 crc kubenswrapper[5025]: I1007 08:21:05.028028 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lp7dp"] Oct 07 08:21:05 crc kubenswrapper[5025]: I1007 08:21:05.032042 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lp7dp"] Oct 07 08:21:05 crc kubenswrapper[5025]: I1007 08:21:05.040395 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dmcm5"] Oct 07 08:21:05 crc kubenswrapper[5025]: I1007 08:21:05.042102 5025 scope.go:117] "RemoveContainer" containerID="71a91b22bfc8efb92eab949a6853e02f3737245d3797041ebea82d509f4b7c45" Oct 07 08:21:05 crc kubenswrapper[5025]: I1007 08:21:05.046010 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dmcm5"] Oct 07 08:21:05 crc kubenswrapper[5025]: I1007 08:21:05.059030 5025 scope.go:117] "RemoveContainer" containerID="09060a5fe7d26093e99ed6c54f73bc29e81d5b20322e249ab7e38ba89243c0ef" Oct 07 08:21:05 crc kubenswrapper[5025]: I1007 08:21:05.077139 5025 scope.go:117] "RemoveContainer" containerID="f1ca9f53db187e20d84423cafe4920b0c9de1c83ea0a8ea5b22e5caa1e970e63" Oct 07 08:21:05 crc kubenswrapper[5025]: I1007 08:21:05.099734 5025 scope.go:117] "RemoveContainer" containerID="4d447b20ce9ff07e37952ecb03ecf52cda11e2da443d5764543a10055d5a4ca1" Oct 07 08:21:05 
crc kubenswrapper[5025]: I1007 08:21:05.119161 5025 scope.go:117] "RemoveContainer" containerID="6f90777253ef80e24d5bd4bcf6bbb99872d316a8983be60f39b710a3a9558443" Oct 07 08:21:05 crc kubenswrapper[5025]: I1007 08:21:05.134651 5025 scope.go:117] "RemoveContainer" containerID="3c5292ace7ebc0aaf2061218233e667a58b84d8a12dd605e9523f1cab9b0c67e" Oct 07 08:21:05 crc kubenswrapper[5025]: I1007 08:21:05.151131 5025 scope.go:117] "RemoveContainer" containerID="5003fc62d9356ac26851ac7b9274a3b2668561e940d4e15335b29bd472277629" Oct 07 08:21:05 crc kubenswrapper[5025]: I1007 08:21:05.626457 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ftlxd"] Oct 07 08:21:05 crc kubenswrapper[5025]: E1007 08:21:05.626668 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d30db05d-5f5e-4586-92de-f877c5a64600" containerName="extract-utilities" Oct 07 08:21:05 crc kubenswrapper[5025]: I1007 08:21:05.626681 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="d30db05d-5f5e-4586-92de-f877c5a64600" containerName="extract-utilities" Oct 07 08:21:05 crc kubenswrapper[5025]: E1007 08:21:05.626690 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e0afdb0-5693-4263-8c52-9cccc4878003" containerName="registry-server" Oct 07 08:21:05 crc kubenswrapper[5025]: I1007 08:21:05.626697 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e0afdb0-5693-4263-8c52-9cccc4878003" containerName="registry-server" Oct 07 08:21:05 crc kubenswrapper[5025]: E1007 08:21:05.626704 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e0afdb0-5693-4263-8c52-9cccc4878003" containerName="extract-content" Oct 07 08:21:05 crc kubenswrapper[5025]: I1007 08:21:05.626710 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e0afdb0-5693-4263-8c52-9cccc4878003" containerName="extract-content" Oct 07 08:21:05 crc kubenswrapper[5025]: E1007 08:21:05.626718 5025 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="77520001-a1f9-4018-bc4c-2964849da6c7" containerName="extract-content" Oct 07 08:21:05 crc kubenswrapper[5025]: I1007 08:21:05.626723 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="77520001-a1f9-4018-bc4c-2964849da6c7" containerName="extract-content" Oct 07 08:21:05 crc kubenswrapper[5025]: E1007 08:21:05.626730 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79dcab45-e91e-409e-97a7-9cbe6c4e3fe7" containerName="registry-server" Oct 07 08:21:05 crc kubenswrapper[5025]: I1007 08:21:05.626736 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="79dcab45-e91e-409e-97a7-9cbe6c4e3fe7" containerName="registry-server" Oct 07 08:21:05 crc kubenswrapper[5025]: E1007 08:21:05.626745 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d30db05d-5f5e-4586-92de-f877c5a64600" containerName="extract-content" Oct 07 08:21:05 crc kubenswrapper[5025]: I1007 08:21:05.626752 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="d30db05d-5f5e-4586-92de-f877c5a64600" containerName="extract-content" Oct 07 08:21:05 crc kubenswrapper[5025]: E1007 08:21:05.626761 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3110e75-2479-4c2b-b96b-0f41a1f10cec" containerName="marketplace-operator" Oct 07 08:21:05 crc kubenswrapper[5025]: I1007 08:21:05.626768 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3110e75-2479-4c2b-b96b-0f41a1f10cec" containerName="marketplace-operator" Oct 07 08:21:05 crc kubenswrapper[5025]: E1007 08:21:05.626778 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77520001-a1f9-4018-bc4c-2964849da6c7" containerName="registry-server" Oct 07 08:21:05 crc kubenswrapper[5025]: I1007 08:21:05.626785 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="77520001-a1f9-4018-bc4c-2964849da6c7" containerName="registry-server" Oct 07 08:21:05 crc kubenswrapper[5025]: E1007 08:21:05.626794 5025 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="79dcab45-e91e-409e-97a7-9cbe6c4e3fe7" containerName="extract-content" Oct 07 08:21:05 crc kubenswrapper[5025]: I1007 08:21:05.626800 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="79dcab45-e91e-409e-97a7-9cbe6c4e3fe7" containerName="extract-content" Oct 07 08:21:05 crc kubenswrapper[5025]: E1007 08:21:05.626810 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e0afdb0-5693-4263-8c52-9cccc4878003" containerName="extract-utilities" Oct 07 08:21:05 crc kubenswrapper[5025]: I1007 08:21:05.626817 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e0afdb0-5693-4263-8c52-9cccc4878003" containerName="extract-utilities" Oct 07 08:21:05 crc kubenswrapper[5025]: E1007 08:21:05.626827 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d30db05d-5f5e-4586-92de-f877c5a64600" containerName="registry-server" Oct 07 08:21:05 crc kubenswrapper[5025]: I1007 08:21:05.626834 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="d30db05d-5f5e-4586-92de-f877c5a64600" containerName="registry-server" Oct 07 08:21:05 crc kubenswrapper[5025]: E1007 08:21:05.626842 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77520001-a1f9-4018-bc4c-2964849da6c7" containerName="extract-utilities" Oct 07 08:21:05 crc kubenswrapper[5025]: I1007 08:21:05.626849 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="77520001-a1f9-4018-bc4c-2964849da6c7" containerName="extract-utilities" Oct 07 08:21:05 crc kubenswrapper[5025]: E1007 08:21:05.626857 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79dcab45-e91e-409e-97a7-9cbe6c4e3fe7" containerName="extract-utilities" Oct 07 08:21:05 crc kubenswrapper[5025]: I1007 08:21:05.626864 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="79dcab45-e91e-409e-97a7-9cbe6c4e3fe7" containerName="extract-utilities" Oct 07 08:21:05 crc kubenswrapper[5025]: I1007 08:21:05.626963 5025 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="77520001-a1f9-4018-bc4c-2964849da6c7" containerName="registry-server" Oct 07 08:21:05 crc kubenswrapper[5025]: I1007 08:21:05.626974 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e0afdb0-5693-4263-8c52-9cccc4878003" containerName="registry-server" Oct 07 08:21:05 crc kubenswrapper[5025]: I1007 08:21:05.626982 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="79dcab45-e91e-409e-97a7-9cbe6c4e3fe7" containerName="registry-server" Oct 07 08:21:05 crc kubenswrapper[5025]: I1007 08:21:05.626989 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3110e75-2479-4c2b-b96b-0f41a1f10cec" containerName="marketplace-operator" Oct 07 08:21:05 crc kubenswrapper[5025]: I1007 08:21:05.626996 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="d30db05d-5f5e-4586-92de-f877c5a64600" containerName="registry-server" Oct 07 08:21:05 crc kubenswrapper[5025]: I1007 08:21:05.627711 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ftlxd" Oct 07 08:21:05 crc kubenswrapper[5025]: I1007 08:21:05.630063 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 07 08:21:05 crc kubenswrapper[5025]: I1007 08:21:05.642569 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ftlxd"] Oct 07 08:21:05 crc kubenswrapper[5025]: I1007 08:21:05.716349 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea-catalog-content\") pod \"certified-operators-ftlxd\" (UID: \"8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea\") " pod="openshift-marketplace/certified-operators-ftlxd" Oct 07 08:21:05 crc kubenswrapper[5025]: I1007 08:21:05.716398 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvvhh\" (UniqueName: \"kubernetes.io/projected/8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea-kube-api-access-hvvhh\") pod \"certified-operators-ftlxd\" (UID: \"8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea\") " pod="openshift-marketplace/certified-operators-ftlxd" Oct 07 08:21:05 crc kubenswrapper[5025]: I1007 08:21:05.716471 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea-utilities\") pod \"certified-operators-ftlxd\" (UID: \"8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea\") " pod="openshift-marketplace/certified-operators-ftlxd" Oct 07 08:21:05 crc kubenswrapper[5025]: I1007 08:21:05.818130 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea-catalog-content\") pod \"certified-operators-ftlxd\" (UID: 
\"8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea\") " pod="openshift-marketplace/certified-operators-ftlxd" Oct 07 08:21:05 crc kubenswrapper[5025]: I1007 08:21:05.818530 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvvhh\" (UniqueName: \"kubernetes.io/projected/8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea-kube-api-access-hvvhh\") pod \"certified-operators-ftlxd\" (UID: \"8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea\") " pod="openshift-marketplace/certified-operators-ftlxd" Oct 07 08:21:05 crc kubenswrapper[5025]: I1007 08:21:05.818726 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea-utilities\") pod \"certified-operators-ftlxd\" (UID: \"8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea\") " pod="openshift-marketplace/certified-operators-ftlxd" Oct 07 08:21:05 crc kubenswrapper[5025]: I1007 08:21:05.818910 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea-catalog-content\") pod \"certified-operators-ftlxd\" (UID: \"8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea\") " pod="openshift-marketplace/certified-operators-ftlxd" Oct 07 08:21:05 crc kubenswrapper[5025]: I1007 08:21:05.819298 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea-utilities\") pod \"certified-operators-ftlxd\" (UID: \"8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea\") " pod="openshift-marketplace/certified-operators-ftlxd" Oct 07 08:21:05 crc kubenswrapper[5025]: I1007 08:21:05.871156 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvvhh\" (UniqueName: \"kubernetes.io/projected/8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea-kube-api-access-hvvhh\") pod \"certified-operators-ftlxd\" (UID: 
\"8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea\") " pod="openshift-marketplace/certified-operators-ftlxd" Oct 07 08:21:05 crc kubenswrapper[5025]: I1007 08:21:05.909801 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-d4bw5" event={"ID":"d57da756-f579-4d68-b775-8788fad75582","Type":"ContainerStarted","Data":"e459d5e4350a4b1893c8630847ea8207d086e75c2be100d20eb3519d3e1b18b0"} Oct 07 08:21:05 crc kubenswrapper[5025]: I1007 08:21:05.910018 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-d4bw5" Oct 07 08:21:05 crc kubenswrapper[5025]: I1007 08:21:05.913083 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-d4bw5" Oct 07 08:21:05 crc kubenswrapper[5025]: I1007 08:21:05.923280 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e0afdb0-5693-4263-8c52-9cccc4878003" path="/var/lib/kubelet/pods/5e0afdb0-5693-4263-8c52-9cccc4878003/volumes" Oct 07 08:21:05 crc kubenswrapper[5025]: I1007 08:21:05.924828 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77520001-a1f9-4018-bc4c-2964849da6c7" path="/var/lib/kubelet/pods/77520001-a1f9-4018-bc4c-2964849da6c7/volumes" Oct 07 08:21:05 crc kubenswrapper[5025]: I1007 08:21:05.925454 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79dcab45-e91e-409e-97a7-9cbe6c4e3fe7" path="/var/lib/kubelet/pods/79dcab45-e91e-409e-97a7-9cbe6c4e3fe7/volumes" Oct 07 08:21:05 crc kubenswrapper[5025]: I1007 08:21:05.926633 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3110e75-2479-4c2b-b96b-0f41a1f10cec" path="/var/lib/kubelet/pods/b3110e75-2479-4c2b-b96b-0f41a1f10cec/volumes" Oct 07 08:21:05 crc kubenswrapper[5025]: I1007 08:21:05.927108 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="d30db05d-5f5e-4586-92de-f877c5a64600" path="/var/lib/kubelet/pods/d30db05d-5f5e-4586-92de-f877c5a64600/volumes" Oct 07 08:21:05 crc kubenswrapper[5025]: I1007 08:21:05.932508 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-d4bw5" podStartSLOduration=2.9324926639999997 podStartE2EDuration="2.932492664s" podCreationTimestamp="2025-10-07 08:21:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:21:05.929032764 +0000 UTC m=+272.738346908" watchObservedRunningTime="2025-10-07 08:21:05.932492664 +0000 UTC m=+272.741806808" Oct 07 08:21:05 crc kubenswrapper[5025]: I1007 08:21:05.955448 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ftlxd" Oct 07 08:21:06 crc kubenswrapper[5025]: I1007 08:21:06.151925 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ftlxd"] Oct 07 08:21:06 crc kubenswrapper[5025]: I1007 08:21:06.226223 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wjv4n"] Oct 07 08:21:06 crc kubenswrapper[5025]: I1007 08:21:06.227959 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wjv4n" Oct 07 08:21:06 crc kubenswrapper[5025]: I1007 08:21:06.230720 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 07 08:21:06 crc kubenswrapper[5025]: I1007 08:21:06.231414 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wjv4n"] Oct 07 08:21:06 crc kubenswrapper[5025]: I1007 08:21:06.324374 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbcm5\" (UniqueName: \"kubernetes.io/projected/61e2aa02-6590-49e6-a1e9-f9e22e01a679-kube-api-access-cbcm5\") pod \"redhat-marketplace-wjv4n\" (UID: \"61e2aa02-6590-49e6-a1e9-f9e22e01a679\") " pod="openshift-marketplace/redhat-marketplace-wjv4n" Oct 07 08:21:06 crc kubenswrapper[5025]: I1007 08:21:06.324851 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61e2aa02-6590-49e6-a1e9-f9e22e01a679-utilities\") pod \"redhat-marketplace-wjv4n\" (UID: \"61e2aa02-6590-49e6-a1e9-f9e22e01a679\") " pod="openshift-marketplace/redhat-marketplace-wjv4n" Oct 07 08:21:06 crc kubenswrapper[5025]: I1007 08:21:06.324919 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61e2aa02-6590-49e6-a1e9-f9e22e01a679-catalog-content\") pod \"redhat-marketplace-wjv4n\" (UID: \"61e2aa02-6590-49e6-a1e9-f9e22e01a679\") " pod="openshift-marketplace/redhat-marketplace-wjv4n" Oct 07 08:21:06 crc kubenswrapper[5025]: I1007 08:21:06.427634 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61e2aa02-6590-49e6-a1e9-f9e22e01a679-utilities\") pod \"redhat-marketplace-wjv4n\" (UID: 
\"61e2aa02-6590-49e6-a1e9-f9e22e01a679\") " pod="openshift-marketplace/redhat-marketplace-wjv4n" Oct 07 08:21:06 crc kubenswrapper[5025]: I1007 08:21:06.427712 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61e2aa02-6590-49e6-a1e9-f9e22e01a679-catalog-content\") pod \"redhat-marketplace-wjv4n\" (UID: \"61e2aa02-6590-49e6-a1e9-f9e22e01a679\") " pod="openshift-marketplace/redhat-marketplace-wjv4n" Oct 07 08:21:06 crc kubenswrapper[5025]: I1007 08:21:06.427765 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbcm5\" (UniqueName: \"kubernetes.io/projected/61e2aa02-6590-49e6-a1e9-f9e22e01a679-kube-api-access-cbcm5\") pod \"redhat-marketplace-wjv4n\" (UID: \"61e2aa02-6590-49e6-a1e9-f9e22e01a679\") " pod="openshift-marketplace/redhat-marketplace-wjv4n" Oct 07 08:21:06 crc kubenswrapper[5025]: I1007 08:21:06.428447 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61e2aa02-6590-49e6-a1e9-f9e22e01a679-utilities\") pod \"redhat-marketplace-wjv4n\" (UID: \"61e2aa02-6590-49e6-a1e9-f9e22e01a679\") " pod="openshift-marketplace/redhat-marketplace-wjv4n" Oct 07 08:21:06 crc kubenswrapper[5025]: I1007 08:21:06.429738 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61e2aa02-6590-49e6-a1e9-f9e22e01a679-catalog-content\") pod \"redhat-marketplace-wjv4n\" (UID: \"61e2aa02-6590-49e6-a1e9-f9e22e01a679\") " pod="openshift-marketplace/redhat-marketplace-wjv4n" Oct 07 08:21:06 crc kubenswrapper[5025]: I1007 08:21:06.449813 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbcm5\" (UniqueName: \"kubernetes.io/projected/61e2aa02-6590-49e6-a1e9-f9e22e01a679-kube-api-access-cbcm5\") pod \"redhat-marketplace-wjv4n\" (UID: 
\"61e2aa02-6590-49e6-a1e9-f9e22e01a679\") " pod="openshift-marketplace/redhat-marketplace-wjv4n" Oct 07 08:21:06 crc kubenswrapper[5025]: I1007 08:21:06.559423 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wjv4n" Oct 07 08:21:06 crc kubenswrapper[5025]: I1007 08:21:06.753961 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wjv4n"] Oct 07 08:21:06 crc kubenswrapper[5025]: I1007 08:21:06.921522 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wjv4n" event={"ID":"61e2aa02-6590-49e6-a1e9-f9e22e01a679","Type":"ContainerStarted","Data":"bc19d9ec56893630745838b70c3f29be86a9a8479459fdce6a8d2c52e728259b"} Oct 07 08:21:06 crc kubenswrapper[5025]: I1007 08:21:06.921580 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wjv4n" event={"ID":"61e2aa02-6590-49e6-a1e9-f9e22e01a679","Type":"ContainerStarted","Data":"1497a5dc5073ab10837554d8bbd5620ae9af435792c945d9939c1ecda3dd7b79"} Oct 07 08:21:06 crc kubenswrapper[5025]: I1007 08:21:06.925233 5025 generic.go:334] "Generic (PLEG): container finished" podID="8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea" containerID="102f13f6affdfdc95a068f812dcc74f1e4545b346e6d6cbab95bbc5da8c9a877" exitCode=0 Oct 07 08:21:06 crc kubenswrapper[5025]: I1007 08:21:06.926291 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ftlxd" event={"ID":"8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea","Type":"ContainerDied","Data":"102f13f6affdfdc95a068f812dcc74f1e4545b346e6d6cbab95bbc5da8c9a877"} Oct 07 08:21:06 crc kubenswrapper[5025]: I1007 08:21:06.926314 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ftlxd" 
event={"ID":"8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea","Type":"ContainerStarted","Data":"20c52674907ee6285221673321d88b90199bac664c41f5108f3ec37cfd9c9f67"} Oct 07 08:21:07 crc kubenswrapper[5025]: I1007 08:21:07.933178 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ftlxd" event={"ID":"8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea","Type":"ContainerStarted","Data":"14486721f9ff1bd8195c2ab675cb39e24541cf9c555fea09a917d4977378616a"} Oct 07 08:21:07 crc kubenswrapper[5025]: I1007 08:21:07.936213 5025 generic.go:334] "Generic (PLEG): container finished" podID="61e2aa02-6590-49e6-a1e9-f9e22e01a679" containerID="bc19d9ec56893630745838b70c3f29be86a9a8479459fdce6a8d2c52e728259b" exitCode=0 Oct 07 08:21:07 crc kubenswrapper[5025]: I1007 08:21:07.936275 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wjv4n" event={"ID":"61e2aa02-6590-49e6-a1e9-f9e22e01a679","Type":"ContainerDied","Data":"bc19d9ec56893630745838b70c3f29be86a9a8479459fdce6a8d2c52e728259b"} Oct 07 08:21:07 crc kubenswrapper[5025]: I1007 08:21:07.936311 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wjv4n" event={"ID":"61e2aa02-6590-49e6-a1e9-f9e22e01a679","Type":"ContainerStarted","Data":"dc8f1656708b7d7db69e44ab952f3b4282e28a6c39fa0d78b1d9890d3d33f69c"} Oct 07 08:21:08 crc kubenswrapper[5025]: I1007 08:21:08.016656 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dnq7n"] Oct 07 08:21:08 crc kubenswrapper[5025]: I1007 08:21:08.017806 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dnq7n" Oct 07 08:21:08 crc kubenswrapper[5025]: I1007 08:21:08.025619 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 07 08:21:08 crc kubenswrapper[5025]: I1007 08:21:08.034727 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dnq7n"] Oct 07 08:21:08 crc kubenswrapper[5025]: I1007 08:21:08.147566 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c01dd09b-1dca-427d-a45f-9a5870172dd0-catalog-content\") pod \"redhat-operators-dnq7n\" (UID: \"c01dd09b-1dca-427d-a45f-9a5870172dd0\") " pod="openshift-marketplace/redhat-operators-dnq7n" Oct 07 08:21:08 crc kubenswrapper[5025]: I1007 08:21:08.147619 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dq87\" (UniqueName: \"kubernetes.io/projected/c01dd09b-1dca-427d-a45f-9a5870172dd0-kube-api-access-4dq87\") pod \"redhat-operators-dnq7n\" (UID: \"c01dd09b-1dca-427d-a45f-9a5870172dd0\") " pod="openshift-marketplace/redhat-operators-dnq7n" Oct 07 08:21:08 crc kubenswrapper[5025]: I1007 08:21:08.147676 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c01dd09b-1dca-427d-a45f-9a5870172dd0-utilities\") pod \"redhat-operators-dnq7n\" (UID: \"c01dd09b-1dca-427d-a45f-9a5870172dd0\") " pod="openshift-marketplace/redhat-operators-dnq7n" Oct 07 08:21:08 crc kubenswrapper[5025]: I1007 08:21:08.248927 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c01dd09b-1dca-427d-a45f-9a5870172dd0-utilities\") pod \"redhat-operators-dnq7n\" (UID: \"c01dd09b-1dca-427d-a45f-9a5870172dd0\") " 
pod="openshift-marketplace/redhat-operators-dnq7n" Oct 07 08:21:08 crc kubenswrapper[5025]: I1007 08:21:08.249082 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c01dd09b-1dca-427d-a45f-9a5870172dd0-catalog-content\") pod \"redhat-operators-dnq7n\" (UID: \"c01dd09b-1dca-427d-a45f-9a5870172dd0\") " pod="openshift-marketplace/redhat-operators-dnq7n" Oct 07 08:21:08 crc kubenswrapper[5025]: I1007 08:21:08.249140 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dq87\" (UniqueName: \"kubernetes.io/projected/c01dd09b-1dca-427d-a45f-9a5870172dd0-kube-api-access-4dq87\") pod \"redhat-operators-dnq7n\" (UID: \"c01dd09b-1dca-427d-a45f-9a5870172dd0\") " pod="openshift-marketplace/redhat-operators-dnq7n" Oct 07 08:21:08 crc kubenswrapper[5025]: I1007 08:21:08.249775 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c01dd09b-1dca-427d-a45f-9a5870172dd0-utilities\") pod \"redhat-operators-dnq7n\" (UID: \"c01dd09b-1dca-427d-a45f-9a5870172dd0\") " pod="openshift-marketplace/redhat-operators-dnq7n" Oct 07 08:21:08 crc kubenswrapper[5025]: I1007 08:21:08.249883 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c01dd09b-1dca-427d-a45f-9a5870172dd0-catalog-content\") pod \"redhat-operators-dnq7n\" (UID: \"c01dd09b-1dca-427d-a45f-9a5870172dd0\") " pod="openshift-marketplace/redhat-operators-dnq7n" Oct 07 08:21:08 crc kubenswrapper[5025]: I1007 08:21:08.275648 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dq87\" (UniqueName: \"kubernetes.io/projected/c01dd09b-1dca-427d-a45f-9a5870172dd0-kube-api-access-4dq87\") pod \"redhat-operators-dnq7n\" (UID: \"c01dd09b-1dca-427d-a45f-9a5870172dd0\") " pod="openshift-marketplace/redhat-operators-dnq7n" Oct 
07 08:21:08 crc kubenswrapper[5025]: I1007 08:21:08.343597 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dnq7n" Oct 07 08:21:08 crc kubenswrapper[5025]: I1007 08:21:08.618108 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pz4zg"] Oct 07 08:21:08 crc kubenswrapper[5025]: I1007 08:21:08.621601 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pz4zg" Oct 07 08:21:08 crc kubenswrapper[5025]: I1007 08:21:08.625738 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 07 08:21:08 crc kubenswrapper[5025]: I1007 08:21:08.629007 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pz4zg"] Oct 07 08:21:08 crc kubenswrapper[5025]: I1007 08:21:08.764636 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8afd374e-c526-433f-81ca-9c81457e7591-utilities\") pod \"community-operators-pz4zg\" (UID: \"8afd374e-c526-433f-81ca-9c81457e7591\") " pod="openshift-marketplace/community-operators-pz4zg" Oct 07 08:21:08 crc kubenswrapper[5025]: I1007 08:21:08.765116 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmffw\" (UniqueName: \"kubernetes.io/projected/8afd374e-c526-433f-81ca-9c81457e7591-kube-api-access-wmffw\") pod \"community-operators-pz4zg\" (UID: \"8afd374e-c526-433f-81ca-9c81457e7591\") " pod="openshift-marketplace/community-operators-pz4zg" Oct 07 08:21:08 crc kubenswrapper[5025]: I1007 08:21:08.765199 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/8afd374e-c526-433f-81ca-9c81457e7591-catalog-content\") pod \"community-operators-pz4zg\" (UID: \"8afd374e-c526-433f-81ca-9c81457e7591\") " pod="openshift-marketplace/community-operators-pz4zg" Oct 07 08:21:08 crc kubenswrapper[5025]: I1007 08:21:08.776278 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dnq7n"] Oct 07 08:21:08 crc kubenswrapper[5025]: I1007 08:21:08.866188 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8afd374e-c526-433f-81ca-9c81457e7591-utilities\") pod \"community-operators-pz4zg\" (UID: \"8afd374e-c526-433f-81ca-9c81457e7591\") " pod="openshift-marketplace/community-operators-pz4zg" Oct 07 08:21:08 crc kubenswrapper[5025]: I1007 08:21:08.866266 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmffw\" (UniqueName: \"kubernetes.io/projected/8afd374e-c526-433f-81ca-9c81457e7591-kube-api-access-wmffw\") pod \"community-operators-pz4zg\" (UID: \"8afd374e-c526-433f-81ca-9c81457e7591\") " pod="openshift-marketplace/community-operators-pz4zg" Oct 07 08:21:08 crc kubenswrapper[5025]: I1007 08:21:08.866318 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8afd374e-c526-433f-81ca-9c81457e7591-catalog-content\") pod \"community-operators-pz4zg\" (UID: \"8afd374e-c526-433f-81ca-9c81457e7591\") " pod="openshift-marketplace/community-operators-pz4zg" Oct 07 08:21:08 crc kubenswrapper[5025]: I1007 08:21:08.866714 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8afd374e-c526-433f-81ca-9c81457e7591-utilities\") pod \"community-operators-pz4zg\" (UID: \"8afd374e-c526-433f-81ca-9c81457e7591\") " pod="openshift-marketplace/community-operators-pz4zg" Oct 07 08:21:08 crc kubenswrapper[5025]: 
I1007 08:21:08.866806 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8afd374e-c526-433f-81ca-9c81457e7591-catalog-content\") pod \"community-operators-pz4zg\" (UID: \"8afd374e-c526-433f-81ca-9c81457e7591\") " pod="openshift-marketplace/community-operators-pz4zg" Oct 07 08:21:08 crc kubenswrapper[5025]: I1007 08:21:08.892479 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmffw\" (UniqueName: \"kubernetes.io/projected/8afd374e-c526-433f-81ca-9c81457e7591-kube-api-access-wmffw\") pod \"community-operators-pz4zg\" (UID: \"8afd374e-c526-433f-81ca-9c81457e7591\") " pod="openshift-marketplace/community-operators-pz4zg" Oct 07 08:21:08 crc kubenswrapper[5025]: I1007 08:21:08.945042 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pz4zg" Oct 07 08:21:08 crc kubenswrapper[5025]: I1007 08:21:08.951935 5025 generic.go:334] "Generic (PLEG): container finished" podID="8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea" containerID="14486721f9ff1bd8195c2ab675cb39e24541cf9c555fea09a917d4977378616a" exitCode=0 Oct 07 08:21:08 crc kubenswrapper[5025]: I1007 08:21:08.952037 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ftlxd" event={"ID":"8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea","Type":"ContainerDied","Data":"14486721f9ff1bd8195c2ab675cb39e24541cf9c555fea09a917d4977378616a"} Oct 07 08:21:08 crc kubenswrapper[5025]: I1007 08:21:08.954640 5025 generic.go:334] "Generic (PLEG): container finished" podID="61e2aa02-6590-49e6-a1e9-f9e22e01a679" containerID="dc8f1656708b7d7db69e44ab952f3b4282e28a6c39fa0d78b1d9890d3d33f69c" exitCode=0 Oct 07 08:21:08 crc kubenswrapper[5025]: I1007 08:21:08.954713 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wjv4n" 
event={"ID":"61e2aa02-6590-49e6-a1e9-f9e22e01a679","Type":"ContainerDied","Data":"dc8f1656708b7d7db69e44ab952f3b4282e28a6c39fa0d78b1d9890d3d33f69c"} Oct 07 08:21:08 crc kubenswrapper[5025]: I1007 08:21:08.957089 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dnq7n" event={"ID":"c01dd09b-1dca-427d-a45f-9a5870172dd0","Type":"ContainerStarted","Data":"034000796bf6523ddbaa238be1984af7e975e0e34ed948b5dfdd6339a2c9c192"} Oct 07 08:21:09 crc kubenswrapper[5025]: I1007 08:21:09.160182 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pz4zg"] Oct 07 08:21:09 crc kubenswrapper[5025]: W1007 08:21:09.170294 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8afd374e_c526_433f_81ca_9c81457e7591.slice/crio-529136beb922974f9ecf16adeb8bf6227b159f9a7d4fc74f8951b9506bb6e774 WatchSource:0}: Error finding container 529136beb922974f9ecf16adeb8bf6227b159f9a7d4fc74f8951b9506bb6e774: Status 404 returned error can't find the container with id 529136beb922974f9ecf16adeb8bf6227b159f9a7d4fc74f8951b9506bb6e774 Oct 07 08:21:09 crc kubenswrapper[5025]: I1007 08:21:09.964593 5025 generic.go:334] "Generic (PLEG): container finished" podID="8afd374e-c526-433f-81ca-9c81457e7591" containerID="eeed860643a6cde99644bc621f4b5cc76124255482023428dc171d12522929ee" exitCode=0 Oct 07 08:21:09 crc kubenswrapper[5025]: I1007 08:21:09.964787 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pz4zg" event={"ID":"8afd374e-c526-433f-81ca-9c81457e7591","Type":"ContainerDied","Data":"eeed860643a6cde99644bc621f4b5cc76124255482023428dc171d12522929ee"} Oct 07 08:21:09 crc kubenswrapper[5025]: I1007 08:21:09.965072 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pz4zg" 
event={"ID":"8afd374e-c526-433f-81ca-9c81457e7591","Type":"ContainerStarted","Data":"529136beb922974f9ecf16adeb8bf6227b159f9a7d4fc74f8951b9506bb6e774"} Oct 07 08:21:09 crc kubenswrapper[5025]: I1007 08:21:09.967611 5025 generic.go:334] "Generic (PLEG): container finished" podID="c01dd09b-1dca-427d-a45f-9a5870172dd0" containerID="06157688eb78a403caa0c5d3cc333bf2c783b4e2c2e88e234a6b37e87a1d9d80" exitCode=0 Oct 07 08:21:09 crc kubenswrapper[5025]: I1007 08:21:09.967651 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dnq7n" event={"ID":"c01dd09b-1dca-427d-a45f-9a5870172dd0","Type":"ContainerDied","Data":"06157688eb78a403caa0c5d3cc333bf2c783b4e2c2e88e234a6b37e87a1d9d80"} Oct 07 08:21:10 crc kubenswrapper[5025]: I1007 08:21:10.977227 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ftlxd" event={"ID":"8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea","Type":"ContainerStarted","Data":"fd8c81f95718b3fb0fe6898dec5d19cdb2eb10d7a7af43230960090e2d85d1a1"} Oct 07 08:21:10 crc kubenswrapper[5025]: I1007 08:21:10.980128 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wjv4n" event={"ID":"61e2aa02-6590-49e6-a1e9-f9e22e01a679","Type":"ContainerStarted","Data":"3e4ab93a64edf69c0b2d13cc96bceac61d424ea3e3b3328e87a96234adc55390"} Oct 07 08:21:11 crc kubenswrapper[5025]: I1007 08:21:11.000761 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ftlxd" podStartSLOduration=3.244922506 podStartE2EDuration="6.000728611s" podCreationTimestamp="2025-10-07 08:21:05 +0000 UTC" firstStartedPulling="2025-10-07 08:21:06.929490363 +0000 UTC m=+273.738804507" lastFinishedPulling="2025-10-07 08:21:09.685296468 +0000 UTC m=+276.494610612" observedRunningTime="2025-10-07 08:21:10.997263419 +0000 UTC m=+277.806577563" watchObservedRunningTime="2025-10-07 08:21:11.000728611 +0000 UTC 
m=+277.810042765" Oct 07 08:21:11 crc kubenswrapper[5025]: I1007 08:21:11.023231 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wjv4n" podStartSLOduration=1.490437556 podStartE2EDuration="5.023208831s" podCreationTimestamp="2025-10-07 08:21:06 +0000 UTC" firstStartedPulling="2025-10-07 08:21:06.923005146 +0000 UTC m=+273.732319290" lastFinishedPulling="2025-10-07 08:21:10.455776401 +0000 UTC m=+277.265090565" observedRunningTime="2025-10-07 08:21:11.021877229 +0000 UTC m=+277.831191373" watchObservedRunningTime="2025-10-07 08:21:11.023208831 +0000 UTC m=+277.832522975" Oct 07 08:21:11 crc kubenswrapper[5025]: I1007 08:21:11.989315 5025 generic.go:334] "Generic (PLEG): container finished" podID="8afd374e-c526-433f-81ca-9c81457e7591" containerID="e352310122df9cab497602cdec762d58fae5e145dab2de18361d66b352a34055" exitCode=0 Oct 07 08:21:11 crc kubenswrapper[5025]: I1007 08:21:11.989403 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pz4zg" event={"ID":"8afd374e-c526-433f-81ca-9c81457e7591","Type":"ContainerDied","Data":"e352310122df9cab497602cdec762d58fae5e145dab2de18361d66b352a34055"} Oct 07 08:21:11 crc kubenswrapper[5025]: I1007 08:21:11.991660 5025 generic.go:334] "Generic (PLEG): container finished" podID="c01dd09b-1dca-427d-a45f-9a5870172dd0" containerID="c4939ec0839b987c68365e4f55cd1d093fe6639dd97e580c0c2ac2995e9b65be" exitCode=0 Oct 07 08:21:11 crc kubenswrapper[5025]: I1007 08:21:11.991745 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dnq7n" event={"ID":"c01dd09b-1dca-427d-a45f-9a5870172dd0","Type":"ContainerDied","Data":"c4939ec0839b987c68365e4f55cd1d093fe6639dd97e580c0c2ac2995e9b65be"} Oct 07 08:21:14 crc kubenswrapper[5025]: I1007 08:21:14.007062 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pz4zg" 
event={"ID":"8afd374e-c526-433f-81ca-9c81457e7591","Type":"ContainerStarted","Data":"803747963ba90edde93da639f0e5d84fd7a3d7e8876bf27c0275b8ae924dfe71"} Oct 07 08:21:14 crc kubenswrapper[5025]: I1007 08:21:14.009560 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dnq7n" event={"ID":"c01dd09b-1dca-427d-a45f-9a5870172dd0","Type":"ContainerStarted","Data":"fd9b6ed71a60034073e39040b84d7c684481493f9f21c19db28b2d0878e76299"} Oct 07 08:21:14 crc kubenswrapper[5025]: I1007 08:21:14.028098 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pz4zg" podStartSLOduration=2.8203421349999998 podStartE2EDuration="6.028074274s" podCreationTimestamp="2025-10-07 08:21:08 +0000 UTC" firstStartedPulling="2025-10-07 08:21:09.967342205 +0000 UTC m=+276.776656399" lastFinishedPulling="2025-10-07 08:21:13.175074394 +0000 UTC m=+279.984388538" observedRunningTime="2025-10-07 08:21:14.02358476 +0000 UTC m=+280.832898924" watchObservedRunningTime="2025-10-07 08:21:14.028074274 +0000 UTC m=+280.837388418" Oct 07 08:21:14 crc kubenswrapper[5025]: I1007 08:21:14.046167 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dnq7n" podStartSLOduration=2.8933127560000003 podStartE2EDuration="6.046140924s" podCreationTimestamp="2025-10-07 08:21:08 +0000 UTC" firstStartedPulling="2025-10-07 08:21:09.975075513 +0000 UTC m=+276.784389657" lastFinishedPulling="2025-10-07 08:21:13.127903681 +0000 UTC m=+279.937217825" observedRunningTime="2025-10-07 08:21:14.040854014 +0000 UTC m=+280.850168158" watchObservedRunningTime="2025-10-07 08:21:14.046140924 +0000 UTC m=+280.855455078" Oct 07 08:21:15 crc kubenswrapper[5025]: I1007 08:21:15.956075 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ftlxd" Oct 07 08:21:15 crc kubenswrapper[5025]: I1007 08:21:15.956465 5025 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ftlxd" Oct 07 08:21:15 crc kubenswrapper[5025]: I1007 08:21:15.999004 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ftlxd" Oct 07 08:21:16 crc kubenswrapper[5025]: I1007 08:21:16.085751 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ftlxd" Oct 07 08:21:16 crc kubenswrapper[5025]: I1007 08:21:16.560253 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wjv4n" Oct 07 08:21:16 crc kubenswrapper[5025]: I1007 08:21:16.560312 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wjv4n" Oct 07 08:21:16 crc kubenswrapper[5025]: I1007 08:21:16.603570 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wjv4n" Oct 07 08:21:17 crc kubenswrapper[5025]: I1007 08:21:17.071922 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wjv4n" Oct 07 08:21:18 crc kubenswrapper[5025]: I1007 08:21:18.344168 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dnq7n" Oct 07 08:21:18 crc kubenswrapper[5025]: I1007 08:21:18.344533 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dnq7n" Oct 07 08:21:18 crc kubenswrapper[5025]: I1007 08:21:18.394882 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dnq7n" Oct 07 08:21:18 crc kubenswrapper[5025]: I1007 08:21:18.946313 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-pz4zg" Oct 07 08:21:18 crc kubenswrapper[5025]: I1007 08:21:18.947596 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pz4zg" Oct 07 08:21:18 crc kubenswrapper[5025]: I1007 08:21:18.990426 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pz4zg" Oct 07 08:21:19 crc kubenswrapper[5025]: I1007 08:21:19.081512 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pz4zg" Oct 07 08:21:19 crc kubenswrapper[5025]: I1007 08:21:19.085263 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dnq7n" Oct 07 08:22:25 crc kubenswrapper[5025]: I1007 08:22:25.934238 5025 patch_prober.go:28] interesting pod/machine-config-daemon-2dj2t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 08:22:25 crc kubenswrapper[5025]: I1007 08:22:25.934966 5025 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 08:22:55 crc kubenswrapper[5025]: I1007 08:22:55.935007 5025 patch_prober.go:28] interesting pod/machine-config-daemon-2dj2t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 08:22:55 crc kubenswrapper[5025]: I1007 08:22:55.936128 5025 prober.go:107] 
"Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 08:23:25 crc kubenswrapper[5025]: I1007 08:23:25.934440 5025 patch_prober.go:28] interesting pod/machine-config-daemon-2dj2t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 08:23:25 crc kubenswrapper[5025]: I1007 08:23:25.935248 5025 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 08:23:25 crc kubenswrapper[5025]: I1007 08:23:25.935333 5025 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" Oct 07 08:23:25 crc kubenswrapper[5025]: I1007 08:23:25.936287 5025 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b2ddfe2fe8e8b2bfab09bd0fa5fc060581707876ce97a518502d1ad38c21cc8d"} pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 08:23:25 crc kubenswrapper[5025]: I1007 08:23:25.936396 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" 
containerID="cri-o://b2ddfe2fe8e8b2bfab09bd0fa5fc060581707876ce97a518502d1ad38c21cc8d" gracePeriod=600 Oct 07 08:23:26 crc kubenswrapper[5025]: I1007 08:23:26.975593 5025 generic.go:334] "Generic (PLEG): container finished" podID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerID="b2ddfe2fe8e8b2bfab09bd0fa5fc060581707876ce97a518502d1ad38c21cc8d" exitCode=0 Oct 07 08:23:26 crc kubenswrapper[5025]: I1007 08:23:26.975681 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" event={"ID":"b4849c41-22e1-400e-8e11-096da49ef1b2","Type":"ContainerDied","Data":"b2ddfe2fe8e8b2bfab09bd0fa5fc060581707876ce97a518502d1ad38c21cc8d"} Oct 07 08:23:26 crc kubenswrapper[5025]: I1007 08:23:26.976531 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" event={"ID":"b4849c41-22e1-400e-8e11-096da49ef1b2","Type":"ContainerStarted","Data":"4bbebcd2746f0ec67146ae8e43e7aa013e115cc6aa41a0609e38fed295949dae"} Oct 07 08:23:26 crc kubenswrapper[5025]: I1007 08:23:26.976587 5025 scope.go:117] "RemoveContainer" containerID="54b2a654066f14b0433198403ba42b83c61922de1055c94ba1d38aad9e0c2821" Oct 07 08:23:52 crc kubenswrapper[5025]: I1007 08:23:52.765488 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-246k6"] Oct 07 08:23:52 crc kubenswrapper[5025]: I1007 08:23:52.767442 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-246k6" Oct 07 08:23:52 crc kubenswrapper[5025]: I1007 08:23:52.789350 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-246k6"] Oct 07 08:23:52 crc kubenswrapper[5025]: I1007 08:23:52.897669 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/215fc810-1260-43e3-92eb-0e21a63e9aa9-installation-pull-secrets\") pod \"image-registry-66df7c8f76-246k6\" (UID: \"215fc810-1260-43e3-92eb-0e21a63e9aa9\") " pod="openshift-image-registry/image-registry-66df7c8f76-246k6" Oct 07 08:23:52 crc kubenswrapper[5025]: I1007 08:23:52.897711 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/215fc810-1260-43e3-92eb-0e21a63e9aa9-bound-sa-token\") pod \"image-registry-66df7c8f76-246k6\" (UID: \"215fc810-1260-43e3-92eb-0e21a63e9aa9\") " pod="openshift-image-registry/image-registry-66df7c8f76-246k6" Oct 07 08:23:52 crc kubenswrapper[5025]: I1007 08:23:52.897738 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/215fc810-1260-43e3-92eb-0e21a63e9aa9-registry-tls\") pod \"image-registry-66df7c8f76-246k6\" (UID: \"215fc810-1260-43e3-92eb-0e21a63e9aa9\") " pod="openshift-image-registry/image-registry-66df7c8f76-246k6" Oct 07 08:23:52 crc kubenswrapper[5025]: I1007 08:23:52.897757 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/215fc810-1260-43e3-92eb-0e21a63e9aa9-ca-trust-extracted\") pod \"image-registry-66df7c8f76-246k6\" (UID: \"215fc810-1260-43e3-92eb-0e21a63e9aa9\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-246k6" Oct 07 08:23:52 crc kubenswrapper[5025]: I1007 08:23:52.897799 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/215fc810-1260-43e3-92eb-0e21a63e9aa9-trusted-ca\") pod \"image-registry-66df7c8f76-246k6\" (UID: \"215fc810-1260-43e3-92eb-0e21a63e9aa9\") " pod="openshift-image-registry/image-registry-66df7c8f76-246k6" Oct 07 08:23:52 crc kubenswrapper[5025]: I1007 08:23:52.897828 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/215fc810-1260-43e3-92eb-0e21a63e9aa9-registry-certificates\") pod \"image-registry-66df7c8f76-246k6\" (UID: \"215fc810-1260-43e3-92eb-0e21a63e9aa9\") " pod="openshift-image-registry/image-registry-66df7c8f76-246k6" Oct 07 08:23:52 crc kubenswrapper[5025]: I1007 08:23:52.897867 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-246k6\" (UID: \"215fc810-1260-43e3-92eb-0e21a63e9aa9\") " pod="openshift-image-registry/image-registry-66df7c8f76-246k6" Oct 07 08:23:52 crc kubenswrapper[5025]: I1007 08:23:52.897901 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66ppq\" (UniqueName: \"kubernetes.io/projected/215fc810-1260-43e3-92eb-0e21a63e9aa9-kube-api-access-66ppq\") pod \"image-registry-66df7c8f76-246k6\" (UID: \"215fc810-1260-43e3-92eb-0e21a63e9aa9\") " pod="openshift-image-registry/image-registry-66df7c8f76-246k6" Oct 07 08:23:52 crc kubenswrapper[5025]: I1007 08:23:52.925390 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-246k6\" (UID: \"215fc810-1260-43e3-92eb-0e21a63e9aa9\") " pod="openshift-image-registry/image-registry-66df7c8f76-246k6" Oct 07 08:23:52 crc kubenswrapper[5025]: I1007 08:23:52.999169 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66ppq\" (UniqueName: \"kubernetes.io/projected/215fc810-1260-43e3-92eb-0e21a63e9aa9-kube-api-access-66ppq\") pod \"image-registry-66df7c8f76-246k6\" (UID: \"215fc810-1260-43e3-92eb-0e21a63e9aa9\") " pod="openshift-image-registry/image-registry-66df7c8f76-246k6" Oct 07 08:23:52 crc kubenswrapper[5025]: I1007 08:23:52.999235 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/215fc810-1260-43e3-92eb-0e21a63e9aa9-installation-pull-secrets\") pod \"image-registry-66df7c8f76-246k6\" (UID: \"215fc810-1260-43e3-92eb-0e21a63e9aa9\") " pod="openshift-image-registry/image-registry-66df7c8f76-246k6" Oct 07 08:23:52 crc kubenswrapper[5025]: I1007 08:23:52.999268 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/215fc810-1260-43e3-92eb-0e21a63e9aa9-bound-sa-token\") pod \"image-registry-66df7c8f76-246k6\" (UID: \"215fc810-1260-43e3-92eb-0e21a63e9aa9\") " pod="openshift-image-registry/image-registry-66df7c8f76-246k6" Oct 07 08:23:52 crc kubenswrapper[5025]: I1007 08:23:52.999392 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/215fc810-1260-43e3-92eb-0e21a63e9aa9-registry-tls\") pod \"image-registry-66df7c8f76-246k6\" (UID: \"215fc810-1260-43e3-92eb-0e21a63e9aa9\") " pod="openshift-image-registry/image-registry-66df7c8f76-246k6" Oct 07 08:23:52 crc 
kubenswrapper[5025]: I1007 08:23:52.999497 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/215fc810-1260-43e3-92eb-0e21a63e9aa9-ca-trust-extracted\") pod \"image-registry-66df7c8f76-246k6\" (UID: \"215fc810-1260-43e3-92eb-0e21a63e9aa9\") " pod="openshift-image-registry/image-registry-66df7c8f76-246k6" Oct 07 08:23:52 crc kubenswrapper[5025]: I1007 08:23:52.999645 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/215fc810-1260-43e3-92eb-0e21a63e9aa9-trusted-ca\") pod \"image-registry-66df7c8f76-246k6\" (UID: \"215fc810-1260-43e3-92eb-0e21a63e9aa9\") " pod="openshift-image-registry/image-registry-66df7c8f76-246k6" Oct 07 08:23:52 crc kubenswrapper[5025]: I1007 08:23:52.999761 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/215fc810-1260-43e3-92eb-0e21a63e9aa9-registry-certificates\") pod \"image-registry-66df7c8f76-246k6\" (UID: \"215fc810-1260-43e3-92eb-0e21a63e9aa9\") " pod="openshift-image-registry/image-registry-66df7c8f76-246k6" Oct 07 08:23:53 crc kubenswrapper[5025]: I1007 08:23:53.000307 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/215fc810-1260-43e3-92eb-0e21a63e9aa9-ca-trust-extracted\") pod \"image-registry-66df7c8f76-246k6\" (UID: \"215fc810-1260-43e3-92eb-0e21a63e9aa9\") " pod="openshift-image-registry/image-registry-66df7c8f76-246k6" Oct 07 08:23:53 crc kubenswrapper[5025]: I1007 08:23:53.001515 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/215fc810-1260-43e3-92eb-0e21a63e9aa9-trusted-ca\") pod \"image-registry-66df7c8f76-246k6\" (UID: \"215fc810-1260-43e3-92eb-0e21a63e9aa9\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-246k6" Oct 07 08:23:53 crc kubenswrapper[5025]: I1007 08:23:53.001529 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/215fc810-1260-43e3-92eb-0e21a63e9aa9-registry-certificates\") pod \"image-registry-66df7c8f76-246k6\" (UID: \"215fc810-1260-43e3-92eb-0e21a63e9aa9\") " pod="openshift-image-registry/image-registry-66df7c8f76-246k6" Oct 07 08:23:53 crc kubenswrapper[5025]: I1007 08:23:53.013113 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/215fc810-1260-43e3-92eb-0e21a63e9aa9-registry-tls\") pod \"image-registry-66df7c8f76-246k6\" (UID: \"215fc810-1260-43e3-92eb-0e21a63e9aa9\") " pod="openshift-image-registry/image-registry-66df7c8f76-246k6" Oct 07 08:23:53 crc kubenswrapper[5025]: I1007 08:23:53.013145 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/215fc810-1260-43e3-92eb-0e21a63e9aa9-installation-pull-secrets\") pod \"image-registry-66df7c8f76-246k6\" (UID: \"215fc810-1260-43e3-92eb-0e21a63e9aa9\") " pod="openshift-image-registry/image-registry-66df7c8f76-246k6" Oct 07 08:23:53 crc kubenswrapper[5025]: I1007 08:23:53.019061 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66ppq\" (UniqueName: \"kubernetes.io/projected/215fc810-1260-43e3-92eb-0e21a63e9aa9-kube-api-access-66ppq\") pod \"image-registry-66df7c8f76-246k6\" (UID: \"215fc810-1260-43e3-92eb-0e21a63e9aa9\") " pod="openshift-image-registry/image-registry-66df7c8f76-246k6" Oct 07 08:23:53 crc kubenswrapper[5025]: I1007 08:23:53.029255 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/215fc810-1260-43e3-92eb-0e21a63e9aa9-bound-sa-token\") pod 
\"image-registry-66df7c8f76-246k6\" (UID: \"215fc810-1260-43e3-92eb-0e21a63e9aa9\") " pod="openshift-image-registry/image-registry-66df7c8f76-246k6" Oct 07 08:23:53 crc kubenswrapper[5025]: I1007 08:23:53.087627 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-246k6" Oct 07 08:23:53 crc kubenswrapper[5025]: I1007 08:23:53.339187 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-246k6"] Oct 07 08:23:54 crc kubenswrapper[5025]: I1007 08:23:54.160627 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-246k6" event={"ID":"215fc810-1260-43e3-92eb-0e21a63e9aa9","Type":"ContainerStarted","Data":"8ba22480056a04892cab818edf56bd7af6010b479e7d075b3acb5cc044165597"} Oct 07 08:23:54 crc kubenswrapper[5025]: I1007 08:23:54.161705 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-246k6" event={"ID":"215fc810-1260-43e3-92eb-0e21a63e9aa9","Type":"ContainerStarted","Data":"4f0c030546ac213314dca6a543f3b28bc8f4f5b9a0518e036a74d38cbfaaaf83"} Oct 07 08:23:54 crc kubenswrapper[5025]: I1007 08:23:54.161770 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-246k6" Oct 07 08:23:54 crc kubenswrapper[5025]: I1007 08:23:54.184252 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-246k6" podStartSLOduration=2.184218562 podStartE2EDuration="2.184218562s" podCreationTimestamp="2025-10-07 08:23:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:23:54.181169313 +0000 UTC m=+440.990483467" watchObservedRunningTime="2025-10-07 08:23:54.184218562 +0000 UTC m=+440.993532756" Oct 07 08:24:13 crc 
kubenswrapper[5025]: I1007 08:24:13.095878 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-246k6" Oct 07 08:24:13 crc kubenswrapper[5025]: I1007 08:24:13.179488 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2r94k"] Oct 07 08:24:38 crc kubenswrapper[5025]: I1007 08:24:38.220894 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" podUID="d11df844-49f0-4c9a-9de5-701e57c69685" containerName="registry" containerID="cri-o://ab6e635ad109e876d99329499bb911c8278b3a342759918efaf56a5c7ed56c46" gracePeriod=30 Oct 07 08:24:38 crc kubenswrapper[5025]: I1007 08:24:38.490596 5025 generic.go:334] "Generic (PLEG): container finished" podID="d11df844-49f0-4c9a-9de5-701e57c69685" containerID="ab6e635ad109e876d99329499bb911c8278b3a342759918efaf56a5c7ed56c46" exitCode=0 Oct 07 08:24:38 crc kubenswrapper[5025]: I1007 08:24:38.490659 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" event={"ID":"d11df844-49f0-4c9a-9de5-701e57c69685","Type":"ContainerDied","Data":"ab6e635ad109e876d99329499bb911c8278b3a342759918efaf56a5c7ed56c46"} Oct 07 08:24:38 crc kubenswrapper[5025]: I1007 08:24:38.591759 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:24:38 crc kubenswrapper[5025]: I1007 08:24:38.736820 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d11df844-49f0-4c9a-9de5-701e57c69685-installation-pull-secrets\") pod \"d11df844-49f0-4c9a-9de5-701e57c69685\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " Oct 07 08:24:38 crc kubenswrapper[5025]: I1007 08:24:38.736914 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d11df844-49f0-4c9a-9de5-701e57c69685-ca-trust-extracted\") pod \"d11df844-49f0-4c9a-9de5-701e57c69685\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " Oct 07 08:24:38 crc kubenswrapper[5025]: I1007 08:24:38.736956 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsmb2\" (UniqueName: \"kubernetes.io/projected/d11df844-49f0-4c9a-9de5-701e57c69685-kube-api-access-gsmb2\") pod \"d11df844-49f0-4c9a-9de5-701e57c69685\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " Oct 07 08:24:38 crc kubenswrapper[5025]: I1007 08:24:38.737118 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"d11df844-49f0-4c9a-9de5-701e57c69685\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " Oct 07 08:24:38 crc kubenswrapper[5025]: I1007 08:24:38.737140 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d11df844-49f0-4c9a-9de5-701e57c69685-registry-tls\") pod \"d11df844-49f0-4c9a-9de5-701e57c69685\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " Oct 07 08:24:38 crc kubenswrapper[5025]: I1007 08:24:38.737186 5025 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d11df844-49f0-4c9a-9de5-701e57c69685-bound-sa-token\") pod \"d11df844-49f0-4c9a-9de5-701e57c69685\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " Oct 07 08:24:38 crc kubenswrapper[5025]: I1007 08:24:38.737221 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d11df844-49f0-4c9a-9de5-701e57c69685-registry-certificates\") pod \"d11df844-49f0-4c9a-9de5-701e57c69685\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " Oct 07 08:24:38 crc kubenswrapper[5025]: I1007 08:24:38.737240 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d11df844-49f0-4c9a-9de5-701e57c69685-trusted-ca\") pod \"d11df844-49f0-4c9a-9de5-701e57c69685\" (UID: \"d11df844-49f0-4c9a-9de5-701e57c69685\") " Oct 07 08:24:38 crc kubenswrapper[5025]: I1007 08:24:38.739171 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d11df844-49f0-4c9a-9de5-701e57c69685-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "d11df844-49f0-4c9a-9de5-701e57c69685" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:24:38 crc kubenswrapper[5025]: I1007 08:24:38.739680 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d11df844-49f0-4c9a-9de5-701e57c69685-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "d11df844-49f0-4c9a-9de5-701e57c69685" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:24:38 crc kubenswrapper[5025]: I1007 08:24:38.744747 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d11df844-49f0-4c9a-9de5-701e57c69685-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "d11df844-49f0-4c9a-9de5-701e57c69685" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:24:38 crc kubenswrapper[5025]: I1007 08:24:38.747755 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d11df844-49f0-4c9a-9de5-701e57c69685-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "d11df844-49f0-4c9a-9de5-701e57c69685" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:24:38 crc kubenswrapper[5025]: I1007 08:24:38.747945 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d11df844-49f0-4c9a-9de5-701e57c69685-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "d11df844-49f0-4c9a-9de5-701e57c69685" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:24:38 crc kubenswrapper[5025]: I1007 08:24:38.748058 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d11df844-49f0-4c9a-9de5-701e57c69685-kube-api-access-gsmb2" (OuterVolumeSpecName: "kube-api-access-gsmb2") pod "d11df844-49f0-4c9a-9de5-701e57c69685" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685"). InnerVolumeSpecName "kube-api-access-gsmb2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:24:38 crc kubenswrapper[5025]: I1007 08:24:38.752187 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "d11df844-49f0-4c9a-9de5-701e57c69685" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 07 08:24:38 crc kubenswrapper[5025]: I1007 08:24:38.758255 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d11df844-49f0-4c9a-9de5-701e57c69685-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "d11df844-49f0-4c9a-9de5-701e57c69685" (UID: "d11df844-49f0-4c9a-9de5-701e57c69685"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:24:38 crc kubenswrapper[5025]: I1007 08:24:38.838139 5025 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d11df844-49f0-4c9a-9de5-701e57c69685-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 07 08:24:38 crc kubenswrapper[5025]: I1007 08:24:38.838175 5025 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d11df844-49f0-4c9a-9de5-701e57c69685-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 07 08:24:38 crc kubenswrapper[5025]: I1007 08:24:38.838183 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gsmb2\" (UniqueName: \"kubernetes.io/projected/d11df844-49f0-4c9a-9de5-701e57c69685-kube-api-access-gsmb2\") on node \"crc\" DevicePath \"\"" Oct 07 08:24:38 crc kubenswrapper[5025]: I1007 08:24:38.838193 5025 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/d11df844-49f0-4c9a-9de5-701e57c69685-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 07 08:24:38 crc kubenswrapper[5025]: I1007 08:24:38.838202 5025 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d11df844-49f0-4c9a-9de5-701e57c69685-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 07 08:24:38 crc kubenswrapper[5025]: I1007 08:24:38.838210 5025 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d11df844-49f0-4c9a-9de5-701e57c69685-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 07 08:24:38 crc kubenswrapper[5025]: I1007 08:24:38.838219 5025 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d11df844-49f0-4c9a-9de5-701e57c69685-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 07 08:24:39 crc kubenswrapper[5025]: I1007 08:24:39.501806 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" event={"ID":"d11df844-49f0-4c9a-9de5-701e57c69685","Type":"ContainerDied","Data":"7943a811149f7a376febf6be333a7a353440cf97170741c4139204be5e89a6b5"} Oct 07 08:24:39 crc kubenswrapper[5025]: I1007 08:24:39.501874 5025 scope.go:117] "RemoveContainer" containerID="ab6e635ad109e876d99329499bb911c8278b3a342759918efaf56a5c7ed56c46" Oct 07 08:24:39 crc kubenswrapper[5025]: I1007 08:24:39.501910 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-2r94k" Oct 07 08:24:39 crc kubenswrapper[5025]: I1007 08:24:39.553637 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2r94k"] Oct 07 08:24:39 crc kubenswrapper[5025]: I1007 08:24:39.559123 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2r94k"] Oct 07 08:24:39 crc kubenswrapper[5025]: I1007 08:24:39.938475 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d11df844-49f0-4c9a-9de5-701e57c69685" path="/var/lib/kubelet/pods/d11df844-49f0-4c9a-9de5-701e57c69685/volumes" Oct 07 08:25:55 crc kubenswrapper[5025]: I1007 08:25:55.934056 5025 patch_prober.go:28] interesting pod/machine-config-daemon-2dj2t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 08:25:55 crc kubenswrapper[5025]: I1007 08:25:55.934699 5025 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 08:26:25 crc kubenswrapper[5025]: I1007 08:26:25.933983 5025 patch_prober.go:28] interesting pod/machine-config-daemon-2dj2t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 08:26:25 crc kubenswrapper[5025]: I1007 08:26:25.934939 5025 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" 
podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 08:26:55 crc kubenswrapper[5025]: I1007 08:26:55.934891 5025 patch_prober.go:28] interesting pod/machine-config-daemon-2dj2t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 08:26:55 crc kubenswrapper[5025]: I1007 08:26:55.935747 5025 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 08:26:55 crc kubenswrapper[5025]: I1007 08:26:55.935801 5025 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" Oct 07 08:26:55 crc kubenswrapper[5025]: I1007 08:26:55.936501 5025 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4bbebcd2746f0ec67146ae8e43e7aa013e115cc6aa41a0609e38fed295949dae"} pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 08:26:55 crc kubenswrapper[5025]: I1007 08:26:55.936619 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" containerID="cri-o://4bbebcd2746f0ec67146ae8e43e7aa013e115cc6aa41a0609e38fed295949dae" gracePeriod=600 Oct 07 
08:26:56 crc kubenswrapper[5025]: I1007 08:26:56.527736 5025 generic.go:334] "Generic (PLEG): container finished" podID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerID="4bbebcd2746f0ec67146ae8e43e7aa013e115cc6aa41a0609e38fed295949dae" exitCode=0 Oct 07 08:26:56 crc kubenswrapper[5025]: I1007 08:26:56.527807 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" event={"ID":"b4849c41-22e1-400e-8e11-096da49ef1b2","Type":"ContainerDied","Data":"4bbebcd2746f0ec67146ae8e43e7aa013e115cc6aa41a0609e38fed295949dae"} Oct 07 08:26:56 crc kubenswrapper[5025]: I1007 08:26:56.528513 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" event={"ID":"b4849c41-22e1-400e-8e11-096da49ef1b2","Type":"ContainerStarted","Data":"e73f6c9680aced0c67b362a72f57135c1bc51cd7c5425767d6e7d7481054ac9b"} Oct 07 08:26:56 crc kubenswrapper[5025]: I1007 08:26:56.528570 5025 scope.go:117] "RemoveContainer" containerID="b2ddfe2fe8e8b2bfab09bd0fa5fc060581707876ce97a518502d1ad38c21cc8d" Oct 07 08:27:45 crc kubenswrapper[5025]: I1007 08:27:45.912483 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-m545c"] Oct 07 08:27:45 crc kubenswrapper[5025]: E1007 08:27:45.914034 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d11df844-49f0-4c9a-9de5-701e57c69685" containerName="registry" Oct 07 08:27:45 crc kubenswrapper[5025]: I1007 08:27:45.914051 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="d11df844-49f0-4c9a-9de5-701e57c69685" containerName="registry" Oct 07 08:27:45 crc kubenswrapper[5025]: I1007 08:27:45.914130 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="d11df844-49f0-4c9a-9de5-701e57c69685" containerName="registry" Oct 07 08:27:45 crc kubenswrapper[5025]: I1007 08:27:45.914699 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-m545c" Oct 07 08:27:45 crc kubenswrapper[5025]: I1007 08:27:45.919190 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Oct 07 08:27:45 crc kubenswrapper[5025]: I1007 08:27:45.922737 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Oct 07 08:27:45 crc kubenswrapper[5025]: I1007 08:27:45.922927 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Oct 07 08:27:45 crc kubenswrapper[5025]: I1007 08:27:45.926718 5025 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-v9ccp" Oct 07 08:27:45 crc kubenswrapper[5025]: I1007 08:27:45.947836 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-m545c"] Oct 07 08:27:46 crc kubenswrapper[5025]: I1007 08:27:46.035892 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/0060b25b-92cb-4f4f-b84b-c206e80fae98-crc-storage\") pod \"crc-storage-crc-m545c\" (UID: \"0060b25b-92cb-4f4f-b84b-c206e80fae98\") " pod="crc-storage/crc-storage-crc-m545c" Oct 07 08:27:46 crc kubenswrapper[5025]: I1007 08:27:46.035957 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/0060b25b-92cb-4f4f-b84b-c206e80fae98-node-mnt\") pod \"crc-storage-crc-m545c\" (UID: \"0060b25b-92cb-4f4f-b84b-c206e80fae98\") " pod="crc-storage/crc-storage-crc-m545c" Oct 07 08:27:46 crc kubenswrapper[5025]: I1007 08:27:46.035998 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhgrs\" (UniqueName: \"kubernetes.io/projected/0060b25b-92cb-4f4f-b84b-c206e80fae98-kube-api-access-fhgrs\") pod \"crc-storage-crc-m545c\" (UID: 
\"0060b25b-92cb-4f4f-b84b-c206e80fae98\") " pod="crc-storage/crc-storage-crc-m545c" Oct 07 08:27:46 crc kubenswrapper[5025]: I1007 08:27:46.137731 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/0060b25b-92cb-4f4f-b84b-c206e80fae98-crc-storage\") pod \"crc-storage-crc-m545c\" (UID: \"0060b25b-92cb-4f4f-b84b-c206e80fae98\") " pod="crc-storage/crc-storage-crc-m545c" Oct 07 08:27:46 crc kubenswrapper[5025]: I1007 08:27:46.138099 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/0060b25b-92cb-4f4f-b84b-c206e80fae98-node-mnt\") pod \"crc-storage-crc-m545c\" (UID: \"0060b25b-92cb-4f4f-b84b-c206e80fae98\") " pod="crc-storage/crc-storage-crc-m545c" Oct 07 08:27:46 crc kubenswrapper[5025]: I1007 08:27:46.138253 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhgrs\" (UniqueName: \"kubernetes.io/projected/0060b25b-92cb-4f4f-b84b-c206e80fae98-kube-api-access-fhgrs\") pod \"crc-storage-crc-m545c\" (UID: \"0060b25b-92cb-4f4f-b84b-c206e80fae98\") " pod="crc-storage/crc-storage-crc-m545c" Oct 07 08:27:46 crc kubenswrapper[5025]: I1007 08:27:46.138530 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/0060b25b-92cb-4f4f-b84b-c206e80fae98-node-mnt\") pod \"crc-storage-crc-m545c\" (UID: \"0060b25b-92cb-4f4f-b84b-c206e80fae98\") " pod="crc-storage/crc-storage-crc-m545c" Oct 07 08:27:46 crc kubenswrapper[5025]: I1007 08:27:46.139844 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/0060b25b-92cb-4f4f-b84b-c206e80fae98-crc-storage\") pod \"crc-storage-crc-m545c\" (UID: \"0060b25b-92cb-4f4f-b84b-c206e80fae98\") " pod="crc-storage/crc-storage-crc-m545c" Oct 07 08:27:46 crc kubenswrapper[5025]: I1007 08:27:46.173132 5025 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhgrs\" (UniqueName: \"kubernetes.io/projected/0060b25b-92cb-4f4f-b84b-c206e80fae98-kube-api-access-fhgrs\") pod \"crc-storage-crc-m545c\" (UID: \"0060b25b-92cb-4f4f-b84b-c206e80fae98\") " pod="crc-storage/crc-storage-crc-m545c" Oct 07 08:27:46 crc kubenswrapper[5025]: I1007 08:27:46.234840 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-m545c" Oct 07 08:27:46 crc kubenswrapper[5025]: I1007 08:27:46.426360 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-m545c"] Oct 07 08:27:46 crc kubenswrapper[5025]: W1007 08:27:46.445527 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0060b25b_92cb_4f4f_b84b_c206e80fae98.slice/crio-45ebb40f13bfc2f0f334f5140e9831abcbabe6939a0132b9dd8e0075367041e6 WatchSource:0}: Error finding container 45ebb40f13bfc2f0f334f5140e9831abcbabe6939a0132b9dd8e0075367041e6: Status 404 returned error can't find the container with id 45ebb40f13bfc2f0f334f5140e9831abcbabe6939a0132b9dd8e0075367041e6 Oct 07 08:27:46 crc kubenswrapper[5025]: I1007 08:27:46.449031 5025 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 08:27:46 crc kubenswrapper[5025]: I1007 08:27:46.892534 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-m545c" event={"ID":"0060b25b-92cb-4f4f-b84b-c206e80fae98","Type":"ContainerStarted","Data":"45ebb40f13bfc2f0f334f5140e9831abcbabe6939a0132b9dd8e0075367041e6"} Oct 07 08:27:47 crc kubenswrapper[5025]: I1007 08:27:47.899901 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-m545c" event={"ID":"0060b25b-92cb-4f4f-b84b-c206e80fae98","Type":"ContainerStarted","Data":"f6ac45e08e62a7ae334313e44b573c187b48ad1eb8cec81e79600d6caf011f2f"} Oct 07 08:27:47 crc 
kubenswrapper[5025]: I1007 08:27:47.920383 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="crc-storage/crc-storage-crc-m545c" podStartSLOduration=1.634765926 podStartE2EDuration="2.920357605s" podCreationTimestamp="2025-10-07 08:27:45 +0000 UTC" firstStartedPulling="2025-10-07 08:27:46.448808933 +0000 UTC m=+673.258123077" lastFinishedPulling="2025-10-07 08:27:47.734400602 +0000 UTC m=+674.543714756" observedRunningTime="2025-10-07 08:27:47.915905824 +0000 UTC m=+674.725219988" watchObservedRunningTime="2025-10-07 08:27:47.920357605 +0000 UTC m=+674.729671759" Oct 07 08:27:48 crc kubenswrapper[5025]: I1007 08:27:48.909148 5025 generic.go:334] "Generic (PLEG): container finished" podID="0060b25b-92cb-4f4f-b84b-c206e80fae98" containerID="f6ac45e08e62a7ae334313e44b573c187b48ad1eb8cec81e79600d6caf011f2f" exitCode=0 Oct 07 08:27:48 crc kubenswrapper[5025]: I1007 08:27:48.909230 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-m545c" event={"ID":"0060b25b-92cb-4f4f-b84b-c206e80fae98","Type":"ContainerDied","Data":"f6ac45e08e62a7ae334313e44b573c187b48ad1eb8cec81e79600d6caf011f2f"} Oct 07 08:27:50 crc kubenswrapper[5025]: I1007 08:27:50.209066 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-m545c" Oct 07 08:27:50 crc kubenswrapper[5025]: I1007 08:27:50.224379 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhgrs\" (UniqueName: \"kubernetes.io/projected/0060b25b-92cb-4f4f-b84b-c206e80fae98-kube-api-access-fhgrs\") pod \"0060b25b-92cb-4f4f-b84b-c206e80fae98\" (UID: \"0060b25b-92cb-4f4f-b84b-c206e80fae98\") " Oct 07 08:27:50 crc kubenswrapper[5025]: I1007 08:27:50.224450 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/0060b25b-92cb-4f4f-b84b-c206e80fae98-node-mnt\") pod \"0060b25b-92cb-4f4f-b84b-c206e80fae98\" (UID: \"0060b25b-92cb-4f4f-b84b-c206e80fae98\") " Oct 07 08:27:50 crc kubenswrapper[5025]: I1007 08:27:50.224536 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/0060b25b-92cb-4f4f-b84b-c206e80fae98-crc-storage\") pod \"0060b25b-92cb-4f4f-b84b-c206e80fae98\" (UID: \"0060b25b-92cb-4f4f-b84b-c206e80fae98\") " Oct 07 08:27:50 crc kubenswrapper[5025]: I1007 08:27:50.224654 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0060b25b-92cb-4f4f-b84b-c206e80fae98-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "0060b25b-92cb-4f4f-b84b-c206e80fae98" (UID: "0060b25b-92cb-4f4f-b84b-c206e80fae98"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 08:27:50 crc kubenswrapper[5025]: I1007 08:27:50.228815 5025 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/0060b25b-92cb-4f4f-b84b-c206e80fae98-node-mnt\") on node \"crc\" DevicePath \"\"" Oct 07 08:27:50 crc kubenswrapper[5025]: I1007 08:27:50.234838 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0060b25b-92cb-4f4f-b84b-c206e80fae98-kube-api-access-fhgrs" (OuterVolumeSpecName: "kube-api-access-fhgrs") pod "0060b25b-92cb-4f4f-b84b-c206e80fae98" (UID: "0060b25b-92cb-4f4f-b84b-c206e80fae98"). InnerVolumeSpecName "kube-api-access-fhgrs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:27:50 crc kubenswrapper[5025]: I1007 08:27:50.248753 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0060b25b-92cb-4f4f-b84b-c206e80fae98-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "0060b25b-92cb-4f4f-b84b-c206e80fae98" (UID: "0060b25b-92cb-4f4f-b84b-c206e80fae98"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:27:50 crc kubenswrapper[5025]: I1007 08:27:50.329904 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhgrs\" (UniqueName: \"kubernetes.io/projected/0060b25b-92cb-4f4f-b84b-c206e80fae98-kube-api-access-fhgrs\") on node \"crc\" DevicePath \"\"" Oct 07 08:27:50 crc kubenswrapper[5025]: I1007 08:27:50.330219 5025 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/0060b25b-92cb-4f4f-b84b-c206e80fae98-crc-storage\") on node \"crc\" DevicePath \"\"" Oct 07 08:27:50 crc kubenswrapper[5025]: I1007 08:27:50.943276 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-m545c" event={"ID":"0060b25b-92cb-4f4f-b84b-c206e80fae98","Type":"ContainerDied","Data":"45ebb40f13bfc2f0f334f5140e9831abcbabe6939a0132b9dd8e0075367041e6"} Oct 07 08:27:50 crc kubenswrapper[5025]: I1007 08:27:50.943344 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45ebb40f13bfc2f0f334f5140e9831abcbabe6939a0132b9dd8e0075367041e6" Oct 07 08:27:50 crc kubenswrapper[5025]: I1007 08:27:50.943368 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-m545c" Oct 07 08:27:55 crc kubenswrapper[5025]: I1007 08:27:55.679472 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-dwm22"] Oct 07 08:27:55 crc kubenswrapper[5025]: I1007 08:27:55.680280 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerName="ovn-controller" containerID="cri-o://81fc1872da168050a2483e46a2ce55f2f745eca15c901a77783a830609f6c0fe" gracePeriod=30 Oct 07 08:27:55 crc kubenswrapper[5025]: I1007 08:27:55.680434 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://d54834b1ba5650aa397affa83e56e871bbbea4b8c383627ae00bea648595864f" gracePeriod=30 Oct 07 08:27:55 crc kubenswrapper[5025]: I1007 08:27:55.680393 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerName="northd" containerID="cri-o://4aca8d824b0935ba844137095ac849a38a15dc5c0dc808fae28afbff543f6429" gracePeriod=30 Oct 07 08:27:55 crc kubenswrapper[5025]: I1007 08:27:55.680477 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerName="kube-rbac-proxy-node" containerID="cri-o://602fe6537fb9b526977c2d6ae30458382313c2b5f1506e6b7ffe684ffe5be2cf" gracePeriod=30 Oct 07 08:27:55 crc kubenswrapper[5025]: I1007 08:27:55.680520 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerName="ovn-acl-logging" 
containerID="cri-o://e826406584133efe2a695c4c7ff16dc6919a1ba0cda077273f6e1eafbf0f5c3f" gracePeriod=30 Oct 07 08:27:55 crc kubenswrapper[5025]: I1007 08:27:55.680596 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerName="sbdb" containerID="cri-o://d26e633e0a41a5f44c577b8e1a9494d58bb84b49015a7612684fc81e5872fd20" gracePeriod=30 Oct 07 08:27:55 crc kubenswrapper[5025]: I1007 08:27:55.680665 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerName="nbdb" containerID="cri-o://aca416ed3fc89df9948a2f7b02101dfb3bae8d8b22b7c4d270559a4cb6895dc4" gracePeriod=30 Oct 07 08:27:55 crc kubenswrapper[5025]: I1007 08:27:55.730970 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerName="ovnkube-controller" containerID="cri-o://81125d0541fed3ecceacb54cd47d20ae6cf3ad43375daaf49fbf813859f4f2e9" gracePeriod=30 Oct 07 08:27:55 crc kubenswrapper[5025]: E1007 08:27:55.803384 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="aca416ed3fc89df9948a2f7b02101dfb3bae8d8b22b7c4d270559a4cb6895dc4" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Oct 07 08:27:55 crc kubenswrapper[5025]: E1007 08:27:55.808249 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="aca416ed3fc89df9948a2f7b02101dfb3bae8d8b22b7c4d270559a4cb6895dc4" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Oct 07 08:27:55 crc kubenswrapper[5025]: E1007 08:27:55.809715 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="aca416ed3fc89df9948a2f7b02101dfb3bae8d8b22b7c4d270559a4cb6895dc4" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Oct 07 08:27:55 crc kubenswrapper[5025]: E1007 08:27:55.809757 5025 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerName="nbdb" Oct 07 08:27:55 crc kubenswrapper[5025]: E1007 08:27:55.809767 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d26e633e0a41a5f44c577b8e1a9494d58bb84b49015a7612684fc81e5872fd20" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Oct 07 08:27:55 crc kubenswrapper[5025]: E1007 08:27:55.813634 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d26e633e0a41a5f44c577b8e1a9494d58bb84b49015a7612684fc81e5872fd20" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Oct 07 08:27:55 crc kubenswrapper[5025]: E1007 08:27:55.814992 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d26e633e0a41a5f44c577b8e1a9494d58bb84b49015a7612684fc81e5872fd20" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Oct 07 08:27:55 crc kubenswrapper[5025]: E1007 08:27:55.815025 5025 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerName="sbdb" Oct 07 08:27:55 crc kubenswrapper[5025]: I1007 08:27:55.997353 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dwm22_8b6b9c75-ecfe-4815-b279-bb56f57a82a8/ovnkube-controller/3.log" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.000298 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dwm22_8b6b9c75-ecfe-4815-b279-bb56f57a82a8/ovn-acl-logging/0.log" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.000881 5025 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dwm22_8b6b9c75-ecfe-4815-b279-bb56f57a82a8/ovn-controller/0.log" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.004410 5025 generic.go:334] "Generic (PLEG): container finished" podID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerID="81125d0541fed3ecceacb54cd47d20ae6cf3ad43375daaf49fbf813859f4f2e9" exitCode=0 Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.004448 5025 generic.go:334] "Generic (PLEG): container finished" podID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerID="d26e633e0a41a5f44c577b8e1a9494d58bb84b49015a7612684fc81e5872fd20" exitCode=0 Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.004461 5025 generic.go:334] "Generic (PLEG): container finished" podID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerID="d54834b1ba5650aa397affa83e56e871bbbea4b8c383627ae00bea648595864f" exitCode=0 Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.004470 5025 generic.go:334] "Generic (PLEG): container finished" podID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerID="602fe6537fb9b526977c2d6ae30458382313c2b5f1506e6b7ffe684ffe5be2cf" exitCode=0 Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.004482 5025 generic.go:334] "Generic (PLEG): container finished" podID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerID="e826406584133efe2a695c4c7ff16dc6919a1ba0cda077273f6e1eafbf0f5c3f" exitCode=143 Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.004495 5025 generic.go:334] "Generic (PLEG): container finished" podID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerID="81fc1872da168050a2483e46a2ce55f2f745eca15c901a77783a830609f6c0fe" exitCode=143 Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.004508 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" event={"ID":"8b6b9c75-ecfe-4815-b279-bb56f57a82a8","Type":"ContainerDied","Data":"81125d0541fed3ecceacb54cd47d20ae6cf3ad43375daaf49fbf813859f4f2e9"} Oct 07 08:27:56 crc 
kubenswrapper[5025]: I1007 08:27:56.004586 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" event={"ID":"8b6b9c75-ecfe-4815-b279-bb56f57a82a8","Type":"ContainerDied","Data":"d26e633e0a41a5f44c577b8e1a9494d58bb84b49015a7612684fc81e5872fd20"} Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.004598 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" event={"ID":"8b6b9c75-ecfe-4815-b279-bb56f57a82a8","Type":"ContainerDied","Data":"d54834b1ba5650aa397affa83e56e871bbbea4b8c383627ae00bea648595864f"} Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.004607 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" event={"ID":"8b6b9c75-ecfe-4815-b279-bb56f57a82a8","Type":"ContainerDied","Data":"602fe6537fb9b526977c2d6ae30458382313c2b5f1506e6b7ffe684ffe5be2cf"} Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.004615 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" event={"ID":"8b6b9c75-ecfe-4815-b279-bb56f57a82a8","Type":"ContainerDied","Data":"e826406584133efe2a695c4c7ff16dc6919a1ba0cda077273f6e1eafbf0f5c3f"} Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.004623 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" event={"ID":"8b6b9c75-ecfe-4815-b279-bb56f57a82a8","Type":"ContainerDied","Data":"81fc1872da168050a2483e46a2ce55f2f745eca15c901a77783a830609f6c0fe"} Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.004641 5025 scope.go:117] "RemoveContainer" containerID="49dfbab16b681fd6273294b7ececcd31c7feb0d79b66129ceb05740c3a478abe" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.008305 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xmhw6_34b07a69-1bbf-4019-b824-7b5be0f9404d/kube-multus/2.log" Oct 07 08:27:56 crc kubenswrapper[5025]: 
I1007 08:27:56.008665 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xmhw6_34b07a69-1bbf-4019-b824-7b5be0f9404d/kube-multus/1.log" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.008696 5025 generic.go:334] "Generic (PLEG): container finished" podID="34b07a69-1bbf-4019-b824-7b5be0f9404d" containerID="74509613ab9ea3b6e4feae1711cea3c8b8082ee266d8f3e819add82c25e9eec3" exitCode=2 Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.008723 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xmhw6" event={"ID":"34b07a69-1bbf-4019-b824-7b5be0f9404d","Type":"ContainerDied","Data":"74509613ab9ea3b6e4feae1711cea3c8b8082ee266d8f3e819add82c25e9eec3"} Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.009152 5025 scope.go:117] "RemoveContainer" containerID="74509613ab9ea3b6e4feae1711cea3c8b8082ee266d8f3e819add82c25e9eec3" Oct 07 08:27:56 crc kubenswrapper[5025]: E1007 08:27:56.009416 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-xmhw6_openshift-multus(34b07a69-1bbf-4019-b824-7b5be0f9404d)\"" pod="openshift-multus/multus-xmhw6" podUID="34b07a69-1bbf-4019-b824-7b5be0f9404d" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.037470 5025 scope.go:117] "RemoveContainer" containerID="79252af7f0e70293c6c57e8731796bdf2108b3074f9cc3a5627c41eb246032c6" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.052607 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dwm22_8b6b9c75-ecfe-4815-b279-bb56f57a82a8/ovn-acl-logging/0.log" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.053063 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dwm22_8b6b9c75-ecfe-4815-b279-bb56f57a82a8/ovn-controller/0.log" Oct 07 08:27:56 crc kubenswrapper[5025]: 
I1007 08:27:56.054075 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.107164 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-llbmv"] Oct 07 08:27:56 crc kubenswrapper[5025]: E1007 08:27:56.107391 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerName="ovnkube-controller" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.107414 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerName="ovnkube-controller" Oct 07 08:27:56 crc kubenswrapper[5025]: E1007 08:27:56.107425 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerName="ovn-controller" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.107430 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerName="ovn-controller" Oct 07 08:27:56 crc kubenswrapper[5025]: E1007 08:27:56.107437 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerName="kube-rbac-proxy-node" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.107443 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerName="kube-rbac-proxy-node" Oct 07 08:27:56 crc kubenswrapper[5025]: E1007 08:27:56.107452 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerName="ovn-acl-logging" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.107458 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerName="ovn-acl-logging" Oct 07 08:27:56 crc kubenswrapper[5025]: E1007 08:27:56.107468 5025 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerName="northd" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.107474 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerName="northd" Oct 07 08:27:56 crc kubenswrapper[5025]: E1007 08:27:56.107484 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerName="kubecfg-setup" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.107490 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerName="kubecfg-setup" Oct 07 08:27:56 crc kubenswrapper[5025]: E1007 08:27:56.107501 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerName="ovnkube-controller" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.107507 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerName="ovnkube-controller" Oct 07 08:27:56 crc kubenswrapper[5025]: E1007 08:27:56.107515 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0060b25b-92cb-4f4f-b84b-c206e80fae98" containerName="storage" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.107520 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="0060b25b-92cb-4f4f-b84b-c206e80fae98" containerName="storage" Oct 07 08:27:56 crc kubenswrapper[5025]: E1007 08:27:56.107527 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerName="ovnkube-controller" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.107532 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerName="ovnkube-controller" Oct 07 08:27:56 crc kubenswrapper[5025]: E1007 08:27:56.107554 5025 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerName="nbdb" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.107560 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerName="nbdb" Oct 07 08:27:56 crc kubenswrapper[5025]: E1007 08:27:56.107571 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerName="ovnkube-controller" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.107578 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerName="ovnkube-controller" Oct 07 08:27:56 crc kubenswrapper[5025]: E1007 08:27:56.107584 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerName="kube-rbac-proxy-ovn-metrics" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.107589 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerName="kube-rbac-proxy-ovn-metrics" Oct 07 08:27:56 crc kubenswrapper[5025]: E1007 08:27:56.107596 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerName="sbdb" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.107601 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerName="sbdb" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.107693 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerName="sbdb" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.107706 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerName="ovnkube-controller" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.107713 5025 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerName="ovnkube-controller" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.107720 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerName="nbdb" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.107727 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerName="northd" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.107736 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerName="ovn-acl-logging" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.107742 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerName="kube-rbac-proxy-node" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.107751 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="0060b25b-92cb-4f4f-b84b-c206e80fae98" containerName="storage" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.107760 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerName="ovn-controller" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.107767 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerName="kube-rbac-proxy-ovn-metrics" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.107776 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerName="ovnkube-controller" Oct 07 08:27:56 crc kubenswrapper[5025]: E1007 08:27:56.107859 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerName="ovnkube-controller" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.107867 5025 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerName="ovnkube-controller" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.107942 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerName="ovnkube-controller" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.108108 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerName="ovnkube-controller" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.109438 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.118289 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-run-systemd\") pod \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.118347 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-host-cni-netd\") pod \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.118378 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-systemd-units\") pod \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.118392 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-run-ovn\") 
pod \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.118417 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.118442 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-ovnkube-config\") pod \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.118464 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-host-slash\") pod \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.118487 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-ovnkube-script-lib\") pod \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.118503 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-host-cni-bin\") pod \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.118553 5025 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-log-socket\") pod \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.118575 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-ovn-node-metrics-cert\") pod \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.118591 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-host-kubelet\") pod \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.118617 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-node-log\") pod \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.118639 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-run-openvswitch\") pod \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.118656 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-env-overrides\") pod \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\" (UID: 
\"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.118672 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpm25\" (UniqueName: \"kubernetes.io/projected/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-kube-api-access-zpm25\") pod \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.118697 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-var-lib-openvswitch\") pod \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.118717 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-host-run-ovn-kubernetes\") pod \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.118734 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-host-run-netns\") pod \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.118753 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-etc-openvswitch\") pod \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\" (UID: \"8b6b9c75-ecfe-4815-b279-bb56f57a82a8\") " Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.118870 5025 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/14b4b7dd-3970-480d-9aae-8c1e58cc2316-host-cni-netd\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.118902 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/14b4b7dd-3970-480d-9aae-8c1e58cc2316-run-systemd\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.118924 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/14b4b7dd-3970-480d-9aae-8c1e58cc2316-run-openvswitch\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.118939 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/14b4b7dd-3970-480d-9aae-8c1e58cc2316-log-socket\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.118956 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/14b4b7dd-3970-480d-9aae-8c1e58cc2316-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: 
I1007 08:27:56.118972 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/14b4b7dd-3970-480d-9aae-8c1e58cc2316-host-slash\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.118994 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/14b4b7dd-3970-480d-9aae-8c1e58cc2316-host-run-ovn-kubernetes\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.119010 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/14b4b7dd-3970-480d-9aae-8c1e58cc2316-node-log\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.119029 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/14b4b7dd-3970-480d-9aae-8c1e58cc2316-run-ovn\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.119052 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/14b4b7dd-3970-480d-9aae-8c1e58cc2316-ovn-node-metrics-cert\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc 
kubenswrapper[5025]: I1007 08:27:56.119067 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/14b4b7dd-3970-480d-9aae-8c1e58cc2316-env-overrides\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.119083 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/14b4b7dd-3970-480d-9aae-8c1e58cc2316-ovnkube-config\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.119099 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/14b4b7dd-3970-480d-9aae-8c1e58cc2316-ovnkube-script-lib\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.119112 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/14b4b7dd-3970-480d-9aae-8c1e58cc2316-host-cni-bin\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.119125 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/14b4b7dd-3970-480d-9aae-8c1e58cc2316-host-run-netns\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" 
Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.119146 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/14b4b7dd-3970-480d-9aae-8c1e58cc2316-etc-openvswitch\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.119167 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf946\" (UniqueName: \"kubernetes.io/projected/14b4b7dd-3970-480d-9aae-8c1e58cc2316-kube-api-access-jf946\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.119187 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/14b4b7dd-3970-480d-9aae-8c1e58cc2316-systemd-units\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.119202 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/14b4b7dd-3970-480d-9aae-8c1e58cc2316-var-lib-openvswitch\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.119225 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/14b4b7dd-3970-480d-9aae-8c1e58cc2316-host-kubelet\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.119306 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "8b6b9c75-ecfe-4815-b279-bb56f57a82a8" (UID: "8b6b9c75-ecfe-4815-b279-bb56f57a82a8"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.119332 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "8b6b9c75-ecfe-4815-b279-bb56f57a82a8" (UID: "8b6b9c75-ecfe-4815-b279-bb56f57a82a8"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.119349 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "8b6b9c75-ecfe-4815-b279-bb56f57a82a8" (UID: "8b6b9c75-ecfe-4815-b279-bb56f57a82a8"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.119367 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "8b6b9c75-ecfe-4815-b279-bb56f57a82a8" (UID: "8b6b9c75-ecfe-4815-b279-bb56f57a82a8"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.119780 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "8b6b9c75-ecfe-4815-b279-bb56f57a82a8" (UID: "8b6b9c75-ecfe-4815-b279-bb56f57a82a8"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.119812 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-log-socket" (OuterVolumeSpecName: "log-socket") pod "8b6b9c75-ecfe-4815-b279-bb56f57a82a8" (UID: "8b6b9c75-ecfe-4815-b279-bb56f57a82a8"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.119848 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-host-slash" (OuterVolumeSpecName: "host-slash") pod "8b6b9c75-ecfe-4815-b279-bb56f57a82a8" (UID: "8b6b9c75-ecfe-4815-b279-bb56f57a82a8"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.119933 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "8b6b9c75-ecfe-4815-b279-bb56f57a82a8" (UID: "8b6b9c75-ecfe-4815-b279-bb56f57a82a8"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.120212 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "8b6b9c75-ecfe-4815-b279-bb56f57a82a8" (UID: "8b6b9c75-ecfe-4815-b279-bb56f57a82a8"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.120237 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "8b6b9c75-ecfe-4815-b279-bb56f57a82a8" (UID: "8b6b9c75-ecfe-4815-b279-bb56f57a82a8"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.120878 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "8b6b9c75-ecfe-4815-b279-bb56f57a82a8" (UID: "8b6b9c75-ecfe-4815-b279-bb56f57a82a8"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.120901 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "8b6b9c75-ecfe-4815-b279-bb56f57a82a8" (UID: "8b6b9c75-ecfe-4815-b279-bb56f57a82a8"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.120919 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "8b6b9c75-ecfe-4815-b279-bb56f57a82a8" (UID: "8b6b9c75-ecfe-4815-b279-bb56f57a82a8"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.120935 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "8b6b9c75-ecfe-4815-b279-bb56f57a82a8" (UID: "8b6b9c75-ecfe-4815-b279-bb56f57a82a8"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.120957 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-node-log" (OuterVolumeSpecName: "node-log") pod "8b6b9c75-ecfe-4815-b279-bb56f57a82a8" (UID: "8b6b9c75-ecfe-4815-b279-bb56f57a82a8"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.120973 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "8b6b9c75-ecfe-4815-b279-bb56f57a82a8" (UID: "8b6b9c75-ecfe-4815-b279-bb56f57a82a8"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.120993 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "8b6b9c75-ecfe-4815-b279-bb56f57a82a8" (UID: "8b6b9c75-ecfe-4815-b279-bb56f57a82a8"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.125443 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "8b6b9c75-ecfe-4815-b279-bb56f57a82a8" (UID: "8b6b9c75-ecfe-4815-b279-bb56f57a82a8"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.127031 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-kube-api-access-zpm25" (OuterVolumeSpecName: "kube-api-access-zpm25") pod "8b6b9c75-ecfe-4815-b279-bb56f57a82a8" (UID: "8b6b9c75-ecfe-4815-b279-bb56f57a82a8"). InnerVolumeSpecName "kube-api-access-zpm25". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.139036 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "8b6b9c75-ecfe-4815-b279-bb56f57a82a8" (UID: "8b6b9c75-ecfe-4815-b279-bb56f57a82a8"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.220055 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/14b4b7dd-3970-480d-9aae-8c1e58cc2316-host-kubelet\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.220120 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/14b4b7dd-3970-480d-9aae-8c1e58cc2316-host-cni-netd\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.220144 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/14b4b7dd-3970-480d-9aae-8c1e58cc2316-run-systemd\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.220170 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/14b4b7dd-3970-480d-9aae-8c1e58cc2316-run-openvswitch\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.220167 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/14b4b7dd-3970-480d-9aae-8c1e58cc2316-host-kubelet\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: 
I1007 08:27:56.220192 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/14b4b7dd-3970-480d-9aae-8c1e58cc2316-host-slash\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.220224 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/14b4b7dd-3970-480d-9aae-8c1e58cc2316-host-cni-netd\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.220226 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/14b4b7dd-3970-480d-9aae-8c1e58cc2316-log-socket\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.220252 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/14b4b7dd-3970-480d-9aae-8c1e58cc2316-log-socket\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.220261 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/14b4b7dd-3970-480d-9aae-8c1e58cc2316-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.220279 5025 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/14b4b7dd-3970-480d-9aae-8c1e58cc2316-run-openvswitch\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.220292 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/14b4b7dd-3970-480d-9aae-8c1e58cc2316-node-log\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.220284 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/14b4b7dd-3970-480d-9aae-8c1e58cc2316-run-systemd\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.220310 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/14b4b7dd-3970-480d-9aae-8c1e58cc2316-host-run-ovn-kubernetes\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.220312 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/14b4b7dd-3970-480d-9aae-8c1e58cc2316-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.220319 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/14b4b7dd-3970-480d-9aae-8c1e58cc2316-host-slash\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.220352 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/14b4b7dd-3970-480d-9aae-8c1e58cc2316-node-log\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.220358 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/14b4b7dd-3970-480d-9aae-8c1e58cc2316-run-ovn\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.220343 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/14b4b7dd-3970-480d-9aae-8c1e58cc2316-host-run-ovn-kubernetes\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.220384 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/14b4b7dd-3970-480d-9aae-8c1e58cc2316-run-ovn\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.220447 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/14b4b7dd-3970-480d-9aae-8c1e58cc2316-ovn-node-metrics-cert\") pod 
\"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.220469 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/14b4b7dd-3970-480d-9aae-8c1e58cc2316-env-overrides\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.220489 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/14b4b7dd-3970-480d-9aae-8c1e58cc2316-ovnkube-config\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.220507 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/14b4b7dd-3970-480d-9aae-8c1e58cc2316-ovnkube-script-lib\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.220524 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/14b4b7dd-3970-480d-9aae-8c1e58cc2316-host-run-netns\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.220557 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/14b4b7dd-3970-480d-9aae-8c1e58cc2316-host-cni-bin\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.220595 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/14b4b7dd-3970-480d-9aae-8c1e58cc2316-etc-openvswitch\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.220620 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jf946\" (UniqueName: \"kubernetes.io/projected/14b4b7dd-3970-480d-9aae-8c1e58cc2316-kube-api-access-jf946\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.220653 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/14b4b7dd-3970-480d-9aae-8c1e58cc2316-systemd-units\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.220670 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/14b4b7dd-3970-480d-9aae-8c1e58cc2316-var-lib-openvswitch\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.220745 5025 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-node-log\") on node \"crc\" DevicePath \"\"" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.220754 5025 reconciler_common.go:293] "Volume detached for volume 
\"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.220763 5025 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.220774 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpm25\" (UniqueName: \"kubernetes.io/projected/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-kube-api-access-zpm25\") on node \"crc\" DevicePath \"\"" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.220783 5025 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.220792 5025 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.220801 5025 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.220809 5025 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.220819 5025 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.220828 5025 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.220837 5025 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.220844 5025 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.220853 5025 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.220861 5025 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.220869 5025 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-host-slash\") on node \"crc\" DevicePath \"\"" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.220880 5025 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-ovnkube-script-lib\") on node 
\"crc\" DevicePath \"\"" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.220888 5025 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.220896 5025 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-log-socket\") on node \"crc\" DevicePath \"\"" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.220904 5025 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.220915 5025 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8b6b9c75-ecfe-4815-b279-bb56f57a82a8-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.220943 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/14b4b7dd-3970-480d-9aae-8c1e58cc2316-var-lib-openvswitch\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.221060 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/14b4b7dd-3970-480d-9aae-8c1e58cc2316-env-overrides\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.221103 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/14b4b7dd-3970-480d-9aae-8c1e58cc2316-host-cni-bin\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.221432 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/14b4b7dd-3970-480d-9aae-8c1e58cc2316-ovnkube-config\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.221467 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/14b4b7dd-3970-480d-9aae-8c1e58cc2316-etc-openvswitch\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.221599 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/14b4b7dd-3970-480d-9aae-8c1e58cc2316-host-run-netns\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.221615 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/14b4b7dd-3970-480d-9aae-8c1e58cc2316-ovnkube-script-lib\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.221640 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/14b4b7dd-3970-480d-9aae-8c1e58cc2316-systemd-units\") 
pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.224403 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/14b4b7dd-3970-480d-9aae-8c1e58cc2316-ovn-node-metrics-cert\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.238006 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf946\" (UniqueName: \"kubernetes.io/projected/14b4b7dd-3970-480d-9aae-8c1e58cc2316-kube-api-access-jf946\") pod \"ovnkube-node-llbmv\" (UID: \"14b4b7dd-3970-480d-9aae-8c1e58cc2316\") " pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.424785 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.776491 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x"] Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.778299 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.784342 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.827574 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x\" (UID: \"e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.827609 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x\" (UID: \"e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.827682 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp5gj\" (UniqueName: \"kubernetes.io/projected/e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454-kube-api-access-kp5gj\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x\" (UID: \"e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.928999 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kp5gj\" (UniqueName: 
\"kubernetes.io/projected/e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454-kube-api-access-kp5gj\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x\" (UID: \"e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.929048 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x\" (UID: \"e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.929063 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x\" (UID: \"e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.930279 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x\" (UID: \"e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.930829 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x\" (UID: 
\"e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x" Oct 07 08:27:56 crc kubenswrapper[5025]: I1007 08:27:56.961925 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp5gj\" (UniqueName: \"kubernetes.io/projected/e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454-kube-api-access-kp5gj\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x\" (UID: \"e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x" Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.023130 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dwm22_8b6b9c75-ecfe-4815-b279-bb56f57a82a8/ovn-acl-logging/0.log" Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.024234 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dwm22_8b6b9c75-ecfe-4815-b279-bb56f57a82a8/ovn-controller/0.log" Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.024763 5025 generic.go:334] "Generic (PLEG): container finished" podID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerID="aca416ed3fc89df9948a2f7b02101dfb3bae8d8b22b7c4d270559a4cb6895dc4" exitCode=0 Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.024817 5025 generic.go:334] "Generic (PLEG): container finished" podID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" containerID="4aca8d824b0935ba844137095ac849a38a15dc5c0dc808fae28afbff543f6429" exitCode=0 Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.024903 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.024922 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" event={"ID":"8b6b9c75-ecfe-4815-b279-bb56f57a82a8","Type":"ContainerDied","Data":"aca416ed3fc89df9948a2f7b02101dfb3bae8d8b22b7c4d270559a4cb6895dc4"} Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.025014 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" event={"ID":"8b6b9c75-ecfe-4815-b279-bb56f57a82a8","Type":"ContainerDied","Data":"4aca8d824b0935ba844137095ac849a38a15dc5c0dc808fae28afbff543f6429"} Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.025048 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwm22" event={"ID":"8b6b9c75-ecfe-4815-b279-bb56f57a82a8","Type":"ContainerDied","Data":"15b2374e4b04e9cc206a6399ef07641377372a41d7c5a0d9cec0492d4db986e8"} Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.025095 5025 scope.go:117] "RemoveContainer" containerID="81125d0541fed3ecceacb54cd47d20ae6cf3ad43375daaf49fbf813859f4f2e9" Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.028152 5025 generic.go:334] "Generic (PLEG): container finished" podID="14b4b7dd-3970-480d-9aae-8c1e58cc2316" containerID="7a33e731b8da7b98657c9a424714b25c404504a29c691824d6015001a52512c9" exitCode=0 Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.028230 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" event={"ID":"14b4b7dd-3970-480d-9aae-8c1e58cc2316","Type":"ContainerDied","Data":"7a33e731b8da7b98657c9a424714b25c404504a29c691824d6015001a52512c9"} Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.028284 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" 
event={"ID":"14b4b7dd-3970-480d-9aae-8c1e58cc2316","Type":"ContainerStarted","Data":"24ddf3d419df2869204fc85bce617da6d5e310a9ab11d21d83c5fe032ed39ba2"} Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.035077 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xmhw6_34b07a69-1bbf-4019-b824-7b5be0f9404d/kube-multus/2.log" Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.067495 5025 scope.go:117] "RemoveContainer" containerID="d26e633e0a41a5f44c577b8e1a9494d58bb84b49015a7612684fc81e5872fd20" Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.104215 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x" Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.104837 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-dwm22"] Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.109030 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-dwm22"] Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.124506 5025 scope.go:117] "RemoveContainer" containerID="aca416ed3fc89df9948a2f7b02101dfb3bae8d8b22b7c4d270559a4cb6895dc4" Oct 07 08:27:57 crc kubenswrapper[5025]: E1007 08:27:57.144002 5025 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x_openshift-marketplace_e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454_0(e3c204d7b9b87d33bbdd85c1a0537492fa98816c89b80981655ae9f246d99e7c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Oct 07 08:27:57 crc kubenswrapper[5025]: E1007 08:27:57.144100 5025 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x_openshift-marketplace_e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454_0(e3c204d7b9b87d33bbdd85c1a0537492fa98816c89b80981655ae9f246d99e7c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x" Oct 07 08:27:57 crc kubenswrapper[5025]: E1007 08:27:57.144139 5025 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x_openshift-marketplace_e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454_0(e3c204d7b9b87d33bbdd85c1a0537492fa98816c89b80981655ae9f246d99e7c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x" Oct 07 08:27:57 crc kubenswrapper[5025]: E1007 08:27:57.144234 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x_openshift-marketplace(e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x_openshift-marketplace(e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x_openshift-marketplace_e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454_0(e3c204d7b9b87d33bbdd85c1a0537492fa98816c89b80981655ae9f246d99e7c): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x" podUID="e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454" Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.170651 5025 scope.go:117] "RemoveContainer" containerID="4aca8d824b0935ba844137095ac849a38a15dc5c0dc808fae28afbff543f6429" Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.194973 5025 scope.go:117] "RemoveContainer" containerID="d54834b1ba5650aa397affa83e56e871bbbea4b8c383627ae00bea648595864f" Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.210187 5025 scope.go:117] "RemoveContainer" containerID="602fe6537fb9b526977c2d6ae30458382313c2b5f1506e6b7ffe684ffe5be2cf" Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.221281 5025 scope.go:117] "RemoveContainer" containerID="e826406584133efe2a695c4c7ff16dc6919a1ba0cda077273f6e1eafbf0f5c3f" Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.238174 5025 scope.go:117] "RemoveContainer" containerID="81fc1872da168050a2483e46a2ce55f2f745eca15c901a77783a830609f6c0fe" Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.255653 5025 scope.go:117] "RemoveContainer" containerID="cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58" Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.273147 5025 scope.go:117] "RemoveContainer" containerID="81125d0541fed3ecceacb54cd47d20ae6cf3ad43375daaf49fbf813859f4f2e9" Oct 07 08:27:57 crc kubenswrapper[5025]: E1007 08:27:57.273761 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81125d0541fed3ecceacb54cd47d20ae6cf3ad43375daaf49fbf813859f4f2e9\": container with ID starting with 81125d0541fed3ecceacb54cd47d20ae6cf3ad43375daaf49fbf813859f4f2e9 not found: ID does not exist" containerID="81125d0541fed3ecceacb54cd47d20ae6cf3ad43375daaf49fbf813859f4f2e9" Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.273813 5025 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81125d0541fed3ecceacb54cd47d20ae6cf3ad43375daaf49fbf813859f4f2e9"} err="failed to get container status \"81125d0541fed3ecceacb54cd47d20ae6cf3ad43375daaf49fbf813859f4f2e9\": rpc error: code = NotFound desc = could not find container \"81125d0541fed3ecceacb54cd47d20ae6cf3ad43375daaf49fbf813859f4f2e9\": container with ID starting with 81125d0541fed3ecceacb54cd47d20ae6cf3ad43375daaf49fbf813859f4f2e9 not found: ID does not exist" Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.273846 5025 scope.go:117] "RemoveContainer" containerID="d26e633e0a41a5f44c577b8e1a9494d58bb84b49015a7612684fc81e5872fd20" Oct 07 08:27:57 crc kubenswrapper[5025]: E1007 08:27:57.274152 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d26e633e0a41a5f44c577b8e1a9494d58bb84b49015a7612684fc81e5872fd20\": container with ID starting with d26e633e0a41a5f44c577b8e1a9494d58bb84b49015a7612684fc81e5872fd20 not found: ID does not exist" containerID="d26e633e0a41a5f44c577b8e1a9494d58bb84b49015a7612684fc81e5872fd20" Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.274193 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d26e633e0a41a5f44c577b8e1a9494d58bb84b49015a7612684fc81e5872fd20"} err="failed to get container status \"d26e633e0a41a5f44c577b8e1a9494d58bb84b49015a7612684fc81e5872fd20\": rpc error: code = NotFound desc = could not find container \"d26e633e0a41a5f44c577b8e1a9494d58bb84b49015a7612684fc81e5872fd20\": container with ID starting with d26e633e0a41a5f44c577b8e1a9494d58bb84b49015a7612684fc81e5872fd20 not found: ID does not exist" Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.274223 5025 scope.go:117] "RemoveContainer" containerID="aca416ed3fc89df9948a2f7b02101dfb3bae8d8b22b7c4d270559a4cb6895dc4" Oct 07 08:27:57 crc kubenswrapper[5025]: E1007 08:27:57.274517 5025 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aca416ed3fc89df9948a2f7b02101dfb3bae8d8b22b7c4d270559a4cb6895dc4\": container with ID starting with aca416ed3fc89df9948a2f7b02101dfb3bae8d8b22b7c4d270559a4cb6895dc4 not found: ID does not exist" containerID="aca416ed3fc89df9948a2f7b02101dfb3bae8d8b22b7c4d270559a4cb6895dc4" Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.274564 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aca416ed3fc89df9948a2f7b02101dfb3bae8d8b22b7c4d270559a4cb6895dc4"} err="failed to get container status \"aca416ed3fc89df9948a2f7b02101dfb3bae8d8b22b7c4d270559a4cb6895dc4\": rpc error: code = NotFound desc = could not find container \"aca416ed3fc89df9948a2f7b02101dfb3bae8d8b22b7c4d270559a4cb6895dc4\": container with ID starting with aca416ed3fc89df9948a2f7b02101dfb3bae8d8b22b7c4d270559a4cb6895dc4 not found: ID does not exist" Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.274579 5025 scope.go:117] "RemoveContainer" containerID="4aca8d824b0935ba844137095ac849a38a15dc5c0dc808fae28afbff543f6429" Oct 07 08:27:57 crc kubenswrapper[5025]: E1007 08:27:57.274854 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4aca8d824b0935ba844137095ac849a38a15dc5c0dc808fae28afbff543f6429\": container with ID starting with 4aca8d824b0935ba844137095ac849a38a15dc5c0dc808fae28afbff543f6429 not found: ID does not exist" containerID="4aca8d824b0935ba844137095ac849a38a15dc5c0dc808fae28afbff543f6429" Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.274882 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4aca8d824b0935ba844137095ac849a38a15dc5c0dc808fae28afbff543f6429"} err="failed to get container status \"4aca8d824b0935ba844137095ac849a38a15dc5c0dc808fae28afbff543f6429\": rpc error: code = NotFound desc = could 
not find container \"4aca8d824b0935ba844137095ac849a38a15dc5c0dc808fae28afbff543f6429\": container with ID starting with 4aca8d824b0935ba844137095ac849a38a15dc5c0dc808fae28afbff543f6429 not found: ID does not exist" Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.274897 5025 scope.go:117] "RemoveContainer" containerID="d54834b1ba5650aa397affa83e56e871bbbea4b8c383627ae00bea648595864f" Oct 07 08:27:57 crc kubenswrapper[5025]: E1007 08:27:57.275128 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d54834b1ba5650aa397affa83e56e871bbbea4b8c383627ae00bea648595864f\": container with ID starting with d54834b1ba5650aa397affa83e56e871bbbea4b8c383627ae00bea648595864f not found: ID does not exist" containerID="d54834b1ba5650aa397affa83e56e871bbbea4b8c383627ae00bea648595864f" Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.275160 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d54834b1ba5650aa397affa83e56e871bbbea4b8c383627ae00bea648595864f"} err="failed to get container status \"d54834b1ba5650aa397affa83e56e871bbbea4b8c383627ae00bea648595864f\": rpc error: code = NotFound desc = could not find container \"d54834b1ba5650aa397affa83e56e871bbbea4b8c383627ae00bea648595864f\": container with ID starting with d54834b1ba5650aa397affa83e56e871bbbea4b8c383627ae00bea648595864f not found: ID does not exist" Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.275180 5025 scope.go:117] "RemoveContainer" containerID="602fe6537fb9b526977c2d6ae30458382313c2b5f1506e6b7ffe684ffe5be2cf" Oct 07 08:27:57 crc kubenswrapper[5025]: E1007 08:27:57.275411 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"602fe6537fb9b526977c2d6ae30458382313c2b5f1506e6b7ffe684ffe5be2cf\": container with ID starting with 602fe6537fb9b526977c2d6ae30458382313c2b5f1506e6b7ffe684ffe5be2cf not found: 
ID does not exist" containerID="602fe6537fb9b526977c2d6ae30458382313c2b5f1506e6b7ffe684ffe5be2cf" Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.275430 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"602fe6537fb9b526977c2d6ae30458382313c2b5f1506e6b7ffe684ffe5be2cf"} err="failed to get container status \"602fe6537fb9b526977c2d6ae30458382313c2b5f1506e6b7ffe684ffe5be2cf\": rpc error: code = NotFound desc = could not find container \"602fe6537fb9b526977c2d6ae30458382313c2b5f1506e6b7ffe684ffe5be2cf\": container with ID starting with 602fe6537fb9b526977c2d6ae30458382313c2b5f1506e6b7ffe684ffe5be2cf not found: ID does not exist" Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.275442 5025 scope.go:117] "RemoveContainer" containerID="e826406584133efe2a695c4c7ff16dc6919a1ba0cda077273f6e1eafbf0f5c3f" Oct 07 08:27:57 crc kubenswrapper[5025]: E1007 08:27:57.275681 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e826406584133efe2a695c4c7ff16dc6919a1ba0cda077273f6e1eafbf0f5c3f\": container with ID starting with e826406584133efe2a695c4c7ff16dc6919a1ba0cda077273f6e1eafbf0f5c3f not found: ID does not exist" containerID="e826406584133efe2a695c4c7ff16dc6919a1ba0cda077273f6e1eafbf0f5c3f" Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.275697 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e826406584133efe2a695c4c7ff16dc6919a1ba0cda077273f6e1eafbf0f5c3f"} err="failed to get container status \"e826406584133efe2a695c4c7ff16dc6919a1ba0cda077273f6e1eafbf0f5c3f\": rpc error: code = NotFound desc = could not find container \"e826406584133efe2a695c4c7ff16dc6919a1ba0cda077273f6e1eafbf0f5c3f\": container with ID starting with e826406584133efe2a695c4c7ff16dc6919a1ba0cda077273f6e1eafbf0f5c3f not found: ID does not exist" Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.275710 5025 
scope.go:117] "RemoveContainer" containerID="81fc1872da168050a2483e46a2ce55f2f745eca15c901a77783a830609f6c0fe" Oct 07 08:27:57 crc kubenswrapper[5025]: E1007 08:27:57.276167 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81fc1872da168050a2483e46a2ce55f2f745eca15c901a77783a830609f6c0fe\": container with ID starting with 81fc1872da168050a2483e46a2ce55f2f745eca15c901a77783a830609f6c0fe not found: ID does not exist" containerID="81fc1872da168050a2483e46a2ce55f2f745eca15c901a77783a830609f6c0fe" Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.276192 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81fc1872da168050a2483e46a2ce55f2f745eca15c901a77783a830609f6c0fe"} err="failed to get container status \"81fc1872da168050a2483e46a2ce55f2f745eca15c901a77783a830609f6c0fe\": rpc error: code = NotFound desc = could not find container \"81fc1872da168050a2483e46a2ce55f2f745eca15c901a77783a830609f6c0fe\": container with ID starting with 81fc1872da168050a2483e46a2ce55f2f745eca15c901a77783a830609f6c0fe not found: ID does not exist" Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.276207 5025 scope.go:117] "RemoveContainer" containerID="cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58" Oct 07 08:27:57 crc kubenswrapper[5025]: E1007 08:27:57.276440 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\": container with ID starting with cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58 not found: ID does not exist" containerID="cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58" Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.276464 5025 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58"} err="failed to get container status \"cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\": rpc error: code = NotFound desc = could not find container \"cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\": container with ID starting with cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58 not found: ID does not exist" Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.276479 5025 scope.go:117] "RemoveContainer" containerID="81125d0541fed3ecceacb54cd47d20ae6cf3ad43375daaf49fbf813859f4f2e9" Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.276706 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81125d0541fed3ecceacb54cd47d20ae6cf3ad43375daaf49fbf813859f4f2e9"} err="failed to get container status \"81125d0541fed3ecceacb54cd47d20ae6cf3ad43375daaf49fbf813859f4f2e9\": rpc error: code = NotFound desc = could not find container \"81125d0541fed3ecceacb54cd47d20ae6cf3ad43375daaf49fbf813859f4f2e9\": container with ID starting with 81125d0541fed3ecceacb54cd47d20ae6cf3ad43375daaf49fbf813859f4f2e9 not found: ID does not exist" Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.276732 5025 scope.go:117] "RemoveContainer" containerID="d26e633e0a41a5f44c577b8e1a9494d58bb84b49015a7612684fc81e5872fd20" Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.276943 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d26e633e0a41a5f44c577b8e1a9494d58bb84b49015a7612684fc81e5872fd20"} err="failed to get container status \"d26e633e0a41a5f44c577b8e1a9494d58bb84b49015a7612684fc81e5872fd20\": rpc error: code = NotFound desc = could not find container \"d26e633e0a41a5f44c577b8e1a9494d58bb84b49015a7612684fc81e5872fd20\": container with ID starting with d26e633e0a41a5f44c577b8e1a9494d58bb84b49015a7612684fc81e5872fd20 not found: ID does not 
exist" Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.276978 5025 scope.go:117] "RemoveContainer" containerID="aca416ed3fc89df9948a2f7b02101dfb3bae8d8b22b7c4d270559a4cb6895dc4" Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.277330 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aca416ed3fc89df9948a2f7b02101dfb3bae8d8b22b7c4d270559a4cb6895dc4"} err="failed to get container status \"aca416ed3fc89df9948a2f7b02101dfb3bae8d8b22b7c4d270559a4cb6895dc4\": rpc error: code = NotFound desc = could not find container \"aca416ed3fc89df9948a2f7b02101dfb3bae8d8b22b7c4d270559a4cb6895dc4\": container with ID starting with aca416ed3fc89df9948a2f7b02101dfb3bae8d8b22b7c4d270559a4cb6895dc4 not found: ID does not exist" Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.277360 5025 scope.go:117] "RemoveContainer" containerID="4aca8d824b0935ba844137095ac849a38a15dc5c0dc808fae28afbff543f6429" Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.277568 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4aca8d824b0935ba844137095ac849a38a15dc5c0dc808fae28afbff543f6429"} err="failed to get container status \"4aca8d824b0935ba844137095ac849a38a15dc5c0dc808fae28afbff543f6429\": rpc error: code = NotFound desc = could not find container \"4aca8d824b0935ba844137095ac849a38a15dc5c0dc808fae28afbff543f6429\": container with ID starting with 4aca8d824b0935ba844137095ac849a38a15dc5c0dc808fae28afbff543f6429 not found: ID does not exist" Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.277591 5025 scope.go:117] "RemoveContainer" containerID="d54834b1ba5650aa397affa83e56e871bbbea4b8c383627ae00bea648595864f" Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.277889 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d54834b1ba5650aa397affa83e56e871bbbea4b8c383627ae00bea648595864f"} err="failed to get container status 
\"d54834b1ba5650aa397affa83e56e871bbbea4b8c383627ae00bea648595864f\": rpc error: code = NotFound desc = could not find container \"d54834b1ba5650aa397affa83e56e871bbbea4b8c383627ae00bea648595864f\": container with ID starting with d54834b1ba5650aa397affa83e56e871bbbea4b8c383627ae00bea648595864f not found: ID does not exist" Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.277910 5025 scope.go:117] "RemoveContainer" containerID="602fe6537fb9b526977c2d6ae30458382313c2b5f1506e6b7ffe684ffe5be2cf" Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.278118 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"602fe6537fb9b526977c2d6ae30458382313c2b5f1506e6b7ffe684ffe5be2cf"} err="failed to get container status \"602fe6537fb9b526977c2d6ae30458382313c2b5f1506e6b7ffe684ffe5be2cf\": rpc error: code = NotFound desc = could not find container \"602fe6537fb9b526977c2d6ae30458382313c2b5f1506e6b7ffe684ffe5be2cf\": container with ID starting with 602fe6537fb9b526977c2d6ae30458382313c2b5f1506e6b7ffe684ffe5be2cf not found: ID does not exist" Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.278139 5025 scope.go:117] "RemoveContainer" containerID="e826406584133efe2a695c4c7ff16dc6919a1ba0cda077273f6e1eafbf0f5c3f" Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.278358 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e826406584133efe2a695c4c7ff16dc6919a1ba0cda077273f6e1eafbf0f5c3f"} err="failed to get container status \"e826406584133efe2a695c4c7ff16dc6919a1ba0cda077273f6e1eafbf0f5c3f\": rpc error: code = NotFound desc = could not find container \"e826406584133efe2a695c4c7ff16dc6919a1ba0cda077273f6e1eafbf0f5c3f\": container with ID starting with e826406584133efe2a695c4c7ff16dc6919a1ba0cda077273f6e1eafbf0f5c3f not found: ID does not exist" Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.278383 5025 scope.go:117] "RemoveContainer" 
containerID="81fc1872da168050a2483e46a2ce55f2f745eca15c901a77783a830609f6c0fe" Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.278658 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81fc1872da168050a2483e46a2ce55f2f745eca15c901a77783a830609f6c0fe"} err="failed to get container status \"81fc1872da168050a2483e46a2ce55f2f745eca15c901a77783a830609f6c0fe\": rpc error: code = NotFound desc = could not find container \"81fc1872da168050a2483e46a2ce55f2f745eca15c901a77783a830609f6c0fe\": container with ID starting with 81fc1872da168050a2483e46a2ce55f2f745eca15c901a77783a830609f6c0fe not found: ID does not exist" Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.278680 5025 scope.go:117] "RemoveContainer" containerID="cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58" Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.279185 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58"} err="failed to get container status \"cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\": rpc error: code = NotFound desc = could not find container \"cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58\": container with ID starting with cbf54412bb08ccb9b9e06d46a6b0d91195acc19b579da09dbbca737efc4e3c58 not found: ID does not exist" Oct 07 08:27:57 crc kubenswrapper[5025]: I1007 08:27:57.921212 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b6b9c75-ecfe-4815-b279-bb56f57a82a8" path="/var/lib/kubelet/pods/8b6b9c75-ecfe-4815-b279-bb56f57a82a8/volumes" Oct 07 08:27:58 crc kubenswrapper[5025]: I1007 08:27:58.043781 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" 
event={"ID":"14b4b7dd-3970-480d-9aae-8c1e58cc2316","Type":"ContainerStarted","Data":"83ff32c3295b6a9b2a62d69dd1804a8617da7cd333d9619c9295c0d875692e99"} Oct 07 08:27:58 crc kubenswrapper[5025]: I1007 08:27:58.043824 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" event={"ID":"14b4b7dd-3970-480d-9aae-8c1e58cc2316","Type":"ContainerStarted","Data":"f61edc00b3d14815e160b4b740df7ece164d562bfb97922b5afe6d54ee2b467d"} Oct 07 08:27:58 crc kubenswrapper[5025]: I1007 08:27:58.043833 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" event={"ID":"14b4b7dd-3970-480d-9aae-8c1e58cc2316","Type":"ContainerStarted","Data":"1170765f5a32792a7236f312af6a48a4539377115b116604a1370e9972d2358c"} Oct 07 08:27:58 crc kubenswrapper[5025]: I1007 08:27:58.043844 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" event={"ID":"14b4b7dd-3970-480d-9aae-8c1e58cc2316","Type":"ContainerStarted","Data":"2c1bc43aa11577b5dee007d05be3d6b47289c288c4cf1ab5c3b3f0a2a4fd1ddb"} Oct 07 08:27:58 crc kubenswrapper[5025]: I1007 08:27:58.043855 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" event={"ID":"14b4b7dd-3970-480d-9aae-8c1e58cc2316","Type":"ContainerStarted","Data":"f10fb5924a3d1a2ee731efa2050b1de728c740e589752d6ab72e93e52eb9118a"} Oct 07 08:27:58 crc kubenswrapper[5025]: I1007 08:27:58.043866 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" event={"ID":"14b4b7dd-3970-480d-9aae-8c1e58cc2316","Type":"ContainerStarted","Data":"7aaf8ffbc913f3ca52fb514718825b1713024c9c9cff63252455469c1b5ccfda"} Oct 07 08:28:00 crc kubenswrapper[5025]: I1007 08:28:00.059036 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" 
event={"ID":"14b4b7dd-3970-480d-9aae-8c1e58cc2316","Type":"ContainerStarted","Data":"186470af6e5e9eebaaf0d2034242d0c9353d99ee25d7dde162c159bf7ac8dc1e"} Oct 07 08:28:03 crc kubenswrapper[5025]: I1007 08:28:03.066238 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x"] Oct 07 08:28:03 crc kubenswrapper[5025]: I1007 08:28:03.067026 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x" Oct 07 08:28:03 crc kubenswrapper[5025]: I1007 08:28:03.067438 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x" Oct 07 08:28:03 crc kubenswrapper[5025]: I1007 08:28:03.089002 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" event={"ID":"14b4b7dd-3970-480d-9aae-8c1e58cc2316","Type":"ContainerStarted","Data":"675846d535f26a818ab661916e0126140e527c7f8e2ba93c82d781fa23250bfe"} Oct 07 08:28:03 crc kubenswrapper[5025]: I1007 08:28:03.090173 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:28:03 crc kubenswrapper[5025]: E1007 08:28:03.102318 5025 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x_openshift-marketplace_e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454_0(5b1270b0a0c93a9448df9174fe5834996ce66cd7fb97667f03bbcc580d2f30ca): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Oct 07 08:28:03 crc kubenswrapper[5025]: E1007 08:28:03.102397 5025 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x_openshift-marketplace_e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454_0(5b1270b0a0c93a9448df9174fe5834996ce66cd7fb97667f03bbcc580d2f30ca): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x" Oct 07 08:28:03 crc kubenswrapper[5025]: E1007 08:28:03.102423 5025 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x_openshift-marketplace_e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454_0(5b1270b0a0c93a9448df9174fe5834996ce66cd7fb97667f03bbcc580d2f30ca): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x" Oct 07 08:28:03 crc kubenswrapper[5025]: E1007 08:28:03.102496 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x_openshift-marketplace(e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x_openshift-marketplace(e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x_openshift-marketplace_e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454_0(5b1270b0a0c93a9448df9174fe5834996ce66cd7fb97667f03bbcc580d2f30ca): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x" podUID="e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454" Oct 07 08:28:03 crc kubenswrapper[5025]: I1007 08:28:03.127607 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:28:03 crc kubenswrapper[5025]: I1007 08:28:03.164187 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" podStartSLOduration=7.16416976 podStartE2EDuration="7.16416976s" podCreationTimestamp="2025-10-07 08:27:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:28:03.122590451 +0000 UTC m=+689.931904605" watchObservedRunningTime="2025-10-07 08:28:03.16416976 +0000 UTC m=+689.973483904" Oct 07 08:28:04 crc kubenswrapper[5025]: I1007 08:28:04.098378 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:28:04 crc kubenswrapper[5025]: I1007 08:28:04.098479 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:28:04 crc kubenswrapper[5025]: I1007 08:28:04.143530 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:28:06 crc kubenswrapper[5025]: I1007 08:28:06.914302 5025 scope.go:117] "RemoveContainer" containerID="74509613ab9ea3b6e4feae1711cea3c8b8082ee266d8f3e819add82c25e9eec3" Oct 07 08:28:06 crc kubenswrapper[5025]: E1007 08:28:06.915088 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-xmhw6_openshift-multus(34b07a69-1bbf-4019-b824-7b5be0f9404d)\"" 
pod="openshift-multus/multus-xmhw6" podUID="34b07a69-1bbf-4019-b824-7b5be0f9404d" Oct 07 08:28:13 crc kubenswrapper[5025]: I1007 08:28:13.916022 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x" Oct 07 08:28:13 crc kubenswrapper[5025]: I1007 08:28:13.916535 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x" Oct 07 08:28:13 crc kubenswrapper[5025]: E1007 08:28:13.945841 5025 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x_openshift-marketplace_e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454_0(011a3364540ddb70d91c72432266eb76df09d66933f3ca6fd09dd1f60e5ab821): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 07 08:28:13 crc kubenswrapper[5025]: E1007 08:28:13.946301 5025 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x_openshift-marketplace_e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454_0(011a3364540ddb70d91c72432266eb76df09d66933f3ca6fd09dd1f60e5ab821): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x" Oct 07 08:28:13 crc kubenswrapper[5025]: E1007 08:28:13.946396 5025 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x_openshift-marketplace_e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454_0(011a3364540ddb70d91c72432266eb76df09d66933f3ca6fd09dd1f60e5ab821): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x" Oct 07 08:28:13 crc kubenswrapper[5025]: E1007 08:28:13.946516 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x_openshift-marketplace(e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x_openshift-marketplace(e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x_openshift-marketplace_e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454_0(011a3364540ddb70d91c72432266eb76df09d66933f3ca6fd09dd1f60e5ab821): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x" podUID="e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454" Oct 07 08:28:18 crc kubenswrapper[5025]: I1007 08:28:18.914883 5025 scope.go:117] "RemoveContainer" containerID="74509613ab9ea3b6e4feae1711cea3c8b8082ee266d8f3e819add82c25e9eec3" Oct 07 08:28:19 crc kubenswrapper[5025]: I1007 08:28:19.213400 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xmhw6_34b07a69-1bbf-4019-b824-7b5be0f9404d/kube-multus/2.log" Oct 07 08:28:19 crc kubenswrapper[5025]: I1007 08:28:19.213853 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xmhw6" event={"ID":"34b07a69-1bbf-4019-b824-7b5be0f9404d","Type":"ContainerStarted","Data":"faa65042ceb0a8b83ee9826aa0b3907371dfbac6196828861d56daaf634cc08e"} Oct 07 08:28:26 crc kubenswrapper[5025]: I1007 08:28:26.461856 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-llbmv" Oct 07 08:28:26 crc kubenswrapper[5025]: I1007 08:28:26.914384 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x" Oct 07 08:28:26 crc kubenswrapper[5025]: I1007 08:28:26.915215 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x" Oct 07 08:28:27 crc kubenswrapper[5025]: I1007 08:28:27.185895 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x"] Oct 07 08:28:27 crc kubenswrapper[5025]: I1007 08:28:27.271997 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x" event={"ID":"e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454","Type":"ContainerStarted","Data":"f1dfe71ce1bd1187a847e8c5fd8428a8c575f6ff3cbbc4c8d2c1c9e905182bf1"} Oct 07 08:28:28 crc kubenswrapper[5025]: I1007 08:28:28.279975 5025 generic.go:334] "Generic (PLEG): container finished" podID="e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454" containerID="069f6af2aa9439ab6105f27a5bb45e9ded716a9c9d18aa9d5add391a8cb9ede6" exitCode=0 Oct 07 08:28:28 crc kubenswrapper[5025]: I1007 08:28:28.280097 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x" event={"ID":"e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454","Type":"ContainerDied","Data":"069f6af2aa9439ab6105f27a5bb45e9ded716a9c9d18aa9d5add391a8cb9ede6"} Oct 07 08:28:30 crc kubenswrapper[5025]: I1007 08:28:30.294524 5025 generic.go:334] "Generic (PLEG): container finished" podID="e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454" containerID="4007278e20b60de06f2e0e249f43f275e8e0bd53907645f4214e6b1510dec549" exitCode=0 Oct 07 08:28:30 crc kubenswrapper[5025]: I1007 08:28:30.294641 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x" event={"ID":"e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454","Type":"ContainerDied","Data":"4007278e20b60de06f2e0e249f43f275e8e0bd53907645f4214e6b1510dec549"} Oct 07 08:28:31 crc kubenswrapper[5025]: I1007 08:28:31.303733 5025 
generic.go:334] "Generic (PLEG): container finished" podID="e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454" containerID="63d855cd2b406a67082a2777019b1035801394c028891300fe02f7deafdba152" exitCode=0 Oct 07 08:28:31 crc kubenswrapper[5025]: I1007 08:28:31.303781 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x" event={"ID":"e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454","Type":"ContainerDied","Data":"63d855cd2b406a67082a2777019b1035801394c028891300fe02f7deafdba152"} Oct 07 08:28:32 crc kubenswrapper[5025]: I1007 08:28:32.602778 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x" Oct 07 08:28:32 crc kubenswrapper[5025]: I1007 08:28:32.679238 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454-util\") pod \"e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454\" (UID: \"e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454\") " Oct 07 08:28:32 crc kubenswrapper[5025]: I1007 08:28:32.679494 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454-bundle\") pod \"e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454\" (UID: \"e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454\") " Oct 07 08:28:32 crc kubenswrapper[5025]: I1007 08:28:32.679636 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kp5gj\" (UniqueName: \"kubernetes.io/projected/e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454-kube-api-access-kp5gj\") pod \"e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454\" (UID: \"e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454\") " Oct 07 08:28:32 crc kubenswrapper[5025]: I1007 08:28:32.680077 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454-bundle" (OuterVolumeSpecName: "bundle") pod "e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454" (UID: "e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:28:32 crc kubenswrapper[5025]: I1007 08:28:32.691925 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454-kube-api-access-kp5gj" (OuterVolumeSpecName: "kube-api-access-kp5gj") pod "e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454" (UID: "e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454"). InnerVolumeSpecName "kube-api-access-kp5gj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:28:32 crc kubenswrapper[5025]: I1007 08:28:32.780912 5025 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:28:32 crc kubenswrapper[5025]: I1007 08:28:32.781165 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kp5gj\" (UniqueName: \"kubernetes.io/projected/e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454-kube-api-access-kp5gj\") on node \"crc\" DevicePath \"\"" Oct 07 08:28:32 crc kubenswrapper[5025]: I1007 08:28:32.955692 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454-util" (OuterVolumeSpecName: "util") pod "e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454" (UID: "e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:28:32 crc kubenswrapper[5025]: I1007 08:28:32.983725 5025 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454-util\") on node \"crc\" DevicePath \"\"" Oct 07 08:28:33 crc kubenswrapper[5025]: I1007 08:28:33.322039 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x" event={"ID":"e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454","Type":"ContainerDied","Data":"f1dfe71ce1bd1187a847e8c5fd8428a8c575f6ff3cbbc4c8d2c1c9e905182bf1"} Oct 07 08:28:33 crc kubenswrapper[5025]: I1007 08:28:33.322339 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1dfe71ce1bd1187a847e8c5fd8428a8c575f6ff3cbbc4c8d2c1c9e905182bf1" Oct 07 08:28:33 crc kubenswrapper[5025]: I1007 08:28:33.322120 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x" Oct 07 08:28:38 crc kubenswrapper[5025]: I1007 08:28:38.654277 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-rtt98"] Oct 07 08:28:38 crc kubenswrapper[5025]: E1007 08:28:38.654898 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454" containerName="util" Oct 07 08:28:38 crc kubenswrapper[5025]: I1007 08:28:38.654914 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454" containerName="util" Oct 07 08:28:38 crc kubenswrapper[5025]: E1007 08:28:38.654934 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454" containerName="pull" Oct 07 08:28:38 crc kubenswrapper[5025]: I1007 08:28:38.654942 5025 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454" containerName="pull" Oct 07 08:28:38 crc kubenswrapper[5025]: E1007 08:28:38.654956 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454" containerName="extract" Oct 07 08:28:38 crc kubenswrapper[5025]: I1007 08:28:38.654963 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454" containerName="extract" Oct 07 08:28:38 crc kubenswrapper[5025]: I1007 08:28:38.655078 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454" containerName="extract" Oct 07 08:28:38 crc kubenswrapper[5025]: I1007 08:28:38.655557 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-rtt98" Oct 07 08:28:38 crc kubenswrapper[5025]: I1007 08:28:38.657075 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-dv9kr" Oct 07 08:28:38 crc kubenswrapper[5025]: I1007 08:28:38.657311 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Oct 07 08:28:38 crc kubenswrapper[5025]: I1007 08:28:38.657434 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Oct 07 08:28:38 crc kubenswrapper[5025]: I1007 08:28:38.667171 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-rtt98"] Oct 07 08:28:38 crc kubenswrapper[5025]: I1007 08:28:38.760832 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rcm7\" (UniqueName: \"kubernetes.io/projected/4ae6ad94-b065-4bd8-a19c-50adb890e53a-kube-api-access-8rcm7\") pod \"nmstate-operator-858ddd8f98-rtt98\" (UID: \"4ae6ad94-b065-4bd8-a19c-50adb890e53a\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-rtt98" Oct 
07 08:28:38 crc kubenswrapper[5025]: I1007 08:28:38.861905 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rcm7\" (UniqueName: \"kubernetes.io/projected/4ae6ad94-b065-4bd8-a19c-50adb890e53a-kube-api-access-8rcm7\") pod \"nmstate-operator-858ddd8f98-rtt98\" (UID: \"4ae6ad94-b065-4bd8-a19c-50adb890e53a\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-rtt98" Oct 07 08:28:38 crc kubenswrapper[5025]: I1007 08:28:38.878646 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rcm7\" (UniqueName: \"kubernetes.io/projected/4ae6ad94-b065-4bd8-a19c-50adb890e53a-kube-api-access-8rcm7\") pod \"nmstate-operator-858ddd8f98-rtt98\" (UID: \"4ae6ad94-b065-4bd8-a19c-50adb890e53a\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-rtt98" Oct 07 08:28:38 crc kubenswrapper[5025]: I1007 08:28:38.973114 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-rtt98" Oct 07 08:28:39 crc kubenswrapper[5025]: I1007 08:28:39.369842 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-rtt98"] Oct 07 08:28:39 crc kubenswrapper[5025]: W1007 08:28:39.376196 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ae6ad94_b065_4bd8_a19c_50adb890e53a.slice/crio-8339670e996cd65564f6011b9bdd7431b9faec9189a31f97289ff5c9f28ffead WatchSource:0}: Error finding container 8339670e996cd65564f6011b9bdd7431b9faec9189a31f97289ff5c9f28ffead: Status 404 returned error can't find the container with id 8339670e996cd65564f6011b9bdd7431b9faec9189a31f97289ff5c9f28ffead Oct 07 08:28:40 crc kubenswrapper[5025]: I1007 08:28:40.370936 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-rtt98" 
event={"ID":"4ae6ad94-b065-4bd8-a19c-50adb890e53a","Type":"ContainerStarted","Data":"8339670e996cd65564f6011b9bdd7431b9faec9189a31f97289ff5c9f28ffead"} Oct 07 08:28:42 crc kubenswrapper[5025]: I1007 08:28:42.383142 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-rtt98" event={"ID":"4ae6ad94-b065-4bd8-a19c-50adb890e53a","Type":"ContainerStarted","Data":"bbdaa77fb8bb09eb4b7811a74f10fdef5c6b254f00a99060f994c155a2a783ff"} Oct 07 08:28:42 crc kubenswrapper[5025]: I1007 08:28:42.405293 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-858ddd8f98-rtt98" podStartSLOduration=2.358095719 podStartE2EDuration="4.405270184s" podCreationTimestamp="2025-10-07 08:28:38 +0000 UTC" firstStartedPulling="2025-10-07 08:28:39.378643792 +0000 UTC m=+726.187957936" lastFinishedPulling="2025-10-07 08:28:41.425818257 +0000 UTC m=+728.235132401" observedRunningTime="2025-10-07 08:28:42.403604522 +0000 UTC m=+729.212918696" watchObservedRunningTime="2025-10-07 08:28:42.405270184 +0000 UTC m=+729.214584338" Oct 07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.573670 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-lw65t"] Oct 07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.575278 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-lw65t" Oct 07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.577342 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-xdtbb" Oct 07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.586450 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-lw65t"] Oct 07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.603817 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-6jtt9"] Oct 07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.605007 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-6jtt9" Oct 07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.607002 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Oct 07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.624442 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-6jtt9"] Oct 07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.633777 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-z4897"] Oct 07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.650754 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-z4897" Oct 07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.675460 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/9a5212ab-0cf6-45f9-9369-3cc46265ac46-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-6jtt9\" (UID: \"9a5212ab-0cf6-45f9-9369-3cc46265ac46\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-6jtt9" Oct 07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.675615 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjwk8\" (UniqueName: \"kubernetes.io/projected/9a5212ab-0cf6-45f9-9369-3cc46265ac46-kube-api-access-fjwk8\") pod \"nmstate-webhook-6cdbc54649-6jtt9\" (UID: \"9a5212ab-0cf6-45f9-9369-3cc46265ac46\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-6jtt9" Oct 07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.675757 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtqpx\" (UniqueName: \"kubernetes.io/projected/aa3b53d3-e235-4259-9bc4-faa4257df0b7-kube-api-access-rtqpx\") pod \"nmstate-metrics-fdff9cb8d-lw65t\" (UID: \"aa3b53d3-e235-4259-9bc4-faa4257df0b7\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-lw65t" Oct 07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.721208 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-m2scr"] Oct 07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.722172 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-m2scr" Oct 07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.724127 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Oct 07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.724352 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-5nzb6" Oct 07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.724535 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Oct 07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.731840 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-m2scr"] Oct 07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.776832 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vj9rm\" (UniqueName: \"kubernetes.io/projected/a0279a96-994a-4bc6-b4e1-c53e4efb08d6-kube-api-access-vj9rm\") pod \"nmstate-handler-z4897\" (UID: \"a0279a96-994a-4bc6-b4e1-c53e4efb08d6\") " pod="openshift-nmstate/nmstate-handler-z4897" Oct 07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.776873 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/a0279a96-994a-4bc6-b4e1-c53e4efb08d6-dbus-socket\") pod \"nmstate-handler-z4897\" (UID: \"a0279a96-994a-4bc6-b4e1-c53e4efb08d6\") " pod="openshift-nmstate/nmstate-handler-z4897" Oct 07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.776892 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/a0279a96-994a-4bc6-b4e1-c53e4efb08d6-nmstate-lock\") pod \"nmstate-handler-z4897\" (UID: \"a0279a96-994a-4bc6-b4e1-c53e4efb08d6\") " pod="openshift-nmstate/nmstate-handler-z4897" Oct 
07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.776922 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/9a5212ab-0cf6-45f9-9369-3cc46265ac46-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-6jtt9\" (UID: \"9a5212ab-0cf6-45f9-9369-3cc46265ac46\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-6jtt9" Oct 07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.776953 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjwk8\" (UniqueName: \"kubernetes.io/projected/9a5212ab-0cf6-45f9-9369-3cc46265ac46-kube-api-access-fjwk8\") pod \"nmstate-webhook-6cdbc54649-6jtt9\" (UID: \"9a5212ab-0cf6-45f9-9369-3cc46265ac46\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-6jtt9" Oct 07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.776990 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtqpx\" (UniqueName: \"kubernetes.io/projected/aa3b53d3-e235-4259-9bc4-faa4257df0b7-kube-api-access-rtqpx\") pod \"nmstate-metrics-fdff9cb8d-lw65t\" (UID: \"aa3b53d3-e235-4259-9bc4-faa4257df0b7\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-lw65t" Oct 07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.777012 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtwr6\" (UniqueName: \"kubernetes.io/projected/a6af412f-9cd9-468b-b656-25fa3a52f24b-kube-api-access-jtwr6\") pod \"nmstate-console-plugin-6b874cbd85-m2scr\" (UID: \"a6af412f-9cd9-468b-b656-25fa3a52f24b\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-m2scr" Oct 07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.777029 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/a0279a96-994a-4bc6-b4e1-c53e4efb08d6-ovs-socket\") pod \"nmstate-handler-z4897\" (UID: 
\"a0279a96-994a-4bc6-b4e1-c53e4efb08d6\") " pod="openshift-nmstate/nmstate-handler-z4897" Oct 07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.777046 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a6af412f-9cd9-468b-b656-25fa3a52f24b-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-m2scr\" (UID: \"a6af412f-9cd9-468b-b656-25fa3a52f24b\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-m2scr" Oct 07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.777062 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a6af412f-9cd9-468b-b656-25fa3a52f24b-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-m2scr\" (UID: \"a6af412f-9cd9-468b-b656-25fa3a52f24b\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-m2scr" Oct 07 08:28:47 crc kubenswrapper[5025]: E1007 08:28:47.777194 5025 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Oct 07 08:28:47 crc kubenswrapper[5025]: E1007 08:28:47.777243 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a5212ab-0cf6-45f9-9369-3cc46265ac46-tls-key-pair podName:9a5212ab-0cf6-45f9-9369-3cc46265ac46 nodeName:}" failed. No retries permitted until 2025-10-07 08:28:48.277224244 +0000 UTC m=+735.086538388 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/9a5212ab-0cf6-45f9-9369-3cc46265ac46-tls-key-pair") pod "nmstate-webhook-6cdbc54649-6jtt9" (UID: "9a5212ab-0cf6-45f9-9369-3cc46265ac46") : secret "openshift-nmstate-webhook" not found Oct 07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.794924 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjwk8\" (UniqueName: \"kubernetes.io/projected/9a5212ab-0cf6-45f9-9369-3cc46265ac46-kube-api-access-fjwk8\") pod \"nmstate-webhook-6cdbc54649-6jtt9\" (UID: \"9a5212ab-0cf6-45f9-9369-3cc46265ac46\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-6jtt9" Oct 07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.798061 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtqpx\" (UniqueName: \"kubernetes.io/projected/aa3b53d3-e235-4259-9bc4-faa4257df0b7-kube-api-access-rtqpx\") pod \"nmstate-metrics-fdff9cb8d-lw65t\" (UID: \"aa3b53d3-e235-4259-9bc4-faa4257df0b7\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-lw65t" Oct 07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.878132 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vj9rm\" (UniqueName: \"kubernetes.io/projected/a0279a96-994a-4bc6-b4e1-c53e4efb08d6-kube-api-access-vj9rm\") pod \"nmstate-handler-z4897\" (UID: \"a0279a96-994a-4bc6-b4e1-c53e4efb08d6\") " pod="openshift-nmstate/nmstate-handler-z4897" Oct 07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.878187 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/a0279a96-994a-4bc6-b4e1-c53e4efb08d6-dbus-socket\") pod \"nmstate-handler-z4897\" (UID: \"a0279a96-994a-4bc6-b4e1-c53e4efb08d6\") " pod="openshift-nmstate/nmstate-handler-z4897" Oct 07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.878207 5025 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/a0279a96-994a-4bc6-b4e1-c53e4efb08d6-nmstate-lock\") pod \"nmstate-handler-z4897\" (UID: \"a0279a96-994a-4bc6-b4e1-c53e4efb08d6\") " pod="openshift-nmstate/nmstate-handler-z4897" Oct 07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.878281 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtwr6\" (UniqueName: \"kubernetes.io/projected/a6af412f-9cd9-468b-b656-25fa3a52f24b-kube-api-access-jtwr6\") pod \"nmstate-console-plugin-6b874cbd85-m2scr\" (UID: \"a6af412f-9cd9-468b-b656-25fa3a52f24b\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-m2scr" Oct 07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.878298 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/a0279a96-994a-4bc6-b4e1-c53e4efb08d6-ovs-socket\") pod \"nmstate-handler-z4897\" (UID: \"a0279a96-994a-4bc6-b4e1-c53e4efb08d6\") " pod="openshift-nmstate/nmstate-handler-z4897" Oct 07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.878315 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a6af412f-9cd9-468b-b656-25fa3a52f24b-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-m2scr\" (UID: \"a6af412f-9cd9-468b-b656-25fa3a52f24b\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-m2scr" Oct 07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.878331 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a6af412f-9cd9-468b-b656-25fa3a52f24b-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-m2scr\" (UID: \"a6af412f-9cd9-468b-b656-25fa3a52f24b\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-m2scr" Oct 07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.878639 5025 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/a0279a96-994a-4bc6-b4e1-c53e4efb08d6-dbus-socket\") pod \"nmstate-handler-z4897\" (UID: \"a0279a96-994a-4bc6-b4e1-c53e4efb08d6\") " pod="openshift-nmstate/nmstate-handler-z4897" Oct 07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.878670 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/a0279a96-994a-4bc6-b4e1-c53e4efb08d6-nmstate-lock\") pod \"nmstate-handler-z4897\" (UID: \"a0279a96-994a-4bc6-b4e1-c53e4efb08d6\") " pod="openshift-nmstate/nmstate-handler-z4897" Oct 07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.878728 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/a0279a96-994a-4bc6-b4e1-c53e4efb08d6-ovs-socket\") pod \"nmstate-handler-z4897\" (UID: \"a0279a96-994a-4bc6-b4e1-c53e4efb08d6\") " pod="openshift-nmstate/nmstate-handler-z4897" Oct 07 08:28:47 crc kubenswrapper[5025]: E1007 08:28:47.878733 5025 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Oct 07 08:28:47 crc kubenswrapper[5025]: E1007 08:28:47.878774 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6af412f-9cd9-468b-b656-25fa3a52f24b-plugin-serving-cert podName:a6af412f-9cd9-468b-b656-25fa3a52f24b nodeName:}" failed. No retries permitted until 2025-10-07 08:28:48.378759874 +0000 UTC m=+735.188074018 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/a6af412f-9cd9-468b-b656-25fa3a52f24b-plugin-serving-cert") pod "nmstate-console-plugin-6b874cbd85-m2scr" (UID: "a6af412f-9cd9-468b-b656-25fa3a52f24b") : secret "plugin-serving-cert" not found Oct 07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.879229 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a6af412f-9cd9-468b-b656-25fa3a52f24b-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-m2scr\" (UID: \"a6af412f-9cd9-468b-b656-25fa3a52f24b\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-m2scr" Oct 07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.897246 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtwr6\" (UniqueName: \"kubernetes.io/projected/a6af412f-9cd9-468b-b656-25fa3a52f24b-kube-api-access-jtwr6\") pod \"nmstate-console-plugin-6b874cbd85-m2scr\" (UID: \"a6af412f-9cd9-468b-b656-25fa3a52f24b\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-m2scr" Oct 07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.900105 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vj9rm\" (UniqueName: \"kubernetes.io/projected/a0279a96-994a-4bc6-b4e1-c53e4efb08d6-kube-api-access-vj9rm\") pod \"nmstate-handler-z4897\" (UID: \"a0279a96-994a-4bc6-b4e1-c53e4efb08d6\") " pod="openshift-nmstate/nmstate-handler-z4897" Oct 07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.907331 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-59758fc5-6b4gs"] Oct 07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.907951 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-59758fc5-6b4gs" Oct 07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.908474 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-lw65t" Oct 07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.959503 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-59758fc5-6b4gs"] Oct 07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.978961 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/64837a88-0a5e-41b3-95bf-78a62cc0f599-console-config\") pod \"console-59758fc5-6b4gs\" (UID: \"64837a88-0a5e-41b3-95bf-78a62cc0f599\") " pod="openshift-console/console-59758fc5-6b4gs" Oct 07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.979138 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/64837a88-0a5e-41b3-95bf-78a62cc0f599-oauth-serving-cert\") pod \"console-59758fc5-6b4gs\" (UID: \"64837a88-0a5e-41b3-95bf-78a62cc0f599\") " pod="openshift-console/console-59758fc5-6b4gs" Oct 07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.979163 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/64837a88-0a5e-41b3-95bf-78a62cc0f599-console-oauth-config\") pod \"console-59758fc5-6b4gs\" (UID: \"64837a88-0a5e-41b3-95bf-78a62cc0f599\") " pod="openshift-console/console-59758fc5-6b4gs" Oct 07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.979196 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64837a88-0a5e-41b3-95bf-78a62cc0f599-trusted-ca-bundle\") pod \"console-59758fc5-6b4gs\" (UID: \"64837a88-0a5e-41b3-95bf-78a62cc0f599\") " pod="openshift-console/console-59758fc5-6b4gs" Oct 07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.979227 5025 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8nsr\" (UniqueName: \"kubernetes.io/projected/64837a88-0a5e-41b3-95bf-78a62cc0f599-kube-api-access-r8nsr\") pod \"console-59758fc5-6b4gs\" (UID: \"64837a88-0a5e-41b3-95bf-78a62cc0f599\") " pod="openshift-console/console-59758fc5-6b4gs" Oct 07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.979266 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/64837a88-0a5e-41b3-95bf-78a62cc0f599-console-serving-cert\") pod \"console-59758fc5-6b4gs\" (UID: \"64837a88-0a5e-41b3-95bf-78a62cc0f599\") " pod="openshift-console/console-59758fc5-6b4gs" Oct 07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.979279 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/64837a88-0a5e-41b3-95bf-78a62cc0f599-service-ca\") pod \"console-59758fc5-6b4gs\" (UID: \"64837a88-0a5e-41b3-95bf-78a62cc0f599\") " pod="openshift-console/console-59758fc5-6b4gs" Oct 07 08:28:47 crc kubenswrapper[5025]: I1007 08:28:47.979308 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-z4897" Oct 07 08:28:48 crc kubenswrapper[5025]: I1007 08:28:48.081891 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/64837a88-0a5e-41b3-95bf-78a62cc0f599-oauth-serving-cert\") pod \"console-59758fc5-6b4gs\" (UID: \"64837a88-0a5e-41b3-95bf-78a62cc0f599\") " pod="openshift-console/console-59758fc5-6b4gs" Oct 07 08:28:48 crc kubenswrapper[5025]: I1007 08:28:48.082224 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/64837a88-0a5e-41b3-95bf-78a62cc0f599-console-oauth-config\") pod \"console-59758fc5-6b4gs\" (UID: \"64837a88-0a5e-41b3-95bf-78a62cc0f599\") " pod="openshift-console/console-59758fc5-6b4gs" Oct 07 08:28:48 crc kubenswrapper[5025]: I1007 08:28:48.082247 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64837a88-0a5e-41b3-95bf-78a62cc0f599-trusted-ca-bundle\") pod \"console-59758fc5-6b4gs\" (UID: \"64837a88-0a5e-41b3-95bf-78a62cc0f599\") " pod="openshift-console/console-59758fc5-6b4gs" Oct 07 08:28:48 crc kubenswrapper[5025]: I1007 08:28:48.082279 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8nsr\" (UniqueName: \"kubernetes.io/projected/64837a88-0a5e-41b3-95bf-78a62cc0f599-kube-api-access-r8nsr\") pod \"console-59758fc5-6b4gs\" (UID: \"64837a88-0a5e-41b3-95bf-78a62cc0f599\") " pod="openshift-console/console-59758fc5-6b4gs" Oct 07 08:28:48 crc kubenswrapper[5025]: I1007 08:28:48.082310 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/64837a88-0a5e-41b3-95bf-78a62cc0f599-console-serving-cert\") pod \"console-59758fc5-6b4gs\" (UID: \"64837a88-0a5e-41b3-95bf-78a62cc0f599\") " 
pod="openshift-console/console-59758fc5-6b4gs" Oct 07 08:28:48 crc kubenswrapper[5025]: I1007 08:28:48.082327 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/64837a88-0a5e-41b3-95bf-78a62cc0f599-service-ca\") pod \"console-59758fc5-6b4gs\" (UID: \"64837a88-0a5e-41b3-95bf-78a62cc0f599\") " pod="openshift-console/console-59758fc5-6b4gs" Oct 07 08:28:48 crc kubenswrapper[5025]: I1007 08:28:48.082346 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/64837a88-0a5e-41b3-95bf-78a62cc0f599-console-config\") pod \"console-59758fc5-6b4gs\" (UID: \"64837a88-0a5e-41b3-95bf-78a62cc0f599\") " pod="openshift-console/console-59758fc5-6b4gs" Oct 07 08:28:48 crc kubenswrapper[5025]: I1007 08:28:48.083021 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/64837a88-0a5e-41b3-95bf-78a62cc0f599-console-config\") pod \"console-59758fc5-6b4gs\" (UID: \"64837a88-0a5e-41b3-95bf-78a62cc0f599\") " pod="openshift-console/console-59758fc5-6b4gs" Oct 07 08:28:48 crc kubenswrapper[5025]: I1007 08:28:48.083247 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/64837a88-0a5e-41b3-95bf-78a62cc0f599-oauth-serving-cert\") pod \"console-59758fc5-6b4gs\" (UID: \"64837a88-0a5e-41b3-95bf-78a62cc0f599\") " pod="openshift-console/console-59758fc5-6b4gs" Oct 07 08:28:48 crc kubenswrapper[5025]: I1007 08:28:48.083567 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64837a88-0a5e-41b3-95bf-78a62cc0f599-trusted-ca-bundle\") pod \"console-59758fc5-6b4gs\" (UID: \"64837a88-0a5e-41b3-95bf-78a62cc0f599\") " pod="openshift-console/console-59758fc5-6b4gs" Oct 07 08:28:48 crc kubenswrapper[5025]: 
I1007 08:28:48.083621 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/64837a88-0a5e-41b3-95bf-78a62cc0f599-service-ca\") pod \"console-59758fc5-6b4gs\" (UID: \"64837a88-0a5e-41b3-95bf-78a62cc0f599\") " pod="openshift-console/console-59758fc5-6b4gs" Oct 07 08:28:48 crc kubenswrapper[5025]: I1007 08:28:48.089220 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/64837a88-0a5e-41b3-95bf-78a62cc0f599-console-serving-cert\") pod \"console-59758fc5-6b4gs\" (UID: \"64837a88-0a5e-41b3-95bf-78a62cc0f599\") " pod="openshift-console/console-59758fc5-6b4gs" Oct 07 08:28:48 crc kubenswrapper[5025]: I1007 08:28:48.089267 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/64837a88-0a5e-41b3-95bf-78a62cc0f599-console-oauth-config\") pod \"console-59758fc5-6b4gs\" (UID: \"64837a88-0a5e-41b3-95bf-78a62cc0f599\") " pod="openshift-console/console-59758fc5-6b4gs" Oct 07 08:28:48 crc kubenswrapper[5025]: I1007 08:28:48.097460 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8nsr\" (UniqueName: \"kubernetes.io/projected/64837a88-0a5e-41b3-95bf-78a62cc0f599-kube-api-access-r8nsr\") pod \"console-59758fc5-6b4gs\" (UID: \"64837a88-0a5e-41b3-95bf-78a62cc0f599\") " pod="openshift-console/console-59758fc5-6b4gs" Oct 07 08:28:48 crc kubenswrapper[5025]: I1007 08:28:48.136019 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-lw65t"] Oct 07 08:28:48 crc kubenswrapper[5025]: I1007 08:28:48.270160 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-59758fc5-6b4gs" Oct 07 08:28:48 crc kubenswrapper[5025]: I1007 08:28:48.286129 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/9a5212ab-0cf6-45f9-9369-3cc46265ac46-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-6jtt9\" (UID: \"9a5212ab-0cf6-45f9-9369-3cc46265ac46\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-6jtt9" Oct 07 08:28:48 crc kubenswrapper[5025]: I1007 08:28:48.291156 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/9a5212ab-0cf6-45f9-9369-3cc46265ac46-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-6jtt9\" (UID: \"9a5212ab-0cf6-45f9-9369-3cc46265ac46\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-6jtt9" Oct 07 08:28:48 crc kubenswrapper[5025]: I1007 08:28:48.387798 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a6af412f-9cd9-468b-b656-25fa3a52f24b-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-m2scr\" (UID: \"a6af412f-9cd9-468b-b656-25fa3a52f24b\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-m2scr" Oct 07 08:28:48 crc kubenswrapper[5025]: I1007 08:28:48.393270 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a6af412f-9cd9-468b-b656-25fa3a52f24b-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-m2scr\" (UID: \"a6af412f-9cd9-468b-b656-25fa3a52f24b\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-m2scr" Oct 07 08:28:48 crc kubenswrapper[5025]: I1007 08:28:48.418413 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-lw65t" 
event={"ID":"aa3b53d3-e235-4259-9bc4-faa4257df0b7","Type":"ContainerStarted","Data":"19b9222fd712cd3a002a5161524f15da91b1a360c0261d6b53c96828bb6f7d8b"} Oct 07 08:28:48 crc kubenswrapper[5025]: I1007 08:28:48.419675 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-z4897" event={"ID":"a0279a96-994a-4bc6-b4e1-c53e4efb08d6","Type":"ContainerStarted","Data":"9e37eacf353acd4aee73504f5148e672648c2e750ef3aa7d69f6322b6af64f87"} Oct 07 08:28:48 crc kubenswrapper[5025]: I1007 08:28:48.534669 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-6jtt9" Oct 07 08:28:48 crc kubenswrapper[5025]: I1007 08:28:48.639800 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-m2scr" Oct 07 08:28:48 crc kubenswrapper[5025]: I1007 08:28:48.729146 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-59758fc5-6b4gs"] Oct 07 08:28:48 crc kubenswrapper[5025]: W1007 08:28:48.745781 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64837a88_0a5e_41b3_95bf_78a62cc0f599.slice/crio-904a07223c8695acbd622cbc2341ee3a683369f03ffe8a01ea91265bb5e1df71 WatchSource:0}: Error finding container 904a07223c8695acbd622cbc2341ee3a683369f03ffe8a01ea91265bb5e1df71: Status 404 returned error can't find the container with id 904a07223c8695acbd622cbc2341ee3a683369f03ffe8a01ea91265bb5e1df71 Oct 07 08:28:48 crc kubenswrapper[5025]: I1007 08:28:48.888240 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-m2scr"] Oct 07 08:28:48 crc kubenswrapper[5025]: W1007 08:28:48.893795 5025 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6af412f_9cd9_468b_b656_25fa3a52f24b.slice/crio-a3b966609d8306c047a4280949041e5618bfe3ca2533596d78a3908795a779e3 WatchSource:0}: Error finding container a3b966609d8306c047a4280949041e5618bfe3ca2533596d78a3908795a779e3: Status 404 returned error can't find the container with id a3b966609d8306c047a4280949041e5618bfe3ca2533596d78a3908795a779e3 Oct 07 08:28:48 crc kubenswrapper[5025]: I1007 08:28:48.963273 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-6jtt9"] Oct 07 08:28:48 crc kubenswrapper[5025]: W1007 08:28:48.968632 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a5212ab_0cf6_45f9_9369_3cc46265ac46.slice/crio-4e09209a30d44f57bf2b6450717d781d6bd1a34fc9ce39d68032ed2fd3036ec6 WatchSource:0}: Error finding container 4e09209a30d44f57bf2b6450717d781d6bd1a34fc9ce39d68032ed2fd3036ec6: Status 404 returned error can't find the container with id 4e09209a30d44f57bf2b6450717d781d6bd1a34fc9ce39d68032ed2fd3036ec6 Oct 07 08:28:49 crc kubenswrapper[5025]: I1007 08:28:49.426771 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-6jtt9" event={"ID":"9a5212ab-0cf6-45f9-9369-3cc46265ac46","Type":"ContainerStarted","Data":"4e09209a30d44f57bf2b6450717d781d6bd1a34fc9ce39d68032ed2fd3036ec6"} Oct 07 08:28:49 crc kubenswrapper[5025]: I1007 08:28:49.428045 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-m2scr" event={"ID":"a6af412f-9cd9-468b-b656-25fa3a52f24b","Type":"ContainerStarted","Data":"a3b966609d8306c047a4280949041e5618bfe3ca2533596d78a3908795a779e3"} Oct 07 08:28:49 crc kubenswrapper[5025]: I1007 08:28:49.429833 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59758fc5-6b4gs" 
event={"ID":"64837a88-0a5e-41b3-95bf-78a62cc0f599","Type":"ContainerStarted","Data":"e94aeaea2e95ed4f9d51d4dcfd47f295114e2bb74794751d478df14a42af0c4d"} Oct 07 08:28:49 crc kubenswrapper[5025]: I1007 08:28:49.429862 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59758fc5-6b4gs" event={"ID":"64837a88-0a5e-41b3-95bf-78a62cc0f599","Type":"ContainerStarted","Data":"904a07223c8695acbd622cbc2341ee3a683369f03ffe8a01ea91265bb5e1df71"} Oct 07 08:28:49 crc kubenswrapper[5025]: I1007 08:28:49.458040 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-59758fc5-6b4gs" podStartSLOduration=2.458019949 podStartE2EDuration="2.458019949s" podCreationTimestamp="2025-10-07 08:28:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:28:49.453447745 +0000 UTC m=+736.262761889" watchObservedRunningTime="2025-10-07 08:28:49.458019949 +0000 UTC m=+736.267334093" Oct 07 08:28:51 crc kubenswrapper[5025]: I1007 08:28:51.442364 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-6jtt9" event={"ID":"9a5212ab-0cf6-45f9-9369-3cc46265ac46","Type":"ContainerStarted","Data":"ed43ddb92af836917a3eb7418a00721cbf315a69d6d20a6dc00e03f5f7715b21"} Oct 07 08:28:51 crc kubenswrapper[5025]: I1007 08:28:51.443724 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-6jtt9" Oct 07 08:28:51 crc kubenswrapper[5025]: I1007 08:28:51.444087 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-lw65t" event={"ID":"aa3b53d3-e235-4259-9bc4-faa4257df0b7","Type":"ContainerStarted","Data":"a7c8cae6c272fef4dda8fb583a7c5b421d39f28419c1f88d09386dec90086095"} Oct 07 08:28:51 crc kubenswrapper[5025]: I1007 08:28:51.446058 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-handler-z4897" event={"ID":"a0279a96-994a-4bc6-b4e1-c53e4efb08d6","Type":"ContainerStarted","Data":"0d5ed6ce3c26d67b7dababc131ac8f472fe7e4940f47340df6479d52defbee96"} Oct 07 08:28:51 crc kubenswrapper[5025]: I1007 08:28:51.446586 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-z4897" Oct 07 08:28:51 crc kubenswrapper[5025]: I1007 08:28:51.460180 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-6jtt9" podStartSLOduration=2.957089883 podStartE2EDuration="4.46015558s" podCreationTimestamp="2025-10-07 08:28:47 +0000 UTC" firstStartedPulling="2025-10-07 08:28:48.973367111 +0000 UTC m=+735.782681255" lastFinishedPulling="2025-10-07 08:28:50.476432808 +0000 UTC m=+737.285746952" observedRunningTime="2025-10-07 08:28:51.456720411 +0000 UTC m=+738.266034555" watchObservedRunningTime="2025-10-07 08:28:51.46015558 +0000 UTC m=+738.269469764" Oct 07 08:28:51 crc kubenswrapper[5025]: I1007 08:28:51.474914 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-m2scr" podStartSLOduration=2.086217288 podStartE2EDuration="4.474894076s" podCreationTimestamp="2025-10-07 08:28:47 +0000 UTC" firstStartedPulling="2025-10-07 08:28:48.897007968 +0000 UTC m=+735.706322112" lastFinishedPulling="2025-10-07 08:28:51.285684736 +0000 UTC m=+738.094998900" observedRunningTime="2025-10-07 08:28:51.468803443 +0000 UTC m=+738.278117617" watchObservedRunningTime="2025-10-07 08:28:51.474894076 +0000 UTC m=+738.284208230" Oct 07 08:28:51 crc kubenswrapper[5025]: I1007 08:28:51.484503 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-z4897" podStartSLOduration=2.032283533 podStartE2EDuration="4.484484819s" podCreationTimestamp="2025-10-07 08:28:47 +0000 UTC" firstStartedPulling="2025-10-07 
08:28:48.022732604 +0000 UTC m=+734.832046748" lastFinishedPulling="2025-10-07 08:28:50.47493389 +0000 UTC m=+737.284248034" observedRunningTime="2025-10-07 08:28:51.484205271 +0000 UTC m=+738.293519415" watchObservedRunningTime="2025-10-07 08:28:51.484484819 +0000 UTC m=+738.293798973" Oct 07 08:28:52 crc kubenswrapper[5025]: I1007 08:28:52.455514 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-m2scr" event={"ID":"a6af412f-9cd9-468b-b656-25fa3a52f24b","Type":"ContainerStarted","Data":"066e82e809ceb0693d63e505df3c8216d4b195785d1927e40ee78b9fa795eab4"} Oct 07 08:28:53 crc kubenswrapper[5025]: I1007 08:28:53.465740 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-lw65t" event={"ID":"aa3b53d3-e235-4259-9bc4-faa4257df0b7","Type":"ContainerStarted","Data":"e21d226ad10920aa0e5818c5368b313c59aa984fcad4c40352c1d6edd9711060"} Oct 07 08:28:53 crc kubenswrapper[5025]: I1007 08:28:53.486057 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-lw65t" podStartSLOduration=1.71896397 podStartE2EDuration="6.486042642s" podCreationTimestamp="2025-10-07 08:28:47 +0000 UTC" firstStartedPulling="2025-10-07 08:28:48.132135872 +0000 UTC m=+734.941450016" lastFinishedPulling="2025-10-07 08:28:52.899214504 +0000 UTC m=+739.708528688" observedRunningTime="2025-10-07 08:28:53.483269274 +0000 UTC m=+740.292583448" watchObservedRunningTime="2025-10-07 08:28:53.486042642 +0000 UTC m=+740.295356786" Oct 07 08:28:58 crc kubenswrapper[5025]: I1007 08:28:58.012206 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-z4897" Oct 07 08:28:58 crc kubenswrapper[5025]: I1007 08:28:58.271173 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-59758fc5-6b4gs" Oct 07 08:28:58 crc kubenswrapper[5025]: I1007 
08:28:58.271453 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-59758fc5-6b4gs" Oct 07 08:28:58 crc kubenswrapper[5025]: I1007 08:28:58.276746 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-59758fc5-6b4gs" Oct 07 08:28:58 crc kubenswrapper[5025]: I1007 08:28:58.501073 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-59758fc5-6b4gs" Oct 07 08:28:58 crc kubenswrapper[5025]: I1007 08:28:58.539141 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-hmlpq"] Oct 07 08:28:59 crc kubenswrapper[5025]: I1007 08:28:59.075899 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-2xfws"] Oct 07 08:28:59 crc kubenswrapper[5025]: I1007 08:28:59.076089 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-2xfws" podUID="0f09299f-24a4-4d5f-8ca9-704c678b8d23" containerName="controller-manager" containerID="cri-o://f69f953959d07c1296d2c237c1bac0cf2de40888f4066294a16a54ec03b7992d" gracePeriod=30 Oct 07 08:28:59 crc kubenswrapper[5025]: I1007 08:28:59.189156 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vk2xm"] Oct 07 08:28:59 crc kubenswrapper[5025]: I1007 08:28:59.189354 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vk2xm" podUID="6dca6d86-e6aa-455e-88e8-f61e829b7efd" containerName="route-controller-manager" containerID="cri-o://e936f73aaf0e98b82a887ed5aca95a08e95ac4d1f0ac49da7719f5275323b888" gracePeriod=30 Oct 07 08:28:59 crc kubenswrapper[5025]: I1007 08:28:59.422920 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-2xfws" Oct 07 08:28:59 crc kubenswrapper[5025]: I1007 08:28:59.440132 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f09299f-24a4-4d5f-8ca9-704c678b8d23-serving-cert\") pod \"0f09299f-24a4-4d5f-8ca9-704c678b8d23\" (UID: \"0f09299f-24a4-4d5f-8ca9-704c678b8d23\") " Oct 07 08:28:59 crc kubenswrapper[5025]: I1007 08:28:59.440226 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fbrj\" (UniqueName: \"kubernetes.io/projected/0f09299f-24a4-4d5f-8ca9-704c678b8d23-kube-api-access-9fbrj\") pod \"0f09299f-24a4-4d5f-8ca9-704c678b8d23\" (UID: \"0f09299f-24a4-4d5f-8ca9-704c678b8d23\") " Oct 07 08:28:59 crc kubenswrapper[5025]: I1007 08:28:59.440261 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0f09299f-24a4-4d5f-8ca9-704c678b8d23-proxy-ca-bundles\") pod \"0f09299f-24a4-4d5f-8ca9-704c678b8d23\" (UID: \"0f09299f-24a4-4d5f-8ca9-704c678b8d23\") " Oct 07 08:28:59 crc kubenswrapper[5025]: I1007 08:28:59.440293 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0f09299f-24a4-4d5f-8ca9-704c678b8d23-client-ca\") pod \"0f09299f-24a4-4d5f-8ca9-704c678b8d23\" (UID: \"0f09299f-24a4-4d5f-8ca9-704c678b8d23\") " Oct 07 08:28:59 crc kubenswrapper[5025]: I1007 08:28:59.440391 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f09299f-24a4-4d5f-8ca9-704c678b8d23-config\") pod \"0f09299f-24a4-4d5f-8ca9-704c678b8d23\" (UID: \"0f09299f-24a4-4d5f-8ca9-704c678b8d23\") " Oct 07 08:28:59 crc kubenswrapper[5025]: I1007 08:28:59.443435 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/0f09299f-24a4-4d5f-8ca9-704c678b8d23-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "0f09299f-24a4-4d5f-8ca9-704c678b8d23" (UID: "0f09299f-24a4-4d5f-8ca9-704c678b8d23"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:28:59 crc kubenswrapper[5025]: I1007 08:28:59.444834 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f09299f-24a4-4d5f-8ca9-704c678b8d23-client-ca" (OuterVolumeSpecName: "client-ca") pod "0f09299f-24a4-4d5f-8ca9-704c678b8d23" (UID: "0f09299f-24a4-4d5f-8ca9-704c678b8d23"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:28:59 crc kubenswrapper[5025]: I1007 08:28:59.444942 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f09299f-24a4-4d5f-8ca9-704c678b8d23-config" (OuterVolumeSpecName: "config") pod "0f09299f-24a4-4d5f-8ca9-704c678b8d23" (UID: "0f09299f-24a4-4d5f-8ca9-704c678b8d23"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:28:59 crc kubenswrapper[5025]: I1007 08:28:59.451036 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f09299f-24a4-4d5f-8ca9-704c678b8d23-kube-api-access-9fbrj" (OuterVolumeSpecName: "kube-api-access-9fbrj") pod "0f09299f-24a4-4d5f-8ca9-704c678b8d23" (UID: "0f09299f-24a4-4d5f-8ca9-704c678b8d23"). InnerVolumeSpecName "kube-api-access-9fbrj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:28:59 crc kubenswrapper[5025]: I1007 08:28:59.452998 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f09299f-24a4-4d5f-8ca9-704c678b8d23-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0f09299f-24a4-4d5f-8ca9-704c678b8d23" (UID: "0f09299f-24a4-4d5f-8ca9-704c678b8d23"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:28:59 crc kubenswrapper[5025]: I1007 08:28:59.501427 5025 generic.go:334] "Generic (PLEG): container finished" podID="6dca6d86-e6aa-455e-88e8-f61e829b7efd" containerID="e936f73aaf0e98b82a887ed5aca95a08e95ac4d1f0ac49da7719f5275323b888" exitCode=0 Oct 07 08:28:59 crc kubenswrapper[5025]: I1007 08:28:59.501504 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vk2xm" event={"ID":"6dca6d86-e6aa-455e-88e8-f61e829b7efd","Type":"ContainerDied","Data":"e936f73aaf0e98b82a887ed5aca95a08e95ac4d1f0ac49da7719f5275323b888"} Oct 07 08:28:59 crc kubenswrapper[5025]: I1007 08:28:59.503336 5025 generic.go:334] "Generic (PLEG): container finished" podID="0f09299f-24a4-4d5f-8ca9-704c678b8d23" containerID="f69f953959d07c1296d2c237c1bac0cf2de40888f4066294a16a54ec03b7992d" exitCode=0 Oct 07 08:28:59 crc kubenswrapper[5025]: I1007 08:28:59.503643 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-2xfws" Oct 07 08:28:59 crc kubenswrapper[5025]: I1007 08:28:59.503966 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-2xfws" event={"ID":"0f09299f-24a4-4d5f-8ca9-704c678b8d23","Type":"ContainerDied","Data":"f69f953959d07c1296d2c237c1bac0cf2de40888f4066294a16a54ec03b7992d"} Oct 07 08:28:59 crc kubenswrapper[5025]: I1007 08:28:59.503993 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-2xfws" event={"ID":"0f09299f-24a4-4d5f-8ca9-704c678b8d23","Type":"ContainerDied","Data":"2d1aa061bc3e8407b74c12f6d52d62e36361948c9bf5e475cd844d562888e12d"} Oct 07 08:28:59 crc kubenswrapper[5025]: I1007 08:28:59.504010 5025 scope.go:117] "RemoveContainer" containerID="f69f953959d07c1296d2c237c1bac0cf2de40888f4066294a16a54ec03b7992d" Oct 07 08:28:59 crc kubenswrapper[5025]: I1007 08:28:59.524591 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vk2xm" Oct 07 08:28:59 crc kubenswrapper[5025]: I1007 08:28:59.535987 5025 scope.go:117] "RemoveContainer" containerID="f69f953959d07c1296d2c237c1bac0cf2de40888f4066294a16a54ec03b7992d" Oct 07 08:28:59 crc kubenswrapper[5025]: E1007 08:28:59.536896 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f69f953959d07c1296d2c237c1bac0cf2de40888f4066294a16a54ec03b7992d\": container with ID starting with f69f953959d07c1296d2c237c1bac0cf2de40888f4066294a16a54ec03b7992d not found: ID does not exist" containerID="f69f953959d07c1296d2c237c1bac0cf2de40888f4066294a16a54ec03b7992d" Oct 07 08:28:59 crc kubenswrapper[5025]: I1007 08:28:59.536955 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f69f953959d07c1296d2c237c1bac0cf2de40888f4066294a16a54ec03b7992d"} err="failed to get container status \"f69f953959d07c1296d2c237c1bac0cf2de40888f4066294a16a54ec03b7992d\": rpc error: code = NotFound desc = could not find container \"f69f953959d07c1296d2c237c1bac0cf2de40888f4066294a16a54ec03b7992d\": container with ID starting with f69f953959d07c1296d2c237c1bac0cf2de40888f4066294a16a54ec03b7992d not found: ID does not exist" Oct 07 08:28:59 crc kubenswrapper[5025]: I1007 08:28:59.538160 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-2xfws"] Oct 07 08:28:59 crc kubenswrapper[5025]: I1007 08:28:59.543224 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-2xfws"] Oct 07 08:28:59 crc kubenswrapper[5025]: I1007 08:28:59.543376 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xd7q\" (UniqueName: \"kubernetes.io/projected/6dca6d86-e6aa-455e-88e8-f61e829b7efd-kube-api-access-8xd7q\") pod 
\"6dca6d86-e6aa-455e-88e8-f61e829b7efd\" (UID: \"6dca6d86-e6aa-455e-88e8-f61e829b7efd\") " Oct 07 08:28:59 crc kubenswrapper[5025]: I1007 08:28:59.543452 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dca6d86-e6aa-455e-88e8-f61e829b7efd-config\") pod \"6dca6d86-e6aa-455e-88e8-f61e829b7efd\" (UID: \"6dca6d86-e6aa-455e-88e8-f61e829b7efd\") " Oct 07 08:28:59 crc kubenswrapper[5025]: I1007 08:28:59.544401 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dca6d86-e6aa-455e-88e8-f61e829b7efd-client-ca" (OuterVolumeSpecName: "client-ca") pod "6dca6d86-e6aa-455e-88e8-f61e829b7efd" (UID: "6dca6d86-e6aa-455e-88e8-f61e829b7efd"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:28:59 crc kubenswrapper[5025]: I1007 08:28:59.543491 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6dca6d86-e6aa-455e-88e8-f61e829b7efd-client-ca\") pod \"6dca6d86-e6aa-455e-88e8-f61e829b7efd\" (UID: \"6dca6d86-e6aa-455e-88e8-f61e829b7efd\") " Oct 07 08:28:59 crc kubenswrapper[5025]: I1007 08:28:59.544801 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6dca6d86-e6aa-455e-88e8-f61e829b7efd-serving-cert\") pod \"6dca6d86-e6aa-455e-88e8-f61e829b7efd\" (UID: \"6dca6d86-e6aa-455e-88e8-f61e829b7efd\") " Oct 07 08:28:59 crc kubenswrapper[5025]: I1007 08:28:59.545065 5025 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6dca6d86-e6aa-455e-88e8-f61e829b7efd-client-ca\") on node \"crc\" DevicePath \"\"" Oct 07 08:28:59 crc kubenswrapper[5025]: I1007 08:28:59.545082 5025 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/0f09299f-24a4-4d5f-8ca9-704c678b8d23-config\") on node \"crc\" DevicePath \"\"" Oct 07 08:28:59 crc kubenswrapper[5025]: I1007 08:28:59.545090 5025 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f09299f-24a4-4d5f-8ca9-704c678b8d23-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 08:28:59 crc kubenswrapper[5025]: I1007 08:28:59.545098 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fbrj\" (UniqueName: \"kubernetes.io/projected/0f09299f-24a4-4d5f-8ca9-704c678b8d23-kube-api-access-9fbrj\") on node \"crc\" DevicePath \"\"" Oct 07 08:28:59 crc kubenswrapper[5025]: I1007 08:28:59.545107 5025 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0f09299f-24a4-4d5f-8ca9-704c678b8d23-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 07 08:28:59 crc kubenswrapper[5025]: I1007 08:28:59.545114 5025 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0f09299f-24a4-4d5f-8ca9-704c678b8d23-client-ca\") on node \"crc\" DevicePath \"\"" Oct 07 08:28:59 crc kubenswrapper[5025]: I1007 08:28:59.545681 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dca6d86-e6aa-455e-88e8-f61e829b7efd-config" (OuterVolumeSpecName: "config") pod "6dca6d86-e6aa-455e-88e8-f61e829b7efd" (UID: "6dca6d86-e6aa-455e-88e8-f61e829b7efd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:28:59 crc kubenswrapper[5025]: I1007 08:28:59.546840 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dca6d86-e6aa-455e-88e8-f61e829b7efd-kube-api-access-8xd7q" (OuterVolumeSpecName: "kube-api-access-8xd7q") pod "6dca6d86-e6aa-455e-88e8-f61e829b7efd" (UID: "6dca6d86-e6aa-455e-88e8-f61e829b7efd"). 
InnerVolumeSpecName "kube-api-access-8xd7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:28:59 crc kubenswrapper[5025]: I1007 08:28:59.552006 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dca6d86-e6aa-455e-88e8-f61e829b7efd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6dca6d86-e6aa-455e-88e8-f61e829b7efd" (UID: "6dca6d86-e6aa-455e-88e8-f61e829b7efd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:28:59 crc kubenswrapper[5025]: I1007 08:28:59.646112 5025 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6dca6d86-e6aa-455e-88e8-f61e829b7efd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 08:28:59 crc kubenswrapper[5025]: I1007 08:28:59.646144 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xd7q\" (UniqueName: \"kubernetes.io/projected/6dca6d86-e6aa-455e-88e8-f61e829b7efd-kube-api-access-8xd7q\") on node \"crc\" DevicePath \"\"" Oct 07 08:28:59 crc kubenswrapper[5025]: I1007 08:28:59.646170 5025 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dca6d86-e6aa-455e-88e8-f61e829b7efd-config\") on node \"crc\" DevicePath \"\"" Oct 07 08:28:59 crc kubenswrapper[5025]: I1007 08:28:59.922522 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f09299f-24a4-4d5f-8ca9-704c678b8d23" path="/var/lib/kubelet/pods/0f09299f-24a4-4d5f-8ca9-704c678b8d23/volumes" Oct 07 08:29:00 crc kubenswrapper[5025]: I1007 08:29:00.512925 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vk2xm" event={"ID":"6dca6d86-e6aa-455e-88e8-f61e829b7efd","Type":"ContainerDied","Data":"184acddeddb7db04ff8b78422aabc067026fb8a3c545381b06bdc6803c302a0e"} Oct 07 08:29:00 crc kubenswrapper[5025]: I1007 08:29:00.513497 5025 
scope.go:117] "RemoveContainer" containerID="e936f73aaf0e98b82a887ed5aca95a08e95ac4d1f0ac49da7719f5275323b888" Oct 07 08:29:00 crc kubenswrapper[5025]: I1007 08:29:00.512979 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vk2xm" Oct 07 08:29:00 crc kubenswrapper[5025]: I1007 08:29:00.539076 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vk2xm"] Oct 07 08:29:00 crc kubenswrapper[5025]: I1007 08:29:00.541751 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vk2xm"] Oct 07 08:29:00 crc kubenswrapper[5025]: I1007 08:29:00.872093 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-689bc556d9-q9njh"] Oct 07 08:29:00 crc kubenswrapper[5025]: E1007 08:29:00.872383 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f09299f-24a4-4d5f-8ca9-704c678b8d23" containerName="controller-manager" Oct 07 08:29:00 crc kubenswrapper[5025]: I1007 08:29:00.872406 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f09299f-24a4-4d5f-8ca9-704c678b8d23" containerName="controller-manager" Oct 07 08:29:00 crc kubenswrapper[5025]: E1007 08:29:00.872438 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dca6d86-e6aa-455e-88e8-f61e829b7efd" containerName="route-controller-manager" Oct 07 08:29:00 crc kubenswrapper[5025]: I1007 08:29:00.872448 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dca6d86-e6aa-455e-88e8-f61e829b7efd" containerName="route-controller-manager" Oct 07 08:29:00 crc kubenswrapper[5025]: I1007 08:29:00.872642 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f09299f-24a4-4d5f-8ca9-704c678b8d23" containerName="controller-manager" Oct 07 08:29:00 crc 
kubenswrapper[5025]: I1007 08:29:00.872658 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dca6d86-e6aa-455e-88e8-f61e829b7efd" containerName="route-controller-manager" Oct 07 08:29:00 crc kubenswrapper[5025]: I1007 08:29:00.873160 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-689bc556d9-q9njh" Oct 07 08:29:00 crc kubenswrapper[5025]: I1007 08:29:00.874908 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 07 08:29:00 crc kubenswrapper[5025]: I1007 08:29:00.875006 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 07 08:29:00 crc kubenswrapper[5025]: I1007 08:29:00.875095 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 07 08:29:00 crc kubenswrapper[5025]: I1007 08:29:00.875144 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 07 08:29:00 crc kubenswrapper[5025]: I1007 08:29:00.875173 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 07 08:29:00 crc kubenswrapper[5025]: I1007 08:29:00.875660 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 07 08:29:00 crc kubenswrapper[5025]: I1007 08:29:00.879214 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5dc5b7d6f-8c9lc"] Oct 07 08:29:00 crc kubenswrapper[5025]: I1007 08:29:00.881462 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5dc5b7d6f-8c9lc" Oct 07 08:29:00 crc kubenswrapper[5025]: I1007 08:29:00.883668 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 07 08:29:00 crc kubenswrapper[5025]: I1007 08:29:00.884015 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 07 08:29:00 crc kubenswrapper[5025]: I1007 08:29:00.884692 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 07 08:29:00 crc kubenswrapper[5025]: I1007 08:29:00.884966 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 07 08:29:00 crc kubenswrapper[5025]: I1007 08:29:00.885233 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 07 08:29:00 crc kubenswrapper[5025]: I1007 08:29:00.885537 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 07 08:29:00 crc kubenswrapper[5025]: I1007 08:29:00.886745 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-689bc556d9-q9njh"] Oct 07 08:29:00 crc kubenswrapper[5025]: I1007 08:29:00.893963 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 07 08:29:00 crc kubenswrapper[5025]: I1007 08:29:00.896236 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5dc5b7d6f-8c9lc"] Oct 07 08:29:00 crc kubenswrapper[5025]: I1007 08:29:00.964017 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jbwj\" (UniqueName: 
\"kubernetes.io/projected/44bc5f0e-b4e8-456a-b2a6-7e282e972c79-kube-api-access-6jbwj\") pod \"controller-manager-5dc5b7d6f-8c9lc\" (UID: \"44bc5f0e-b4e8-456a-b2a6-7e282e972c79\") " pod="openshift-controller-manager/controller-manager-5dc5b7d6f-8c9lc" Oct 07 08:29:00 crc kubenswrapper[5025]: I1007 08:29:00.964111 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftmhw\" (UniqueName: \"kubernetes.io/projected/9d8063ae-c9eb-4636-84b6-c49970f56910-kube-api-access-ftmhw\") pod \"route-controller-manager-689bc556d9-q9njh\" (UID: \"9d8063ae-c9eb-4636-84b6-c49970f56910\") " pod="openshift-route-controller-manager/route-controller-manager-689bc556d9-q9njh" Oct 07 08:29:00 crc kubenswrapper[5025]: I1007 08:29:00.964302 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44bc5f0e-b4e8-456a-b2a6-7e282e972c79-serving-cert\") pod \"controller-manager-5dc5b7d6f-8c9lc\" (UID: \"44bc5f0e-b4e8-456a-b2a6-7e282e972c79\") " pod="openshift-controller-manager/controller-manager-5dc5b7d6f-8c9lc" Oct 07 08:29:00 crc kubenswrapper[5025]: I1007 08:29:00.964404 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d8063ae-c9eb-4636-84b6-c49970f56910-serving-cert\") pod \"route-controller-manager-689bc556d9-q9njh\" (UID: \"9d8063ae-c9eb-4636-84b6-c49970f56910\") " pod="openshift-route-controller-manager/route-controller-manager-689bc556d9-q9njh" Oct 07 08:29:00 crc kubenswrapper[5025]: I1007 08:29:00.964520 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/44bc5f0e-b4e8-456a-b2a6-7e282e972c79-proxy-ca-bundles\") pod \"controller-manager-5dc5b7d6f-8c9lc\" (UID: \"44bc5f0e-b4e8-456a-b2a6-7e282e972c79\") " 
pod="openshift-controller-manager/controller-manager-5dc5b7d6f-8c9lc" Oct 07 08:29:00 crc kubenswrapper[5025]: I1007 08:29:00.964598 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9d8063ae-c9eb-4636-84b6-c49970f56910-client-ca\") pod \"route-controller-manager-689bc556d9-q9njh\" (UID: \"9d8063ae-c9eb-4636-84b6-c49970f56910\") " pod="openshift-route-controller-manager/route-controller-manager-689bc556d9-q9njh" Oct 07 08:29:00 crc kubenswrapper[5025]: I1007 08:29:00.964762 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d8063ae-c9eb-4636-84b6-c49970f56910-config\") pod \"route-controller-manager-689bc556d9-q9njh\" (UID: \"9d8063ae-c9eb-4636-84b6-c49970f56910\") " pod="openshift-route-controller-manager/route-controller-manager-689bc556d9-q9njh" Oct 07 08:29:00 crc kubenswrapper[5025]: I1007 08:29:00.964825 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44bc5f0e-b4e8-456a-b2a6-7e282e972c79-config\") pod \"controller-manager-5dc5b7d6f-8c9lc\" (UID: \"44bc5f0e-b4e8-456a-b2a6-7e282e972c79\") " pod="openshift-controller-manager/controller-manager-5dc5b7d6f-8c9lc" Oct 07 08:29:00 crc kubenswrapper[5025]: I1007 08:29:00.964935 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/44bc5f0e-b4e8-456a-b2a6-7e282e972c79-client-ca\") pod \"controller-manager-5dc5b7d6f-8c9lc\" (UID: \"44bc5f0e-b4e8-456a-b2a6-7e282e972c79\") " pod="openshift-controller-manager/controller-manager-5dc5b7d6f-8c9lc" Oct 07 08:29:01 crc kubenswrapper[5025]: I1007 08:29:01.066128 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftmhw\" (UniqueName: 
\"kubernetes.io/projected/9d8063ae-c9eb-4636-84b6-c49970f56910-kube-api-access-ftmhw\") pod \"route-controller-manager-689bc556d9-q9njh\" (UID: \"9d8063ae-c9eb-4636-84b6-c49970f56910\") " pod="openshift-route-controller-manager/route-controller-manager-689bc556d9-q9njh" Oct 07 08:29:01 crc kubenswrapper[5025]: I1007 08:29:01.066197 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44bc5f0e-b4e8-456a-b2a6-7e282e972c79-serving-cert\") pod \"controller-manager-5dc5b7d6f-8c9lc\" (UID: \"44bc5f0e-b4e8-456a-b2a6-7e282e972c79\") " pod="openshift-controller-manager/controller-manager-5dc5b7d6f-8c9lc" Oct 07 08:29:01 crc kubenswrapper[5025]: I1007 08:29:01.066233 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d8063ae-c9eb-4636-84b6-c49970f56910-serving-cert\") pod \"route-controller-manager-689bc556d9-q9njh\" (UID: \"9d8063ae-c9eb-4636-84b6-c49970f56910\") " pod="openshift-route-controller-manager/route-controller-manager-689bc556d9-q9njh" Oct 07 08:29:01 crc kubenswrapper[5025]: I1007 08:29:01.066277 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/44bc5f0e-b4e8-456a-b2a6-7e282e972c79-proxy-ca-bundles\") pod \"controller-manager-5dc5b7d6f-8c9lc\" (UID: \"44bc5f0e-b4e8-456a-b2a6-7e282e972c79\") " pod="openshift-controller-manager/controller-manager-5dc5b7d6f-8c9lc" Oct 07 08:29:01 crc kubenswrapper[5025]: I1007 08:29:01.066304 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9d8063ae-c9eb-4636-84b6-c49970f56910-client-ca\") pod \"route-controller-manager-689bc556d9-q9njh\" (UID: \"9d8063ae-c9eb-4636-84b6-c49970f56910\") " pod="openshift-route-controller-manager/route-controller-manager-689bc556d9-q9njh" Oct 07 08:29:01 crc 
kubenswrapper[5025]: I1007 08:29:01.066350 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d8063ae-c9eb-4636-84b6-c49970f56910-config\") pod \"route-controller-manager-689bc556d9-q9njh\" (UID: \"9d8063ae-c9eb-4636-84b6-c49970f56910\") " pod="openshift-route-controller-manager/route-controller-manager-689bc556d9-q9njh" Oct 07 08:29:01 crc kubenswrapper[5025]: I1007 08:29:01.066374 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44bc5f0e-b4e8-456a-b2a6-7e282e972c79-config\") pod \"controller-manager-5dc5b7d6f-8c9lc\" (UID: \"44bc5f0e-b4e8-456a-b2a6-7e282e972c79\") " pod="openshift-controller-manager/controller-manager-5dc5b7d6f-8c9lc" Oct 07 08:29:01 crc kubenswrapper[5025]: I1007 08:29:01.066402 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/44bc5f0e-b4e8-456a-b2a6-7e282e972c79-client-ca\") pod \"controller-manager-5dc5b7d6f-8c9lc\" (UID: \"44bc5f0e-b4e8-456a-b2a6-7e282e972c79\") " pod="openshift-controller-manager/controller-manager-5dc5b7d6f-8c9lc" Oct 07 08:29:01 crc kubenswrapper[5025]: I1007 08:29:01.066443 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jbwj\" (UniqueName: \"kubernetes.io/projected/44bc5f0e-b4e8-456a-b2a6-7e282e972c79-kube-api-access-6jbwj\") pod \"controller-manager-5dc5b7d6f-8c9lc\" (UID: \"44bc5f0e-b4e8-456a-b2a6-7e282e972c79\") " pod="openshift-controller-manager/controller-manager-5dc5b7d6f-8c9lc" Oct 07 08:29:01 crc kubenswrapper[5025]: I1007 08:29:01.068719 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d8063ae-c9eb-4636-84b6-c49970f56910-config\") pod \"route-controller-manager-689bc556d9-q9njh\" (UID: \"9d8063ae-c9eb-4636-84b6-c49970f56910\") " 
pod="openshift-route-controller-manager/route-controller-manager-689bc556d9-q9njh" Oct 07 08:29:01 crc kubenswrapper[5025]: I1007 08:29:01.068840 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/44bc5f0e-b4e8-456a-b2a6-7e282e972c79-proxy-ca-bundles\") pod \"controller-manager-5dc5b7d6f-8c9lc\" (UID: \"44bc5f0e-b4e8-456a-b2a6-7e282e972c79\") " pod="openshift-controller-manager/controller-manager-5dc5b7d6f-8c9lc" Oct 07 08:29:01 crc kubenswrapper[5025]: I1007 08:29:01.069220 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9d8063ae-c9eb-4636-84b6-c49970f56910-client-ca\") pod \"route-controller-manager-689bc556d9-q9njh\" (UID: \"9d8063ae-c9eb-4636-84b6-c49970f56910\") " pod="openshift-route-controller-manager/route-controller-manager-689bc556d9-q9njh" Oct 07 08:29:01 crc kubenswrapper[5025]: I1007 08:29:01.069898 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/44bc5f0e-b4e8-456a-b2a6-7e282e972c79-client-ca\") pod \"controller-manager-5dc5b7d6f-8c9lc\" (UID: \"44bc5f0e-b4e8-456a-b2a6-7e282e972c79\") " pod="openshift-controller-manager/controller-manager-5dc5b7d6f-8c9lc" Oct 07 08:29:01 crc kubenswrapper[5025]: I1007 08:29:01.070116 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44bc5f0e-b4e8-456a-b2a6-7e282e972c79-config\") pod \"controller-manager-5dc5b7d6f-8c9lc\" (UID: \"44bc5f0e-b4e8-456a-b2a6-7e282e972c79\") " pod="openshift-controller-manager/controller-manager-5dc5b7d6f-8c9lc" Oct 07 08:29:01 crc kubenswrapper[5025]: I1007 08:29:01.077323 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d8063ae-c9eb-4636-84b6-c49970f56910-serving-cert\") pod 
\"route-controller-manager-689bc556d9-q9njh\" (UID: \"9d8063ae-c9eb-4636-84b6-c49970f56910\") " pod="openshift-route-controller-manager/route-controller-manager-689bc556d9-q9njh" Oct 07 08:29:01 crc kubenswrapper[5025]: I1007 08:29:01.082633 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44bc5f0e-b4e8-456a-b2a6-7e282e972c79-serving-cert\") pod \"controller-manager-5dc5b7d6f-8c9lc\" (UID: \"44bc5f0e-b4e8-456a-b2a6-7e282e972c79\") " pod="openshift-controller-manager/controller-manager-5dc5b7d6f-8c9lc" Oct 07 08:29:01 crc kubenswrapper[5025]: I1007 08:29:01.102071 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftmhw\" (UniqueName: \"kubernetes.io/projected/9d8063ae-c9eb-4636-84b6-c49970f56910-kube-api-access-ftmhw\") pod \"route-controller-manager-689bc556d9-q9njh\" (UID: \"9d8063ae-c9eb-4636-84b6-c49970f56910\") " pod="openshift-route-controller-manager/route-controller-manager-689bc556d9-q9njh" Oct 07 08:29:01 crc kubenswrapper[5025]: I1007 08:29:01.108289 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jbwj\" (UniqueName: \"kubernetes.io/projected/44bc5f0e-b4e8-456a-b2a6-7e282e972c79-kube-api-access-6jbwj\") pod \"controller-manager-5dc5b7d6f-8c9lc\" (UID: \"44bc5f0e-b4e8-456a-b2a6-7e282e972c79\") " pod="openshift-controller-manager/controller-manager-5dc5b7d6f-8c9lc" Oct 07 08:29:01 crc kubenswrapper[5025]: I1007 08:29:01.193738 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-689bc556d9-q9njh" Oct 07 08:29:01 crc kubenswrapper[5025]: I1007 08:29:01.209760 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5dc5b7d6f-8c9lc" Oct 07 08:29:01 crc kubenswrapper[5025]: I1007 08:29:01.496375 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-689bc556d9-q9njh"] Oct 07 08:29:01 crc kubenswrapper[5025]: W1007 08:29:01.503435 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d8063ae_c9eb_4636_84b6_c49970f56910.slice/crio-26ab3852ae4f2cd46771c57e5930d18a95b187c99b99dd293e703e2cdfd057d8 WatchSource:0}: Error finding container 26ab3852ae4f2cd46771c57e5930d18a95b187c99b99dd293e703e2cdfd057d8: Status 404 returned error can't find the container with id 26ab3852ae4f2cd46771c57e5930d18a95b187c99b99dd293e703e2cdfd057d8 Oct 07 08:29:01 crc kubenswrapper[5025]: I1007 08:29:01.528786 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-689bc556d9-q9njh" event={"ID":"9d8063ae-c9eb-4636-84b6-c49970f56910","Type":"ContainerStarted","Data":"26ab3852ae4f2cd46771c57e5930d18a95b187c99b99dd293e703e2cdfd057d8"} Oct 07 08:29:01 crc kubenswrapper[5025]: I1007 08:29:01.756294 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5dc5b7d6f-8c9lc"] Oct 07 08:29:01 crc kubenswrapper[5025]: W1007 08:29:01.764827 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44bc5f0e_b4e8_456a_b2a6_7e282e972c79.slice/crio-616bd02af8756498951cd582de39841e3dadb0c36f0a9a35851e4a7d2036454b WatchSource:0}: Error finding container 616bd02af8756498951cd582de39841e3dadb0c36f0a9a35851e4a7d2036454b: Status 404 returned error can't find the container with id 616bd02af8756498951cd582de39841e3dadb0c36f0a9a35851e4a7d2036454b Oct 07 08:29:01 crc kubenswrapper[5025]: I1007 08:29:01.920956 5025 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dca6d86-e6aa-455e-88e8-f61e829b7efd" path="/var/lib/kubelet/pods/6dca6d86-e6aa-455e-88e8-f61e829b7efd/volumes" Oct 07 08:29:02 crc kubenswrapper[5025]: I1007 08:29:02.534468 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-689bc556d9-q9njh" event={"ID":"9d8063ae-c9eb-4636-84b6-c49970f56910","Type":"ContainerStarted","Data":"d19a541cc7ba2907fdb73f0b76515ca3e9395cffd520a3e792426fcadbea7bd3"} Oct 07 08:29:02 crc kubenswrapper[5025]: I1007 08:29:02.534978 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-689bc556d9-q9njh" Oct 07 08:29:02 crc kubenswrapper[5025]: I1007 08:29:02.537413 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5dc5b7d6f-8c9lc" event={"ID":"44bc5f0e-b4e8-456a-b2a6-7e282e972c79","Type":"ContainerStarted","Data":"96369778e59c9cf5ab14408684c32fc126af07508700937e4e776258ab36feff"} Oct 07 08:29:02 crc kubenswrapper[5025]: I1007 08:29:02.537460 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5dc5b7d6f-8c9lc" event={"ID":"44bc5f0e-b4e8-456a-b2a6-7e282e972c79","Type":"ContainerStarted","Data":"616bd02af8756498951cd582de39841e3dadb0c36f0a9a35851e4a7d2036454b"} Oct 07 08:29:02 crc kubenswrapper[5025]: I1007 08:29:02.537831 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5dc5b7d6f-8c9lc" Oct 07 08:29:02 crc kubenswrapper[5025]: I1007 08:29:02.540985 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-689bc556d9-q9njh" Oct 07 08:29:02 crc kubenswrapper[5025]: I1007 08:29:02.543935 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-controller-manager/controller-manager-5dc5b7d6f-8c9lc" Oct 07 08:29:02 crc kubenswrapper[5025]: I1007 08:29:02.553322 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-689bc556d9-q9njh" podStartSLOduration=3.553301689 podStartE2EDuration="3.553301689s" podCreationTimestamp="2025-10-07 08:28:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:29:02.548006721 +0000 UTC m=+749.357320865" watchObservedRunningTime="2025-10-07 08:29:02.553301689 +0000 UTC m=+749.362615843" Oct 07 08:29:02 crc kubenswrapper[5025]: I1007 08:29:02.568600 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5dc5b7d6f-8c9lc" podStartSLOduration=3.568572931 podStartE2EDuration="3.568572931s" podCreationTimestamp="2025-10-07 08:28:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:29:02.562554851 +0000 UTC m=+749.371869015" watchObservedRunningTime="2025-10-07 08:29:02.568572931 +0000 UTC m=+749.377887095" Oct 07 08:29:08 crc kubenswrapper[5025]: I1007 08:29:08.544019 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-6jtt9" Oct 07 08:29:08 crc kubenswrapper[5025]: I1007 08:29:08.994969 5025 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 07 08:29:23 crc kubenswrapper[5025]: I1007 08:29:23.486641 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22zqf7"] Oct 07 08:29:23 crc kubenswrapper[5025]: I1007 08:29:23.489333 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22zqf7" Oct 07 08:29:23 crc kubenswrapper[5025]: I1007 08:29:23.492133 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 07 08:29:23 crc kubenswrapper[5025]: I1007 08:29:23.519585 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22zqf7"] Oct 07 08:29:23 crc kubenswrapper[5025]: I1007 08:29:23.590617 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-hmlpq" podUID="1e02ba47-76b7-4363-8ade-a9f8c42db920" containerName="console" containerID="cri-o://28a36da72efb409e3db39a0bb4f1bcdc2a8cf04c11b687b17f4c9c1564618293" gracePeriod=15 Oct 07 08:29:23 crc kubenswrapper[5025]: E1007 08:29:23.667229 5025 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e02ba47_76b7_4363_8ade_a9f8c42db920.slice/crio-28a36da72efb409e3db39a0bb4f1bcdc2a8cf04c11b687b17f4c9c1564618293.scope\": RecentStats: unable to find data in memory cache]" Oct 07 08:29:23 crc kubenswrapper[5025]: I1007 08:29:23.676086 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78wg6\" (UniqueName: \"kubernetes.io/projected/a5712d23-aa3f-49ea-b448-af44d79c9701-kube-api-access-78wg6\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22zqf7\" (UID: \"a5712d23-aa3f-49ea-b448-af44d79c9701\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22zqf7" Oct 07 08:29:23 crc kubenswrapper[5025]: I1007 08:29:23.676127 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/a5712d23-aa3f-49ea-b448-af44d79c9701-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22zqf7\" (UID: \"a5712d23-aa3f-49ea-b448-af44d79c9701\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22zqf7" Oct 07 08:29:23 crc kubenswrapper[5025]: I1007 08:29:23.676162 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a5712d23-aa3f-49ea-b448-af44d79c9701-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22zqf7\" (UID: \"a5712d23-aa3f-49ea-b448-af44d79c9701\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22zqf7" Oct 07 08:29:23 crc kubenswrapper[5025]: I1007 08:29:23.777222 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78wg6\" (UniqueName: \"kubernetes.io/projected/a5712d23-aa3f-49ea-b448-af44d79c9701-kube-api-access-78wg6\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22zqf7\" (UID: \"a5712d23-aa3f-49ea-b448-af44d79c9701\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22zqf7" Oct 07 08:29:23 crc kubenswrapper[5025]: I1007 08:29:23.777283 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a5712d23-aa3f-49ea-b448-af44d79c9701-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22zqf7\" (UID: \"a5712d23-aa3f-49ea-b448-af44d79c9701\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22zqf7" Oct 07 08:29:23 crc kubenswrapper[5025]: I1007 08:29:23.777322 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a5712d23-aa3f-49ea-b448-af44d79c9701-util\") pod 
\"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22zqf7\" (UID: \"a5712d23-aa3f-49ea-b448-af44d79c9701\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22zqf7" Oct 07 08:29:23 crc kubenswrapper[5025]: I1007 08:29:23.777897 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a5712d23-aa3f-49ea-b448-af44d79c9701-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22zqf7\" (UID: \"a5712d23-aa3f-49ea-b448-af44d79c9701\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22zqf7" Oct 07 08:29:23 crc kubenswrapper[5025]: I1007 08:29:23.778002 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a5712d23-aa3f-49ea-b448-af44d79c9701-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22zqf7\" (UID: \"a5712d23-aa3f-49ea-b448-af44d79c9701\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22zqf7" Oct 07 08:29:23 crc kubenswrapper[5025]: I1007 08:29:23.798052 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78wg6\" (UniqueName: \"kubernetes.io/projected/a5712d23-aa3f-49ea-b448-af44d79c9701-kube-api-access-78wg6\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22zqf7\" (UID: \"a5712d23-aa3f-49ea-b448-af44d79c9701\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22zqf7" Oct 07 08:29:23 crc kubenswrapper[5025]: I1007 08:29:23.812465 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22zqf7" Oct 07 08:29:24 crc kubenswrapper[5025]: I1007 08:29:24.031048 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-hmlpq_1e02ba47-76b7-4363-8ade-a9f8c42db920/console/0.log" Oct 07 08:29:24 crc kubenswrapper[5025]: I1007 08:29:24.031141 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-hmlpq" Oct 07 08:29:24 crc kubenswrapper[5025]: I1007 08:29:24.081349 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1e02ba47-76b7-4363-8ade-a9f8c42db920-oauth-serving-cert\") pod \"1e02ba47-76b7-4363-8ade-a9f8c42db920\" (UID: \"1e02ba47-76b7-4363-8ade-a9f8c42db920\") " Oct 07 08:29:24 crc kubenswrapper[5025]: I1007 08:29:24.081396 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1e02ba47-76b7-4363-8ade-a9f8c42db920-console-oauth-config\") pod \"1e02ba47-76b7-4363-8ade-a9f8c42db920\" (UID: \"1e02ba47-76b7-4363-8ade-a9f8c42db920\") " Oct 07 08:29:24 crc kubenswrapper[5025]: I1007 08:29:24.081425 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e02ba47-76b7-4363-8ade-a9f8c42db920-trusted-ca-bundle\") pod \"1e02ba47-76b7-4363-8ade-a9f8c42db920\" (UID: \"1e02ba47-76b7-4363-8ade-a9f8c42db920\") " Oct 07 08:29:24 crc kubenswrapper[5025]: I1007 08:29:24.081452 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1e02ba47-76b7-4363-8ade-a9f8c42db920-console-serving-cert\") pod \"1e02ba47-76b7-4363-8ade-a9f8c42db920\" (UID: \"1e02ba47-76b7-4363-8ade-a9f8c42db920\") " Oct 07 08:29:24 crc 
kubenswrapper[5025]: I1007 08:29:24.081487 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1e02ba47-76b7-4363-8ade-a9f8c42db920-console-config\") pod \"1e02ba47-76b7-4363-8ade-a9f8c42db920\" (UID: \"1e02ba47-76b7-4363-8ade-a9f8c42db920\") " Oct 07 08:29:24 crc kubenswrapper[5025]: I1007 08:29:24.081530 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r89h4\" (UniqueName: \"kubernetes.io/projected/1e02ba47-76b7-4363-8ade-a9f8c42db920-kube-api-access-r89h4\") pod \"1e02ba47-76b7-4363-8ade-a9f8c42db920\" (UID: \"1e02ba47-76b7-4363-8ade-a9f8c42db920\") " Oct 07 08:29:24 crc kubenswrapper[5025]: I1007 08:29:24.081587 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1e02ba47-76b7-4363-8ade-a9f8c42db920-service-ca\") pod \"1e02ba47-76b7-4363-8ade-a9f8c42db920\" (UID: \"1e02ba47-76b7-4363-8ade-a9f8c42db920\") " Oct 07 08:29:24 crc kubenswrapper[5025]: I1007 08:29:24.082463 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e02ba47-76b7-4363-8ade-a9f8c42db920-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "1e02ba47-76b7-4363-8ade-a9f8c42db920" (UID: "1e02ba47-76b7-4363-8ade-a9f8c42db920"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:29:24 crc kubenswrapper[5025]: I1007 08:29:24.082693 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e02ba47-76b7-4363-8ade-a9f8c42db920-console-config" (OuterVolumeSpecName: "console-config") pod "1e02ba47-76b7-4363-8ade-a9f8c42db920" (UID: "1e02ba47-76b7-4363-8ade-a9f8c42db920"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:29:24 crc kubenswrapper[5025]: I1007 08:29:24.082856 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e02ba47-76b7-4363-8ade-a9f8c42db920-service-ca" (OuterVolumeSpecName: "service-ca") pod "1e02ba47-76b7-4363-8ade-a9f8c42db920" (UID: "1e02ba47-76b7-4363-8ade-a9f8c42db920"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:29:24 crc kubenswrapper[5025]: I1007 08:29:24.083139 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e02ba47-76b7-4363-8ade-a9f8c42db920-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1e02ba47-76b7-4363-8ade-a9f8c42db920" (UID: "1e02ba47-76b7-4363-8ade-a9f8c42db920"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:29:24 crc kubenswrapper[5025]: I1007 08:29:24.086348 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e02ba47-76b7-4363-8ade-a9f8c42db920-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "1e02ba47-76b7-4363-8ade-a9f8c42db920" (UID: "1e02ba47-76b7-4363-8ade-a9f8c42db920"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:29:24 crc kubenswrapper[5025]: I1007 08:29:24.086349 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e02ba47-76b7-4363-8ade-a9f8c42db920-kube-api-access-r89h4" (OuterVolumeSpecName: "kube-api-access-r89h4") pod "1e02ba47-76b7-4363-8ade-a9f8c42db920" (UID: "1e02ba47-76b7-4363-8ade-a9f8c42db920"). InnerVolumeSpecName "kube-api-access-r89h4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:29:24 crc kubenswrapper[5025]: I1007 08:29:24.086433 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e02ba47-76b7-4363-8ade-a9f8c42db920-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "1e02ba47-76b7-4363-8ade-a9f8c42db920" (UID: "1e02ba47-76b7-4363-8ade-a9f8c42db920"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:29:24 crc kubenswrapper[5025]: I1007 08:29:24.183743 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r89h4\" (UniqueName: \"kubernetes.io/projected/1e02ba47-76b7-4363-8ade-a9f8c42db920-kube-api-access-r89h4\") on node \"crc\" DevicePath \"\"" Oct 07 08:29:24 crc kubenswrapper[5025]: I1007 08:29:24.183774 5025 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1e02ba47-76b7-4363-8ade-a9f8c42db920-service-ca\") on node \"crc\" DevicePath \"\"" Oct 07 08:29:24 crc kubenswrapper[5025]: I1007 08:29:24.183785 5025 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1e02ba47-76b7-4363-8ade-a9f8c42db920-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 08:29:24 crc kubenswrapper[5025]: I1007 08:29:24.183795 5025 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1e02ba47-76b7-4363-8ade-a9f8c42db920-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 07 08:29:24 crc kubenswrapper[5025]: I1007 08:29:24.183804 5025 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e02ba47-76b7-4363-8ade-a9f8c42db920-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:29:24 crc kubenswrapper[5025]: I1007 08:29:24.183813 5025 reconciler_common.go:293] "Volume detached for 
volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1e02ba47-76b7-4363-8ade-a9f8c42db920-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 08:29:24 crc kubenswrapper[5025]: I1007 08:29:24.183822 5025 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1e02ba47-76b7-4363-8ade-a9f8c42db920-console-config\") on node \"crc\" DevicePath \"\"" Oct 07 08:29:24 crc kubenswrapper[5025]: I1007 08:29:24.250048 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22zqf7"] Oct 07 08:29:24 crc kubenswrapper[5025]: I1007 08:29:24.663259 5025 generic.go:334] "Generic (PLEG): container finished" podID="a5712d23-aa3f-49ea-b448-af44d79c9701" containerID="7ff46b251e87a8feaf7bb335553fffa6623a8ff039040c9f121874d96ea433f5" exitCode=0 Oct 07 08:29:24 crc kubenswrapper[5025]: I1007 08:29:24.663297 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22zqf7" event={"ID":"a5712d23-aa3f-49ea-b448-af44d79c9701","Type":"ContainerDied","Data":"7ff46b251e87a8feaf7bb335553fffa6623a8ff039040c9f121874d96ea433f5"} Oct 07 08:29:24 crc kubenswrapper[5025]: I1007 08:29:24.664801 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22zqf7" event={"ID":"a5712d23-aa3f-49ea-b448-af44d79c9701","Type":"ContainerStarted","Data":"30f199678ecdc1864fd8f5a034b5b47f57bc6b1976d3d592f0cc4e254c354a12"} Oct 07 08:29:24 crc kubenswrapper[5025]: I1007 08:29:24.668111 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-hmlpq_1e02ba47-76b7-4363-8ade-a9f8c42db920/console/0.log" Oct 07 08:29:24 crc kubenswrapper[5025]: I1007 08:29:24.668243 5025 generic.go:334] "Generic (PLEG): container finished" 
podID="1e02ba47-76b7-4363-8ade-a9f8c42db920" containerID="28a36da72efb409e3db39a0bb4f1bcdc2a8cf04c11b687b17f4c9c1564618293" exitCode=2 Oct 07 08:29:24 crc kubenswrapper[5025]: I1007 08:29:24.668286 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-hmlpq" event={"ID":"1e02ba47-76b7-4363-8ade-a9f8c42db920","Type":"ContainerDied","Data":"28a36da72efb409e3db39a0bb4f1bcdc2a8cf04c11b687b17f4c9c1564618293"} Oct 07 08:29:24 crc kubenswrapper[5025]: I1007 08:29:24.668319 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-hmlpq" event={"ID":"1e02ba47-76b7-4363-8ade-a9f8c42db920","Type":"ContainerDied","Data":"4b692ce9d525bf8211568b36403ae73b1b9255c173388434978cc8130ca4f269"} Oct 07 08:29:24 crc kubenswrapper[5025]: I1007 08:29:24.668346 5025 scope.go:117] "RemoveContainer" containerID="28a36da72efb409e3db39a0bb4f1bcdc2a8cf04c11b687b17f4c9c1564618293" Oct 07 08:29:24 crc kubenswrapper[5025]: I1007 08:29:24.668512 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-hmlpq" Oct 07 08:29:24 crc kubenswrapper[5025]: I1007 08:29:24.700470 5025 scope.go:117] "RemoveContainer" containerID="28a36da72efb409e3db39a0bb4f1bcdc2a8cf04c11b687b17f4c9c1564618293" Oct 07 08:29:24 crc kubenswrapper[5025]: E1007 08:29:24.700977 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28a36da72efb409e3db39a0bb4f1bcdc2a8cf04c11b687b17f4c9c1564618293\": container with ID starting with 28a36da72efb409e3db39a0bb4f1bcdc2a8cf04c11b687b17f4c9c1564618293 not found: ID does not exist" containerID="28a36da72efb409e3db39a0bb4f1bcdc2a8cf04c11b687b17f4c9c1564618293" Oct 07 08:29:24 crc kubenswrapper[5025]: I1007 08:29:24.701019 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28a36da72efb409e3db39a0bb4f1bcdc2a8cf04c11b687b17f4c9c1564618293"} err="failed to get container status \"28a36da72efb409e3db39a0bb4f1bcdc2a8cf04c11b687b17f4c9c1564618293\": rpc error: code = NotFound desc = could not find container \"28a36da72efb409e3db39a0bb4f1bcdc2a8cf04c11b687b17f4c9c1564618293\": container with ID starting with 28a36da72efb409e3db39a0bb4f1bcdc2a8cf04c11b687b17f4c9c1564618293 not found: ID does not exist" Oct 07 08:29:24 crc kubenswrapper[5025]: I1007 08:29:24.757256 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-hmlpq"] Oct 07 08:29:24 crc kubenswrapper[5025]: I1007 08:29:24.760987 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-hmlpq"] Oct 07 08:29:25 crc kubenswrapper[5025]: I1007 08:29:25.925819 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e02ba47-76b7-4363-8ade-a9f8c42db920" path="/var/lib/kubelet/pods/1e02ba47-76b7-4363-8ade-a9f8c42db920/volumes" Oct 07 08:29:25 crc kubenswrapper[5025]: I1007 08:29:25.933932 5025 patch_prober.go:28] interesting 
pod/machine-config-daemon-2dj2t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 08:29:25 crc kubenswrapper[5025]: I1007 08:29:25.934005 5025 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 08:29:26 crc kubenswrapper[5025]: I1007 08:29:26.687743 5025 generic.go:334] "Generic (PLEG): container finished" podID="a5712d23-aa3f-49ea-b448-af44d79c9701" containerID="f55d66feb383f0931081f826e223c70f1859b942f128ad90a5a497f2fd9bc738" exitCode=0 Oct 07 08:29:26 crc kubenswrapper[5025]: I1007 08:29:26.687877 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22zqf7" event={"ID":"a5712d23-aa3f-49ea-b448-af44d79c9701","Type":"ContainerDied","Data":"f55d66feb383f0931081f826e223c70f1859b942f128ad90a5a497f2fd9bc738"} Oct 07 08:29:26 crc kubenswrapper[5025]: I1007 08:29:26.999781 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hp69w"] Oct 07 08:29:27 crc kubenswrapper[5025]: E1007 08:29:27.000083 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e02ba47-76b7-4363-8ade-a9f8c42db920" containerName="console" Oct 07 08:29:27 crc kubenswrapper[5025]: I1007 08:29:27.000102 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e02ba47-76b7-4363-8ade-a9f8c42db920" containerName="console" Oct 07 08:29:27 crc kubenswrapper[5025]: I1007 08:29:27.000261 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e02ba47-76b7-4363-8ade-a9f8c42db920" 
containerName="console" Oct 07 08:29:27 crc kubenswrapper[5025]: I1007 08:29:27.001424 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hp69w" Oct 07 08:29:27 crc kubenswrapper[5025]: I1007 08:29:27.010472 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hp69w"] Oct 07 08:29:27 crc kubenswrapper[5025]: I1007 08:29:27.023770 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/203f4879-a9ab-4883-81cd-3d7099180cf7-utilities\") pod \"redhat-operators-hp69w\" (UID: \"203f4879-a9ab-4883-81cd-3d7099180cf7\") " pod="openshift-marketplace/redhat-operators-hp69w" Oct 07 08:29:27 crc kubenswrapper[5025]: I1007 08:29:27.024172 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph59c\" (UniqueName: \"kubernetes.io/projected/203f4879-a9ab-4883-81cd-3d7099180cf7-kube-api-access-ph59c\") pod \"redhat-operators-hp69w\" (UID: \"203f4879-a9ab-4883-81cd-3d7099180cf7\") " pod="openshift-marketplace/redhat-operators-hp69w" Oct 07 08:29:27 crc kubenswrapper[5025]: I1007 08:29:27.024228 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/203f4879-a9ab-4883-81cd-3d7099180cf7-catalog-content\") pod \"redhat-operators-hp69w\" (UID: \"203f4879-a9ab-4883-81cd-3d7099180cf7\") " pod="openshift-marketplace/redhat-operators-hp69w" Oct 07 08:29:27 crc kubenswrapper[5025]: I1007 08:29:27.126317 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph59c\" (UniqueName: \"kubernetes.io/projected/203f4879-a9ab-4883-81cd-3d7099180cf7-kube-api-access-ph59c\") pod \"redhat-operators-hp69w\" (UID: \"203f4879-a9ab-4883-81cd-3d7099180cf7\") " 
pod="openshift-marketplace/redhat-operators-hp69w" Oct 07 08:29:27 crc kubenswrapper[5025]: I1007 08:29:27.126485 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/203f4879-a9ab-4883-81cd-3d7099180cf7-catalog-content\") pod \"redhat-operators-hp69w\" (UID: \"203f4879-a9ab-4883-81cd-3d7099180cf7\") " pod="openshift-marketplace/redhat-operators-hp69w" Oct 07 08:29:27 crc kubenswrapper[5025]: I1007 08:29:27.126682 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/203f4879-a9ab-4883-81cd-3d7099180cf7-utilities\") pod \"redhat-operators-hp69w\" (UID: \"203f4879-a9ab-4883-81cd-3d7099180cf7\") " pod="openshift-marketplace/redhat-operators-hp69w" Oct 07 08:29:27 crc kubenswrapper[5025]: I1007 08:29:27.127140 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/203f4879-a9ab-4883-81cd-3d7099180cf7-catalog-content\") pod \"redhat-operators-hp69w\" (UID: \"203f4879-a9ab-4883-81cd-3d7099180cf7\") " pod="openshift-marketplace/redhat-operators-hp69w" Oct 07 08:29:27 crc kubenswrapper[5025]: I1007 08:29:27.127224 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/203f4879-a9ab-4883-81cd-3d7099180cf7-utilities\") pod \"redhat-operators-hp69w\" (UID: \"203f4879-a9ab-4883-81cd-3d7099180cf7\") " pod="openshift-marketplace/redhat-operators-hp69w" Oct 07 08:29:27 crc kubenswrapper[5025]: I1007 08:29:27.146640 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph59c\" (UniqueName: \"kubernetes.io/projected/203f4879-a9ab-4883-81cd-3d7099180cf7-kube-api-access-ph59c\") pod \"redhat-operators-hp69w\" (UID: \"203f4879-a9ab-4883-81cd-3d7099180cf7\") " pod="openshift-marketplace/redhat-operators-hp69w" Oct 07 08:29:27 crc 
kubenswrapper[5025]: I1007 08:29:27.361000 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hp69w" Oct 07 08:29:27 crc kubenswrapper[5025]: I1007 08:29:27.693575 5025 generic.go:334] "Generic (PLEG): container finished" podID="a5712d23-aa3f-49ea-b448-af44d79c9701" containerID="877bcd5d847dc61b3b34b87f826c9beaa2a211dd1af7ef4b23c80c4f09208cc6" exitCode=0 Oct 07 08:29:27 crc kubenswrapper[5025]: I1007 08:29:27.693619 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22zqf7" event={"ID":"a5712d23-aa3f-49ea-b448-af44d79c9701","Type":"ContainerDied","Data":"877bcd5d847dc61b3b34b87f826c9beaa2a211dd1af7ef4b23c80c4f09208cc6"} Oct 07 08:29:27 crc kubenswrapper[5025]: I1007 08:29:27.773331 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hp69w"] Oct 07 08:29:28 crc kubenswrapper[5025]: I1007 08:29:28.702153 5025 generic.go:334] "Generic (PLEG): container finished" podID="203f4879-a9ab-4883-81cd-3d7099180cf7" containerID="29b9230e22c166eb2ee55880584c98912fb1fa541bb2f133e1e5d928bd01ace1" exitCode=0 Oct 07 08:29:28 crc kubenswrapper[5025]: I1007 08:29:28.702250 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hp69w" event={"ID":"203f4879-a9ab-4883-81cd-3d7099180cf7","Type":"ContainerDied","Data":"29b9230e22c166eb2ee55880584c98912fb1fa541bb2f133e1e5d928bd01ace1"} Oct 07 08:29:28 crc kubenswrapper[5025]: I1007 08:29:28.702623 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hp69w" event={"ID":"203f4879-a9ab-4883-81cd-3d7099180cf7","Type":"ContainerStarted","Data":"bdbf1caa9f14d5257127ee0cebf5c4ee7552f5e73f46d724c5ae8524c7332ee2"} Oct 07 08:29:29 crc kubenswrapper[5025]: I1007 08:29:29.080625 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22zqf7" Oct 07 08:29:29 crc kubenswrapper[5025]: I1007 08:29:29.149528 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a5712d23-aa3f-49ea-b448-af44d79c9701-util\") pod \"a5712d23-aa3f-49ea-b448-af44d79c9701\" (UID: \"a5712d23-aa3f-49ea-b448-af44d79c9701\") " Oct 07 08:29:29 crc kubenswrapper[5025]: I1007 08:29:29.149599 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a5712d23-aa3f-49ea-b448-af44d79c9701-bundle\") pod \"a5712d23-aa3f-49ea-b448-af44d79c9701\" (UID: \"a5712d23-aa3f-49ea-b448-af44d79c9701\") " Oct 07 08:29:29 crc kubenswrapper[5025]: I1007 08:29:29.149640 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78wg6\" (UniqueName: \"kubernetes.io/projected/a5712d23-aa3f-49ea-b448-af44d79c9701-kube-api-access-78wg6\") pod \"a5712d23-aa3f-49ea-b448-af44d79c9701\" (UID: \"a5712d23-aa3f-49ea-b448-af44d79c9701\") " Oct 07 08:29:29 crc kubenswrapper[5025]: I1007 08:29:29.152085 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5712d23-aa3f-49ea-b448-af44d79c9701-bundle" (OuterVolumeSpecName: "bundle") pod "a5712d23-aa3f-49ea-b448-af44d79c9701" (UID: "a5712d23-aa3f-49ea-b448-af44d79c9701"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:29:29 crc kubenswrapper[5025]: I1007 08:29:29.157668 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5712d23-aa3f-49ea-b448-af44d79c9701-kube-api-access-78wg6" (OuterVolumeSpecName: "kube-api-access-78wg6") pod "a5712d23-aa3f-49ea-b448-af44d79c9701" (UID: "a5712d23-aa3f-49ea-b448-af44d79c9701"). InnerVolumeSpecName "kube-api-access-78wg6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:29:29 crc kubenswrapper[5025]: I1007 08:29:29.163023 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5712d23-aa3f-49ea-b448-af44d79c9701-util" (OuterVolumeSpecName: "util") pod "a5712d23-aa3f-49ea-b448-af44d79c9701" (UID: "a5712d23-aa3f-49ea-b448-af44d79c9701"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:29:29 crc kubenswrapper[5025]: I1007 08:29:29.251132 5025 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a5712d23-aa3f-49ea-b448-af44d79c9701-util\") on node \"crc\" DevicePath \"\"" Oct 07 08:29:29 crc kubenswrapper[5025]: I1007 08:29:29.251187 5025 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a5712d23-aa3f-49ea-b448-af44d79c9701-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:29:29 crc kubenswrapper[5025]: I1007 08:29:29.251209 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78wg6\" (UniqueName: \"kubernetes.io/projected/a5712d23-aa3f-49ea-b448-af44d79c9701-kube-api-access-78wg6\") on node \"crc\" DevicePath \"\"" Oct 07 08:29:29 crc kubenswrapper[5025]: I1007 08:29:29.721015 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22zqf7" event={"ID":"a5712d23-aa3f-49ea-b448-af44d79c9701","Type":"ContainerDied","Data":"30f199678ecdc1864fd8f5a034b5b47f57bc6b1976d3d592f0cc4e254c354a12"} Oct 07 08:29:29 crc kubenswrapper[5025]: I1007 08:29:29.723243 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30f199678ecdc1864fd8f5a034b5b47f57bc6b1976d3d592f0cc4e254c354a12" Oct 07 08:29:29 crc kubenswrapper[5025]: I1007 08:29:29.721043 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22zqf7" Oct 07 08:29:29 crc kubenswrapper[5025]: I1007 08:29:29.725312 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hp69w" event={"ID":"203f4879-a9ab-4883-81cd-3d7099180cf7","Type":"ContainerStarted","Data":"ead1922516479695fa501795da6e6e17aceada14120b25eca0158d33d4007482"} Oct 07 08:29:30 crc kubenswrapper[5025]: I1007 08:29:30.731444 5025 generic.go:334] "Generic (PLEG): container finished" podID="203f4879-a9ab-4883-81cd-3d7099180cf7" containerID="ead1922516479695fa501795da6e6e17aceada14120b25eca0158d33d4007482" exitCode=0 Oct 07 08:29:30 crc kubenswrapper[5025]: I1007 08:29:30.731507 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hp69w" event={"ID":"203f4879-a9ab-4883-81cd-3d7099180cf7","Type":"ContainerDied","Data":"ead1922516479695fa501795da6e6e17aceada14120b25eca0158d33d4007482"} Oct 07 08:29:31 crc kubenswrapper[5025]: I1007 08:29:31.739558 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hp69w" event={"ID":"203f4879-a9ab-4883-81cd-3d7099180cf7","Type":"ContainerStarted","Data":"eda7ec3c5155a3468463dc2afb53decb6f2cb515fd7f1fc3e703520567895f96"} Oct 07 08:29:31 crc kubenswrapper[5025]: I1007 08:29:31.757621 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hp69w" podStartSLOduration=3.220160021 podStartE2EDuration="5.757605432s" podCreationTimestamp="2025-10-07 08:29:26 +0000 UTC" firstStartedPulling="2025-10-07 08:29:28.704405039 +0000 UTC m=+775.513719193" lastFinishedPulling="2025-10-07 08:29:31.24185046 +0000 UTC m=+778.051164604" observedRunningTime="2025-10-07 08:29:31.757069764 +0000 UTC m=+778.566383908" watchObservedRunningTime="2025-10-07 08:29:31.757605432 +0000 UTC m=+778.566919576" Oct 07 08:29:37 crc 
kubenswrapper[5025]: I1007 08:29:37.361994 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hp69w" Oct 07 08:29:37 crc kubenswrapper[5025]: I1007 08:29:37.362701 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hp69w" Oct 07 08:29:37 crc kubenswrapper[5025]: I1007 08:29:37.411234 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hp69w" Oct 07 08:29:37 crc kubenswrapper[5025]: I1007 08:29:37.810163 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hp69w" Oct 07 08:29:38 crc kubenswrapper[5025]: I1007 08:29:38.050905 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-668569d786-l8897"] Oct 07 08:29:38 crc kubenswrapper[5025]: E1007 08:29:38.051155 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5712d23-aa3f-49ea-b448-af44d79c9701" containerName="util" Oct 07 08:29:38 crc kubenswrapper[5025]: I1007 08:29:38.051169 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5712d23-aa3f-49ea-b448-af44d79c9701" containerName="util" Oct 07 08:29:38 crc kubenswrapper[5025]: E1007 08:29:38.051182 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5712d23-aa3f-49ea-b448-af44d79c9701" containerName="extract" Oct 07 08:29:38 crc kubenswrapper[5025]: I1007 08:29:38.051189 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5712d23-aa3f-49ea-b448-af44d79c9701" containerName="extract" Oct 07 08:29:38 crc kubenswrapper[5025]: E1007 08:29:38.051199 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5712d23-aa3f-49ea-b448-af44d79c9701" containerName="pull" Oct 07 08:29:38 crc kubenswrapper[5025]: I1007 08:29:38.051207 5025 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a5712d23-aa3f-49ea-b448-af44d79c9701" containerName="pull" Oct 07 08:29:38 crc kubenswrapper[5025]: I1007 08:29:38.051330 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5712d23-aa3f-49ea-b448-af44d79c9701" containerName="extract" Oct 07 08:29:38 crc kubenswrapper[5025]: I1007 08:29:38.051794 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-668569d786-l8897" Oct 07 08:29:38 crc kubenswrapper[5025]: I1007 08:29:38.054178 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 07 08:29:38 crc kubenswrapper[5025]: I1007 08:29:38.054597 5025 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 07 08:29:38 crc kubenswrapper[5025]: I1007 08:29:38.055721 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 07 08:29:38 crc kubenswrapper[5025]: I1007 08:29:38.055898 5025 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 07 08:29:38 crc kubenswrapper[5025]: I1007 08:29:38.056160 5025 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-8mc2d" Oct 07 08:29:38 crc kubenswrapper[5025]: I1007 08:29:38.059365 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/95984a2b-2cba-401a-9b57-0ded2af5b4c8-apiservice-cert\") pod \"metallb-operator-controller-manager-668569d786-l8897\" (UID: \"95984a2b-2cba-401a-9b57-0ded2af5b4c8\") " pod="metallb-system/metallb-operator-controller-manager-668569d786-l8897" Oct 07 08:29:38 crc kubenswrapper[5025]: I1007 08:29:38.059608 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/95984a2b-2cba-401a-9b57-0ded2af5b4c8-webhook-cert\") pod \"metallb-operator-controller-manager-668569d786-l8897\" (UID: \"95984a2b-2cba-401a-9b57-0ded2af5b4c8\") " pod="metallb-system/metallb-operator-controller-manager-668569d786-l8897" Oct 07 08:29:38 crc kubenswrapper[5025]: I1007 08:29:38.059808 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj6lp\" (UniqueName: \"kubernetes.io/projected/95984a2b-2cba-401a-9b57-0ded2af5b4c8-kube-api-access-hj6lp\") pod \"metallb-operator-controller-manager-668569d786-l8897\" (UID: \"95984a2b-2cba-401a-9b57-0ded2af5b4c8\") " pod="metallb-system/metallb-operator-controller-manager-668569d786-l8897" Oct 07 08:29:38 crc kubenswrapper[5025]: I1007 08:29:38.069068 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-668569d786-l8897"] Oct 07 08:29:38 crc kubenswrapper[5025]: I1007 08:29:38.160831 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hj6lp\" (UniqueName: \"kubernetes.io/projected/95984a2b-2cba-401a-9b57-0ded2af5b4c8-kube-api-access-hj6lp\") pod \"metallb-operator-controller-manager-668569d786-l8897\" (UID: \"95984a2b-2cba-401a-9b57-0ded2af5b4c8\") " pod="metallb-system/metallb-operator-controller-manager-668569d786-l8897" Oct 07 08:29:38 crc kubenswrapper[5025]: I1007 08:29:38.160918 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/95984a2b-2cba-401a-9b57-0ded2af5b4c8-apiservice-cert\") pod \"metallb-operator-controller-manager-668569d786-l8897\" (UID: \"95984a2b-2cba-401a-9b57-0ded2af5b4c8\") " pod="metallb-system/metallb-operator-controller-manager-668569d786-l8897" Oct 07 08:29:38 crc kubenswrapper[5025]: I1007 08:29:38.160939 5025 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/95984a2b-2cba-401a-9b57-0ded2af5b4c8-webhook-cert\") pod \"metallb-operator-controller-manager-668569d786-l8897\" (UID: \"95984a2b-2cba-401a-9b57-0ded2af5b4c8\") " pod="metallb-system/metallb-operator-controller-manager-668569d786-l8897" Oct 07 08:29:38 crc kubenswrapper[5025]: I1007 08:29:38.168193 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/95984a2b-2cba-401a-9b57-0ded2af5b4c8-apiservice-cert\") pod \"metallb-operator-controller-manager-668569d786-l8897\" (UID: \"95984a2b-2cba-401a-9b57-0ded2af5b4c8\") " pod="metallb-system/metallb-operator-controller-manager-668569d786-l8897" Oct 07 08:29:38 crc kubenswrapper[5025]: I1007 08:29:38.170573 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/95984a2b-2cba-401a-9b57-0ded2af5b4c8-webhook-cert\") pod \"metallb-operator-controller-manager-668569d786-l8897\" (UID: \"95984a2b-2cba-401a-9b57-0ded2af5b4c8\") " pod="metallb-system/metallb-operator-controller-manager-668569d786-l8897" Oct 07 08:29:38 crc kubenswrapper[5025]: I1007 08:29:38.190255 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj6lp\" (UniqueName: \"kubernetes.io/projected/95984a2b-2cba-401a-9b57-0ded2af5b4c8-kube-api-access-hj6lp\") pod \"metallb-operator-controller-manager-668569d786-l8897\" (UID: \"95984a2b-2cba-401a-9b57-0ded2af5b4c8\") " pod="metallb-system/metallb-operator-controller-manager-668569d786-l8897" Oct 07 08:29:38 crc kubenswrapper[5025]: I1007 08:29:38.367507 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-668569d786-l8897" Oct 07 08:29:38 crc kubenswrapper[5025]: I1007 08:29:38.387845 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-68dd66dd7d-xmmpg"] Oct 07 08:29:38 crc kubenswrapper[5025]: I1007 08:29:38.388682 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-68dd66dd7d-xmmpg" Oct 07 08:29:38 crc kubenswrapper[5025]: I1007 08:29:38.390685 5025 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 07 08:29:38 crc kubenswrapper[5025]: I1007 08:29:38.390836 5025 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 07 08:29:38 crc kubenswrapper[5025]: I1007 08:29:38.391561 5025 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-69ptl" Oct 07 08:29:38 crc kubenswrapper[5025]: I1007 08:29:38.446108 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-68dd66dd7d-xmmpg"] Oct 07 08:29:38 crc kubenswrapper[5025]: I1007 08:29:38.468691 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c884f9d5-90bb-4ff6-857f-6ffc9d973d7a-apiservice-cert\") pod \"metallb-operator-webhook-server-68dd66dd7d-xmmpg\" (UID: \"c884f9d5-90bb-4ff6-857f-6ffc9d973d7a\") " pod="metallb-system/metallb-operator-webhook-server-68dd66dd7d-xmmpg" Oct 07 08:29:38 crc kubenswrapper[5025]: I1007 08:29:38.468742 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxrhv\" (UniqueName: \"kubernetes.io/projected/c884f9d5-90bb-4ff6-857f-6ffc9d973d7a-kube-api-access-sxrhv\") pod 
\"metallb-operator-webhook-server-68dd66dd7d-xmmpg\" (UID: \"c884f9d5-90bb-4ff6-857f-6ffc9d973d7a\") " pod="metallb-system/metallb-operator-webhook-server-68dd66dd7d-xmmpg" Oct 07 08:29:38 crc kubenswrapper[5025]: I1007 08:29:38.468915 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c884f9d5-90bb-4ff6-857f-6ffc9d973d7a-webhook-cert\") pod \"metallb-operator-webhook-server-68dd66dd7d-xmmpg\" (UID: \"c884f9d5-90bb-4ff6-857f-6ffc9d973d7a\") " pod="metallb-system/metallb-operator-webhook-server-68dd66dd7d-xmmpg" Oct 07 08:29:38 crc kubenswrapper[5025]: I1007 08:29:38.571027 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c884f9d5-90bb-4ff6-857f-6ffc9d973d7a-webhook-cert\") pod \"metallb-operator-webhook-server-68dd66dd7d-xmmpg\" (UID: \"c884f9d5-90bb-4ff6-857f-6ffc9d973d7a\") " pod="metallb-system/metallb-operator-webhook-server-68dd66dd7d-xmmpg" Oct 07 08:29:38 crc kubenswrapper[5025]: I1007 08:29:38.571126 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c884f9d5-90bb-4ff6-857f-6ffc9d973d7a-apiservice-cert\") pod \"metallb-operator-webhook-server-68dd66dd7d-xmmpg\" (UID: \"c884f9d5-90bb-4ff6-857f-6ffc9d973d7a\") " pod="metallb-system/metallb-operator-webhook-server-68dd66dd7d-xmmpg" Oct 07 08:29:38 crc kubenswrapper[5025]: I1007 08:29:38.571158 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxrhv\" (UniqueName: \"kubernetes.io/projected/c884f9d5-90bb-4ff6-857f-6ffc9d973d7a-kube-api-access-sxrhv\") pod \"metallb-operator-webhook-server-68dd66dd7d-xmmpg\" (UID: \"c884f9d5-90bb-4ff6-857f-6ffc9d973d7a\") " pod="metallb-system/metallb-operator-webhook-server-68dd66dd7d-xmmpg" Oct 07 08:29:38 crc kubenswrapper[5025]: I1007 08:29:38.579222 5025 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c884f9d5-90bb-4ff6-857f-6ffc9d973d7a-apiservice-cert\") pod \"metallb-operator-webhook-server-68dd66dd7d-xmmpg\" (UID: \"c884f9d5-90bb-4ff6-857f-6ffc9d973d7a\") " pod="metallb-system/metallb-operator-webhook-server-68dd66dd7d-xmmpg" Oct 07 08:29:38 crc kubenswrapper[5025]: I1007 08:29:38.579222 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c884f9d5-90bb-4ff6-857f-6ffc9d973d7a-webhook-cert\") pod \"metallb-operator-webhook-server-68dd66dd7d-xmmpg\" (UID: \"c884f9d5-90bb-4ff6-857f-6ffc9d973d7a\") " pod="metallb-system/metallb-operator-webhook-server-68dd66dd7d-xmmpg" Oct 07 08:29:38 crc kubenswrapper[5025]: I1007 08:29:38.586571 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxrhv\" (UniqueName: \"kubernetes.io/projected/c884f9d5-90bb-4ff6-857f-6ffc9d973d7a-kube-api-access-sxrhv\") pod \"metallb-operator-webhook-server-68dd66dd7d-xmmpg\" (UID: \"c884f9d5-90bb-4ff6-857f-6ffc9d973d7a\") " pod="metallb-system/metallb-operator-webhook-server-68dd66dd7d-xmmpg" Oct 07 08:29:38 crc kubenswrapper[5025]: I1007 08:29:38.732206 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-68dd66dd7d-xmmpg" Oct 07 08:29:38 crc kubenswrapper[5025]: I1007 08:29:38.860756 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-668569d786-l8897"] Oct 07 08:29:38 crc kubenswrapper[5025]: W1007 08:29:38.868789 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95984a2b_2cba_401a_9b57_0ded2af5b4c8.slice/crio-cc38a39742d3a0d6e1701d6ad5bde349fc8d5796ca8176d17ec9891ed47aa67c WatchSource:0}: Error finding container cc38a39742d3a0d6e1701d6ad5bde349fc8d5796ca8176d17ec9891ed47aa67c: Status 404 returned error can't find the container with id cc38a39742d3a0d6e1701d6ad5bde349fc8d5796ca8176d17ec9891ed47aa67c Oct 07 08:29:39 crc kubenswrapper[5025]: I1007 08:29:39.193260 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-68dd66dd7d-xmmpg"] Oct 07 08:29:39 crc kubenswrapper[5025]: W1007 08:29:39.202800 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc884f9d5_90bb_4ff6_857f_6ffc9d973d7a.slice/crio-b27797835866eef731187a204dc7a4331ba38c895a9ce5d6bdec16fa40e2889c WatchSource:0}: Error finding container b27797835866eef731187a204dc7a4331ba38c895a9ce5d6bdec16fa40e2889c: Status 404 returned error can't find the container with id b27797835866eef731187a204dc7a4331ba38c895a9ce5d6bdec16fa40e2889c Oct 07 08:29:39 crc kubenswrapper[5025]: I1007 08:29:39.778223 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-668569d786-l8897" event={"ID":"95984a2b-2cba-401a-9b57-0ded2af5b4c8","Type":"ContainerStarted","Data":"cc38a39742d3a0d6e1701d6ad5bde349fc8d5796ca8176d17ec9891ed47aa67c"} Oct 07 08:29:39 crc kubenswrapper[5025]: I1007 08:29:39.781021 5025 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-68dd66dd7d-xmmpg" event={"ID":"c884f9d5-90bb-4ff6-857f-6ffc9d973d7a","Type":"ContainerStarted","Data":"b27797835866eef731187a204dc7a4331ba38c895a9ce5d6bdec16fa40e2889c"} Oct 07 08:29:40 crc kubenswrapper[5025]: I1007 08:29:40.991826 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hp69w"] Oct 07 08:29:40 crc kubenswrapper[5025]: I1007 08:29:40.992032 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hp69w" podUID="203f4879-a9ab-4883-81cd-3d7099180cf7" containerName="registry-server" containerID="cri-o://eda7ec3c5155a3468463dc2afb53decb6f2cb515fd7f1fc3e703520567895f96" gracePeriod=2 Oct 07 08:29:41 crc kubenswrapper[5025]: I1007 08:29:41.387940 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hp69w" Oct 07 08:29:41 crc kubenswrapper[5025]: I1007 08:29:41.521888 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/203f4879-a9ab-4883-81cd-3d7099180cf7-catalog-content\") pod \"203f4879-a9ab-4883-81cd-3d7099180cf7\" (UID: \"203f4879-a9ab-4883-81cd-3d7099180cf7\") " Oct 07 08:29:41 crc kubenswrapper[5025]: I1007 08:29:41.521968 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ph59c\" (UniqueName: \"kubernetes.io/projected/203f4879-a9ab-4883-81cd-3d7099180cf7-kube-api-access-ph59c\") pod \"203f4879-a9ab-4883-81cd-3d7099180cf7\" (UID: \"203f4879-a9ab-4883-81cd-3d7099180cf7\") " Oct 07 08:29:41 crc kubenswrapper[5025]: I1007 08:29:41.522030 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/203f4879-a9ab-4883-81cd-3d7099180cf7-utilities\") pod \"203f4879-a9ab-4883-81cd-3d7099180cf7\" 
(UID: \"203f4879-a9ab-4883-81cd-3d7099180cf7\") " Oct 07 08:29:41 crc kubenswrapper[5025]: I1007 08:29:41.523003 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/203f4879-a9ab-4883-81cd-3d7099180cf7-utilities" (OuterVolumeSpecName: "utilities") pod "203f4879-a9ab-4883-81cd-3d7099180cf7" (UID: "203f4879-a9ab-4883-81cd-3d7099180cf7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:29:41 crc kubenswrapper[5025]: I1007 08:29:41.527280 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/203f4879-a9ab-4883-81cd-3d7099180cf7-kube-api-access-ph59c" (OuterVolumeSpecName: "kube-api-access-ph59c") pod "203f4879-a9ab-4883-81cd-3d7099180cf7" (UID: "203f4879-a9ab-4883-81cd-3d7099180cf7"). InnerVolumeSpecName "kube-api-access-ph59c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:29:41 crc kubenswrapper[5025]: I1007 08:29:41.596525 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/203f4879-a9ab-4883-81cd-3d7099180cf7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "203f4879-a9ab-4883-81cd-3d7099180cf7" (UID: "203f4879-a9ab-4883-81cd-3d7099180cf7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:29:41 crc kubenswrapper[5025]: I1007 08:29:41.623835 5025 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/203f4879-a9ab-4883-81cd-3d7099180cf7-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 08:29:41 crc kubenswrapper[5025]: I1007 08:29:41.623870 5025 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/203f4879-a9ab-4883-81cd-3d7099180cf7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 08:29:41 crc kubenswrapper[5025]: I1007 08:29:41.623881 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ph59c\" (UniqueName: \"kubernetes.io/projected/203f4879-a9ab-4883-81cd-3d7099180cf7-kube-api-access-ph59c\") on node \"crc\" DevicePath \"\"" Oct 07 08:29:41 crc kubenswrapper[5025]: I1007 08:29:41.793210 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-668569d786-l8897" event={"ID":"95984a2b-2cba-401a-9b57-0ded2af5b4c8","Type":"ContainerStarted","Data":"b694e05766fa6e9308e5f3bd28e64fc9adc2e1ab3262d6a9eb3709cf5257e8e2"} Oct 07 08:29:41 crc kubenswrapper[5025]: I1007 08:29:41.793353 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-668569d786-l8897" Oct 07 08:29:41 crc kubenswrapper[5025]: I1007 08:29:41.795590 5025 generic.go:334] "Generic (PLEG): container finished" podID="203f4879-a9ab-4883-81cd-3d7099180cf7" containerID="eda7ec3c5155a3468463dc2afb53decb6f2cb515fd7f1fc3e703520567895f96" exitCode=0 Oct 07 08:29:41 crc kubenswrapper[5025]: I1007 08:29:41.795627 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hp69w" event={"ID":"203f4879-a9ab-4883-81cd-3d7099180cf7","Type":"ContainerDied","Data":"eda7ec3c5155a3468463dc2afb53decb6f2cb515fd7f1fc3e703520567895f96"} Oct 07 
08:29:41 crc kubenswrapper[5025]: I1007 08:29:41.795649 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hp69w" event={"ID":"203f4879-a9ab-4883-81cd-3d7099180cf7","Type":"ContainerDied","Data":"bdbf1caa9f14d5257127ee0cebf5c4ee7552f5e73f46d724c5ae8524c7332ee2"} Oct 07 08:29:41 crc kubenswrapper[5025]: I1007 08:29:41.795669 5025 scope.go:117] "RemoveContainer" containerID="eda7ec3c5155a3468463dc2afb53decb6f2cb515fd7f1fc3e703520567895f96" Oct 07 08:29:41 crc kubenswrapper[5025]: I1007 08:29:41.795688 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hp69w" Oct 07 08:29:41 crc kubenswrapper[5025]: I1007 08:29:41.814936 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-668569d786-l8897" podStartSLOduration=1.511922271 podStartE2EDuration="3.81491486s" podCreationTimestamp="2025-10-07 08:29:38 +0000 UTC" firstStartedPulling="2025-10-07 08:29:38.883161388 +0000 UTC m=+785.692475532" lastFinishedPulling="2025-10-07 08:29:41.186153977 +0000 UTC m=+787.995468121" observedRunningTime="2025-10-07 08:29:41.808121586 +0000 UTC m=+788.617435730" watchObservedRunningTime="2025-10-07 08:29:41.81491486 +0000 UTC m=+788.624229004" Oct 07 08:29:41 crc kubenswrapper[5025]: I1007 08:29:41.836879 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hp69w"] Oct 07 08:29:41 crc kubenswrapper[5025]: I1007 08:29:41.840380 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hp69w"] Oct 07 08:29:41 crc kubenswrapper[5025]: I1007 08:29:41.927887 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="203f4879-a9ab-4883-81cd-3d7099180cf7" path="/var/lib/kubelet/pods/203f4879-a9ab-4883-81cd-3d7099180cf7/volumes" Oct 07 08:29:43 crc kubenswrapper[5025]: I1007 08:29:43.016053 5025 
scope.go:117] "RemoveContainer" containerID="ead1922516479695fa501795da6e6e17aceada14120b25eca0158d33d4007482" Oct 07 08:29:43 crc kubenswrapper[5025]: I1007 08:29:43.037262 5025 scope.go:117] "RemoveContainer" containerID="29b9230e22c166eb2ee55880584c98912fb1fa541bb2f133e1e5d928bd01ace1" Oct 07 08:29:43 crc kubenswrapper[5025]: I1007 08:29:43.090179 5025 scope.go:117] "RemoveContainer" containerID="eda7ec3c5155a3468463dc2afb53decb6f2cb515fd7f1fc3e703520567895f96" Oct 07 08:29:43 crc kubenswrapper[5025]: E1007 08:29:43.090749 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eda7ec3c5155a3468463dc2afb53decb6f2cb515fd7f1fc3e703520567895f96\": container with ID starting with eda7ec3c5155a3468463dc2afb53decb6f2cb515fd7f1fc3e703520567895f96 not found: ID does not exist" containerID="eda7ec3c5155a3468463dc2afb53decb6f2cb515fd7f1fc3e703520567895f96" Oct 07 08:29:43 crc kubenswrapper[5025]: I1007 08:29:43.090783 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eda7ec3c5155a3468463dc2afb53decb6f2cb515fd7f1fc3e703520567895f96"} err="failed to get container status \"eda7ec3c5155a3468463dc2afb53decb6f2cb515fd7f1fc3e703520567895f96\": rpc error: code = NotFound desc = could not find container \"eda7ec3c5155a3468463dc2afb53decb6f2cb515fd7f1fc3e703520567895f96\": container with ID starting with eda7ec3c5155a3468463dc2afb53decb6f2cb515fd7f1fc3e703520567895f96 not found: ID does not exist" Oct 07 08:29:43 crc kubenswrapper[5025]: I1007 08:29:43.090807 5025 scope.go:117] "RemoveContainer" containerID="ead1922516479695fa501795da6e6e17aceada14120b25eca0158d33d4007482" Oct 07 08:29:43 crc kubenswrapper[5025]: E1007 08:29:43.091134 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ead1922516479695fa501795da6e6e17aceada14120b25eca0158d33d4007482\": container with ID starting with 
ead1922516479695fa501795da6e6e17aceada14120b25eca0158d33d4007482 not found: ID does not exist" containerID="ead1922516479695fa501795da6e6e17aceada14120b25eca0158d33d4007482" Oct 07 08:29:43 crc kubenswrapper[5025]: I1007 08:29:43.091172 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ead1922516479695fa501795da6e6e17aceada14120b25eca0158d33d4007482"} err="failed to get container status \"ead1922516479695fa501795da6e6e17aceada14120b25eca0158d33d4007482\": rpc error: code = NotFound desc = could not find container \"ead1922516479695fa501795da6e6e17aceada14120b25eca0158d33d4007482\": container with ID starting with ead1922516479695fa501795da6e6e17aceada14120b25eca0158d33d4007482 not found: ID does not exist" Oct 07 08:29:43 crc kubenswrapper[5025]: I1007 08:29:43.091199 5025 scope.go:117] "RemoveContainer" containerID="29b9230e22c166eb2ee55880584c98912fb1fa541bb2f133e1e5d928bd01ace1" Oct 07 08:29:43 crc kubenswrapper[5025]: E1007 08:29:43.091560 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29b9230e22c166eb2ee55880584c98912fb1fa541bb2f133e1e5d928bd01ace1\": container with ID starting with 29b9230e22c166eb2ee55880584c98912fb1fa541bb2f133e1e5d928bd01ace1 not found: ID does not exist" containerID="29b9230e22c166eb2ee55880584c98912fb1fa541bb2f133e1e5d928bd01ace1" Oct 07 08:29:43 crc kubenswrapper[5025]: I1007 08:29:43.091610 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29b9230e22c166eb2ee55880584c98912fb1fa541bb2f133e1e5d928bd01ace1"} err="failed to get container status \"29b9230e22c166eb2ee55880584c98912fb1fa541bb2f133e1e5d928bd01ace1\": rpc error: code = NotFound desc = could not find container \"29b9230e22c166eb2ee55880584c98912fb1fa541bb2f133e1e5d928bd01ace1\": container with ID starting with 29b9230e22c166eb2ee55880584c98912fb1fa541bb2f133e1e5d928bd01ace1 not found: ID does not 
exist" Oct 07 08:29:43 crc kubenswrapper[5025]: I1007 08:29:43.809464 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-68dd66dd7d-xmmpg" event={"ID":"c884f9d5-90bb-4ff6-857f-6ffc9d973d7a","Type":"ContainerStarted","Data":"733abda846f170ad710f81597aeb21beb80442922475365745c74b47995f6445"} Oct 07 08:29:43 crc kubenswrapper[5025]: I1007 08:29:43.810033 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-68dd66dd7d-xmmpg" Oct 07 08:29:43 crc kubenswrapper[5025]: I1007 08:29:43.858266 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-68dd66dd7d-xmmpg" podStartSLOduration=1.9496712550000002 podStartE2EDuration="5.858248143s" podCreationTimestamp="2025-10-07 08:29:38 +0000 UTC" firstStartedPulling="2025-10-07 08:29:39.205923888 +0000 UTC m=+786.015238072" lastFinishedPulling="2025-10-07 08:29:43.114500816 +0000 UTC m=+789.923814960" observedRunningTime="2025-10-07 08:29:43.856997604 +0000 UTC m=+790.666311748" watchObservedRunningTime="2025-10-07 08:29:43.858248143 +0000 UTC m=+790.667562287" Oct 07 08:29:48 crc kubenswrapper[5025]: I1007 08:29:48.602707 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nv4hf"] Oct 07 08:29:48 crc kubenswrapper[5025]: E1007 08:29:48.603693 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="203f4879-a9ab-4883-81cd-3d7099180cf7" containerName="extract-content" Oct 07 08:29:48 crc kubenswrapper[5025]: I1007 08:29:48.603712 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="203f4879-a9ab-4883-81cd-3d7099180cf7" containerName="extract-content" Oct 07 08:29:48 crc kubenswrapper[5025]: E1007 08:29:48.603729 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="203f4879-a9ab-4883-81cd-3d7099180cf7" containerName="extract-utilities" Oct 07 08:29:48 crc 
kubenswrapper[5025]: I1007 08:29:48.603737 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="203f4879-a9ab-4883-81cd-3d7099180cf7" containerName="extract-utilities" Oct 07 08:29:48 crc kubenswrapper[5025]: E1007 08:29:48.603750 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="203f4879-a9ab-4883-81cd-3d7099180cf7" containerName="registry-server" Oct 07 08:29:48 crc kubenswrapper[5025]: I1007 08:29:48.603757 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="203f4879-a9ab-4883-81cd-3d7099180cf7" containerName="registry-server" Oct 07 08:29:48 crc kubenswrapper[5025]: I1007 08:29:48.603994 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="203f4879-a9ab-4883-81cd-3d7099180cf7" containerName="registry-server" Oct 07 08:29:48 crc kubenswrapper[5025]: I1007 08:29:48.605490 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nv4hf" Oct 07 08:29:48 crc kubenswrapper[5025]: I1007 08:29:48.633518 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nv4hf"] Oct 07 08:29:48 crc kubenswrapper[5025]: I1007 08:29:48.728364 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4dbc6271-bcbb-41ab-958e-a06f39a52c85-catalog-content\") pod \"certified-operators-nv4hf\" (UID: \"4dbc6271-bcbb-41ab-958e-a06f39a52c85\") " pod="openshift-marketplace/certified-operators-nv4hf" Oct 07 08:29:48 crc kubenswrapper[5025]: I1007 08:29:48.728649 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxv7s\" (UniqueName: \"kubernetes.io/projected/4dbc6271-bcbb-41ab-958e-a06f39a52c85-kube-api-access-xxv7s\") pod \"certified-operators-nv4hf\" (UID: \"4dbc6271-bcbb-41ab-958e-a06f39a52c85\") " pod="openshift-marketplace/certified-operators-nv4hf" Oct 07 08:29:48 
crc kubenswrapper[5025]: I1007 08:29:48.728682 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4dbc6271-bcbb-41ab-958e-a06f39a52c85-utilities\") pod \"certified-operators-nv4hf\" (UID: \"4dbc6271-bcbb-41ab-958e-a06f39a52c85\") " pod="openshift-marketplace/certified-operators-nv4hf" Oct 07 08:29:48 crc kubenswrapper[5025]: I1007 08:29:48.830393 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4dbc6271-bcbb-41ab-958e-a06f39a52c85-catalog-content\") pod \"certified-operators-nv4hf\" (UID: \"4dbc6271-bcbb-41ab-958e-a06f39a52c85\") " pod="openshift-marketplace/certified-operators-nv4hf" Oct 07 08:29:48 crc kubenswrapper[5025]: I1007 08:29:48.830480 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxv7s\" (UniqueName: \"kubernetes.io/projected/4dbc6271-bcbb-41ab-958e-a06f39a52c85-kube-api-access-xxv7s\") pod \"certified-operators-nv4hf\" (UID: \"4dbc6271-bcbb-41ab-958e-a06f39a52c85\") " pod="openshift-marketplace/certified-operators-nv4hf" Oct 07 08:29:48 crc kubenswrapper[5025]: I1007 08:29:48.830511 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4dbc6271-bcbb-41ab-958e-a06f39a52c85-utilities\") pod \"certified-operators-nv4hf\" (UID: \"4dbc6271-bcbb-41ab-958e-a06f39a52c85\") " pod="openshift-marketplace/certified-operators-nv4hf" Oct 07 08:29:48 crc kubenswrapper[5025]: I1007 08:29:48.831063 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4dbc6271-bcbb-41ab-958e-a06f39a52c85-utilities\") pod \"certified-operators-nv4hf\" (UID: \"4dbc6271-bcbb-41ab-958e-a06f39a52c85\") " pod="openshift-marketplace/certified-operators-nv4hf" Oct 07 08:29:48 crc kubenswrapper[5025]: 
I1007 08:29:48.831278 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4dbc6271-bcbb-41ab-958e-a06f39a52c85-catalog-content\") pod \"certified-operators-nv4hf\" (UID: \"4dbc6271-bcbb-41ab-958e-a06f39a52c85\") " pod="openshift-marketplace/certified-operators-nv4hf" Oct 07 08:29:48 crc kubenswrapper[5025]: I1007 08:29:48.855681 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxv7s\" (UniqueName: \"kubernetes.io/projected/4dbc6271-bcbb-41ab-958e-a06f39a52c85-kube-api-access-xxv7s\") pod \"certified-operators-nv4hf\" (UID: \"4dbc6271-bcbb-41ab-958e-a06f39a52c85\") " pod="openshift-marketplace/certified-operators-nv4hf" Oct 07 08:29:48 crc kubenswrapper[5025]: I1007 08:29:48.934355 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nv4hf" Oct 07 08:29:49 crc kubenswrapper[5025]: I1007 08:29:49.440777 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nv4hf"] Oct 07 08:29:49 crc kubenswrapper[5025]: W1007 08:29:49.455775 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4dbc6271_bcbb_41ab_958e_a06f39a52c85.slice/crio-eeb0879a9219cdf9472466f00db6269e18020311c94ed857ddb048067fae5e7c WatchSource:0}: Error finding container eeb0879a9219cdf9472466f00db6269e18020311c94ed857ddb048067fae5e7c: Status 404 returned error can't find the container with id eeb0879a9219cdf9472466f00db6269e18020311c94ed857ddb048067fae5e7c Oct 07 08:29:49 crc kubenswrapper[5025]: I1007 08:29:49.843640 5025 generic.go:334] "Generic (PLEG): container finished" podID="4dbc6271-bcbb-41ab-958e-a06f39a52c85" containerID="4bf816b1a85ea461633ac3a27a2b3477031c038dccb021724c6b682d06cf6e5b" exitCode=0 Oct 07 08:29:49 crc kubenswrapper[5025]: I1007 08:29:49.843687 5025 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nv4hf" event={"ID":"4dbc6271-bcbb-41ab-958e-a06f39a52c85","Type":"ContainerDied","Data":"4bf816b1a85ea461633ac3a27a2b3477031c038dccb021724c6b682d06cf6e5b"} Oct 07 08:29:49 crc kubenswrapper[5025]: I1007 08:29:49.843963 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nv4hf" event={"ID":"4dbc6271-bcbb-41ab-958e-a06f39a52c85","Type":"ContainerStarted","Data":"eeb0879a9219cdf9472466f00db6269e18020311c94ed857ddb048067fae5e7c"} Oct 07 08:29:50 crc kubenswrapper[5025]: I1007 08:29:50.851433 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nv4hf" event={"ID":"4dbc6271-bcbb-41ab-958e-a06f39a52c85","Type":"ContainerStarted","Data":"a5950fe8886f6b6189783319296d4306f8deab3cd14ecff1a53cb0558d8c76e5"} Oct 07 08:29:51 crc kubenswrapper[5025]: I1007 08:29:51.862371 5025 generic.go:334] "Generic (PLEG): container finished" podID="4dbc6271-bcbb-41ab-958e-a06f39a52c85" containerID="a5950fe8886f6b6189783319296d4306f8deab3cd14ecff1a53cb0558d8c76e5" exitCode=0 Oct 07 08:29:51 crc kubenswrapper[5025]: I1007 08:29:51.862415 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nv4hf" event={"ID":"4dbc6271-bcbb-41ab-958e-a06f39a52c85","Type":"ContainerDied","Data":"a5950fe8886f6b6189783319296d4306f8deab3cd14ecff1a53cb0558d8c76e5"} Oct 07 08:29:52 crc kubenswrapper[5025]: I1007 08:29:52.870253 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nv4hf" event={"ID":"4dbc6271-bcbb-41ab-958e-a06f39a52c85","Type":"ContainerStarted","Data":"d025b520fb21fff91553cac15b4a82c561dc6f2a656826fcb71db02f163ed5f1"} Oct 07 08:29:52 crc kubenswrapper[5025]: I1007 08:29:52.903148 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nv4hf" 
podStartSLOduration=2.475566804 podStartE2EDuration="4.903131952s" podCreationTimestamp="2025-10-07 08:29:48 +0000 UTC" firstStartedPulling="2025-10-07 08:29:49.845111998 +0000 UTC m=+796.654426142" lastFinishedPulling="2025-10-07 08:29:52.272677146 +0000 UTC m=+799.081991290" observedRunningTime="2025-10-07 08:29:52.901888003 +0000 UTC m=+799.711202147" watchObservedRunningTime="2025-10-07 08:29:52.903131952 +0000 UTC m=+799.712446096" Oct 07 08:29:55 crc kubenswrapper[5025]: I1007 08:29:55.934435 5025 patch_prober.go:28] interesting pod/machine-config-daemon-2dj2t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 08:29:55 crc kubenswrapper[5025]: I1007 08:29:55.934504 5025 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 08:29:58 crc kubenswrapper[5025]: I1007 08:29:58.737062 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-68dd66dd7d-xmmpg" Oct 07 08:29:58 crc kubenswrapper[5025]: I1007 08:29:58.934724 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nv4hf" Oct 07 08:29:58 crc kubenswrapper[5025]: I1007 08:29:58.934849 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nv4hf" Oct 07 08:29:58 crc kubenswrapper[5025]: I1007 08:29:58.968051 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nv4hf" Oct 07 08:29:59 crc 
kubenswrapper[5025]: I1007 08:29:59.947002 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nv4hf" Oct 07 08:30:00 crc kubenswrapper[5025]: I1007 08:30:00.133236 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330430-2f22g"] Oct 07 08:30:00 crc kubenswrapper[5025]: I1007 08:30:00.134141 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330430-2f22g" Oct 07 08:30:00 crc kubenswrapper[5025]: I1007 08:30:00.136390 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 07 08:30:00 crc kubenswrapper[5025]: I1007 08:30:00.137241 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 07 08:30:00 crc kubenswrapper[5025]: I1007 08:30:00.148079 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330430-2f22g"] Oct 07 08:30:00 crc kubenswrapper[5025]: I1007 08:30:00.285659 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82x68\" (UniqueName: \"kubernetes.io/projected/001b1491-5b33-47a0-a6ba-6d982c0df2da-kube-api-access-82x68\") pod \"collect-profiles-29330430-2f22g\" (UID: \"001b1491-5b33-47a0-a6ba-6d982c0df2da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330430-2f22g" Oct 07 08:30:00 crc kubenswrapper[5025]: I1007 08:30:00.285748 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/001b1491-5b33-47a0-a6ba-6d982c0df2da-config-volume\") pod \"collect-profiles-29330430-2f22g\" (UID: \"001b1491-5b33-47a0-a6ba-6d982c0df2da\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29330430-2f22g" Oct 07 08:30:00 crc kubenswrapper[5025]: I1007 08:30:00.285825 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/001b1491-5b33-47a0-a6ba-6d982c0df2da-secret-volume\") pod \"collect-profiles-29330430-2f22g\" (UID: \"001b1491-5b33-47a0-a6ba-6d982c0df2da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330430-2f22g" Oct 07 08:30:00 crc kubenswrapper[5025]: I1007 08:30:00.387361 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82x68\" (UniqueName: \"kubernetes.io/projected/001b1491-5b33-47a0-a6ba-6d982c0df2da-kube-api-access-82x68\") pod \"collect-profiles-29330430-2f22g\" (UID: \"001b1491-5b33-47a0-a6ba-6d982c0df2da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330430-2f22g" Oct 07 08:30:00 crc kubenswrapper[5025]: I1007 08:30:00.387429 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/001b1491-5b33-47a0-a6ba-6d982c0df2da-config-volume\") pod \"collect-profiles-29330430-2f22g\" (UID: \"001b1491-5b33-47a0-a6ba-6d982c0df2da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330430-2f22g" Oct 07 08:30:00 crc kubenswrapper[5025]: I1007 08:30:00.387461 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/001b1491-5b33-47a0-a6ba-6d982c0df2da-secret-volume\") pod \"collect-profiles-29330430-2f22g\" (UID: \"001b1491-5b33-47a0-a6ba-6d982c0df2da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330430-2f22g" Oct 07 08:30:00 crc kubenswrapper[5025]: I1007 08:30:00.388336 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/001b1491-5b33-47a0-a6ba-6d982c0df2da-config-volume\") pod \"collect-profiles-29330430-2f22g\" (UID: \"001b1491-5b33-47a0-a6ba-6d982c0df2da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330430-2f22g" Oct 07 08:30:00 crc kubenswrapper[5025]: I1007 08:30:00.398003 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/001b1491-5b33-47a0-a6ba-6d982c0df2da-secret-volume\") pod \"collect-profiles-29330430-2f22g\" (UID: \"001b1491-5b33-47a0-a6ba-6d982c0df2da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330430-2f22g" Oct 07 08:30:00 crc kubenswrapper[5025]: I1007 08:30:00.404830 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82x68\" (UniqueName: \"kubernetes.io/projected/001b1491-5b33-47a0-a6ba-6d982c0df2da-kube-api-access-82x68\") pod \"collect-profiles-29330430-2f22g\" (UID: \"001b1491-5b33-47a0-a6ba-6d982c0df2da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330430-2f22g" Oct 07 08:30:00 crc kubenswrapper[5025]: I1007 08:30:00.448154 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330430-2f22g" Oct 07 08:30:00 crc kubenswrapper[5025]: I1007 08:30:00.793597 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nv4hf"] Oct 07 08:30:00 crc kubenswrapper[5025]: I1007 08:30:00.864206 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330430-2f22g"] Oct 07 08:30:00 crc kubenswrapper[5025]: I1007 08:30:00.919221 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330430-2f22g" event={"ID":"001b1491-5b33-47a0-a6ba-6d982c0df2da","Type":"ContainerStarted","Data":"38341b903d7ebafc1a7fbd2f1e049af0037c7a32df8d1d5ed211a850c50adeb7"} Oct 07 08:30:01 crc kubenswrapper[5025]: I1007 08:30:01.947428 5025 generic.go:334] "Generic (PLEG): container finished" podID="001b1491-5b33-47a0-a6ba-6d982c0df2da" containerID="36789b3c4f8d3fb913309244bd63d467b9dd4f3e0919cede357288051041315b" exitCode=0 Oct 07 08:30:01 crc kubenswrapper[5025]: I1007 08:30:01.947955 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nv4hf" podUID="4dbc6271-bcbb-41ab-958e-a06f39a52c85" containerName="registry-server" containerID="cri-o://d025b520fb21fff91553cac15b4a82c561dc6f2a656826fcb71db02f163ed5f1" gracePeriod=2 Oct 07 08:30:01 crc kubenswrapper[5025]: I1007 08:30:01.948420 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330430-2f22g" event={"ID":"001b1491-5b33-47a0-a6ba-6d982c0df2da","Type":"ContainerDied","Data":"36789b3c4f8d3fb913309244bd63d467b9dd4f3e0919cede357288051041315b"} Oct 07 08:30:02 crc kubenswrapper[5025]: I1007 08:30:02.836903 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nv4hf" Oct 07 08:30:02 crc kubenswrapper[5025]: I1007 08:30:02.956217 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nv4hf" Oct 07 08:30:02 crc kubenswrapper[5025]: I1007 08:30:02.956264 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nv4hf" event={"ID":"4dbc6271-bcbb-41ab-958e-a06f39a52c85","Type":"ContainerDied","Data":"d025b520fb21fff91553cac15b4a82c561dc6f2a656826fcb71db02f163ed5f1"} Oct 07 08:30:02 crc kubenswrapper[5025]: I1007 08:30:02.956842 5025 scope.go:117] "RemoveContainer" containerID="d025b520fb21fff91553cac15b4a82c561dc6f2a656826fcb71db02f163ed5f1" Oct 07 08:30:02 crc kubenswrapper[5025]: I1007 08:30:02.956122 5025 generic.go:334] "Generic (PLEG): container finished" podID="4dbc6271-bcbb-41ab-958e-a06f39a52c85" containerID="d025b520fb21fff91553cac15b4a82c561dc6f2a656826fcb71db02f163ed5f1" exitCode=0 Oct 07 08:30:02 crc kubenswrapper[5025]: I1007 08:30:02.957373 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nv4hf" event={"ID":"4dbc6271-bcbb-41ab-958e-a06f39a52c85","Type":"ContainerDied","Data":"eeb0879a9219cdf9472466f00db6269e18020311c94ed857ddb048067fae5e7c"} Oct 07 08:30:02 crc kubenswrapper[5025]: I1007 08:30:02.976626 5025 scope.go:117] "RemoveContainer" containerID="a5950fe8886f6b6189783319296d4306f8deab3cd14ecff1a53cb0558d8c76e5" Oct 07 08:30:02 crc kubenswrapper[5025]: I1007 08:30:02.998811 5025 scope.go:117] "RemoveContainer" containerID="4bf816b1a85ea461633ac3a27a2b3477031c038dccb021724c6b682d06cf6e5b" Oct 07 08:30:03 crc kubenswrapper[5025]: I1007 08:30:03.021088 5025 scope.go:117] "RemoveContainer" containerID="d025b520fb21fff91553cac15b4a82c561dc6f2a656826fcb71db02f163ed5f1" Oct 07 08:30:03 crc kubenswrapper[5025]: E1007 08:30:03.022397 5025 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d025b520fb21fff91553cac15b4a82c561dc6f2a656826fcb71db02f163ed5f1\": container with ID starting with d025b520fb21fff91553cac15b4a82c561dc6f2a656826fcb71db02f163ed5f1 not found: ID does not exist" containerID="d025b520fb21fff91553cac15b4a82c561dc6f2a656826fcb71db02f163ed5f1" Oct 07 08:30:03 crc kubenswrapper[5025]: I1007 08:30:03.022447 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d025b520fb21fff91553cac15b4a82c561dc6f2a656826fcb71db02f163ed5f1"} err="failed to get container status \"d025b520fb21fff91553cac15b4a82c561dc6f2a656826fcb71db02f163ed5f1\": rpc error: code = NotFound desc = could not find container \"d025b520fb21fff91553cac15b4a82c561dc6f2a656826fcb71db02f163ed5f1\": container with ID starting with d025b520fb21fff91553cac15b4a82c561dc6f2a656826fcb71db02f163ed5f1 not found: ID does not exist" Oct 07 08:30:03 crc kubenswrapper[5025]: I1007 08:30:03.022477 5025 scope.go:117] "RemoveContainer" containerID="a5950fe8886f6b6189783319296d4306f8deab3cd14ecff1a53cb0558d8c76e5" Oct 07 08:30:03 crc kubenswrapper[5025]: E1007 08:30:03.022866 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5950fe8886f6b6189783319296d4306f8deab3cd14ecff1a53cb0558d8c76e5\": container with ID starting with a5950fe8886f6b6189783319296d4306f8deab3cd14ecff1a53cb0558d8c76e5 not found: ID does not exist" containerID="a5950fe8886f6b6189783319296d4306f8deab3cd14ecff1a53cb0558d8c76e5" Oct 07 08:30:03 crc kubenswrapper[5025]: I1007 08:30:03.022889 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5950fe8886f6b6189783319296d4306f8deab3cd14ecff1a53cb0558d8c76e5"} err="failed to get container status \"a5950fe8886f6b6189783319296d4306f8deab3cd14ecff1a53cb0558d8c76e5\": rpc error: code = NotFound desc = could not find container 
\"a5950fe8886f6b6189783319296d4306f8deab3cd14ecff1a53cb0558d8c76e5\": container with ID starting with a5950fe8886f6b6189783319296d4306f8deab3cd14ecff1a53cb0558d8c76e5 not found: ID does not exist" Oct 07 08:30:03 crc kubenswrapper[5025]: I1007 08:30:03.022904 5025 scope.go:117] "RemoveContainer" containerID="4bf816b1a85ea461633ac3a27a2b3477031c038dccb021724c6b682d06cf6e5b" Oct 07 08:30:03 crc kubenswrapper[5025]: E1007 08:30:03.023089 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bf816b1a85ea461633ac3a27a2b3477031c038dccb021724c6b682d06cf6e5b\": container with ID starting with 4bf816b1a85ea461633ac3a27a2b3477031c038dccb021724c6b682d06cf6e5b not found: ID does not exist" containerID="4bf816b1a85ea461633ac3a27a2b3477031c038dccb021724c6b682d06cf6e5b" Oct 07 08:30:03 crc kubenswrapper[5025]: I1007 08:30:03.023107 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bf816b1a85ea461633ac3a27a2b3477031c038dccb021724c6b682d06cf6e5b"} err="failed to get container status \"4bf816b1a85ea461633ac3a27a2b3477031c038dccb021724c6b682d06cf6e5b\": rpc error: code = NotFound desc = could not find container \"4bf816b1a85ea461633ac3a27a2b3477031c038dccb021724c6b682d06cf6e5b\": container with ID starting with 4bf816b1a85ea461633ac3a27a2b3477031c038dccb021724c6b682d06cf6e5b not found: ID does not exist" Oct 07 08:30:03 crc kubenswrapper[5025]: I1007 08:30:03.031065 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4dbc6271-bcbb-41ab-958e-a06f39a52c85-catalog-content\") pod \"4dbc6271-bcbb-41ab-958e-a06f39a52c85\" (UID: \"4dbc6271-bcbb-41ab-958e-a06f39a52c85\") " Oct 07 08:30:03 crc kubenswrapper[5025]: I1007 08:30:03.031147 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxv7s\" (UniqueName: 
\"kubernetes.io/projected/4dbc6271-bcbb-41ab-958e-a06f39a52c85-kube-api-access-xxv7s\") pod \"4dbc6271-bcbb-41ab-958e-a06f39a52c85\" (UID: \"4dbc6271-bcbb-41ab-958e-a06f39a52c85\") " Oct 07 08:30:03 crc kubenswrapper[5025]: I1007 08:30:03.031221 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4dbc6271-bcbb-41ab-958e-a06f39a52c85-utilities\") pod \"4dbc6271-bcbb-41ab-958e-a06f39a52c85\" (UID: \"4dbc6271-bcbb-41ab-958e-a06f39a52c85\") " Oct 07 08:30:03 crc kubenswrapper[5025]: I1007 08:30:03.033308 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4dbc6271-bcbb-41ab-958e-a06f39a52c85-utilities" (OuterVolumeSpecName: "utilities") pod "4dbc6271-bcbb-41ab-958e-a06f39a52c85" (UID: "4dbc6271-bcbb-41ab-958e-a06f39a52c85"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:30:03 crc kubenswrapper[5025]: I1007 08:30:03.039496 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dbc6271-bcbb-41ab-958e-a06f39a52c85-kube-api-access-xxv7s" (OuterVolumeSpecName: "kube-api-access-xxv7s") pod "4dbc6271-bcbb-41ab-958e-a06f39a52c85" (UID: "4dbc6271-bcbb-41ab-958e-a06f39a52c85"). InnerVolumeSpecName "kube-api-access-xxv7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:30:03 crc kubenswrapper[5025]: I1007 08:30:03.081565 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4dbc6271-bcbb-41ab-958e-a06f39a52c85-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4dbc6271-bcbb-41ab-958e-a06f39a52c85" (UID: "4dbc6271-bcbb-41ab-958e-a06f39a52c85"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:30:03 crc kubenswrapper[5025]: I1007 08:30:03.132776 5025 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4dbc6271-bcbb-41ab-958e-a06f39a52c85-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 08:30:03 crc kubenswrapper[5025]: I1007 08:30:03.132808 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxv7s\" (UniqueName: \"kubernetes.io/projected/4dbc6271-bcbb-41ab-958e-a06f39a52c85-kube-api-access-xxv7s\") on node \"crc\" DevicePath \"\"" Oct 07 08:30:03 crc kubenswrapper[5025]: I1007 08:30:03.132819 5025 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4dbc6271-bcbb-41ab-958e-a06f39a52c85-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 08:30:03 crc kubenswrapper[5025]: I1007 08:30:03.167565 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330430-2f22g" Oct 07 08:30:03 crc kubenswrapper[5025]: I1007 08:30:03.286605 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nv4hf"] Oct 07 08:30:03 crc kubenswrapper[5025]: I1007 08:30:03.289490 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nv4hf"] Oct 07 08:30:03 crc kubenswrapper[5025]: I1007 08:30:03.335596 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/001b1491-5b33-47a0-a6ba-6d982c0df2da-config-volume\") pod \"001b1491-5b33-47a0-a6ba-6d982c0df2da\" (UID: \"001b1491-5b33-47a0-a6ba-6d982c0df2da\") " Oct 07 08:30:03 crc kubenswrapper[5025]: I1007 08:30:03.335666 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/001b1491-5b33-47a0-a6ba-6d982c0df2da-secret-volume\") pod \"001b1491-5b33-47a0-a6ba-6d982c0df2da\" (UID: \"001b1491-5b33-47a0-a6ba-6d982c0df2da\") " Oct 07 08:30:03 crc kubenswrapper[5025]: I1007 08:30:03.335739 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82x68\" (UniqueName: \"kubernetes.io/projected/001b1491-5b33-47a0-a6ba-6d982c0df2da-kube-api-access-82x68\") pod \"001b1491-5b33-47a0-a6ba-6d982c0df2da\" (UID: \"001b1491-5b33-47a0-a6ba-6d982c0df2da\") " Oct 07 08:30:03 crc kubenswrapper[5025]: I1007 08:30:03.336025 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/001b1491-5b33-47a0-a6ba-6d982c0df2da-config-volume" (OuterVolumeSpecName: "config-volume") pod "001b1491-5b33-47a0-a6ba-6d982c0df2da" (UID: "001b1491-5b33-47a0-a6ba-6d982c0df2da"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:30:03 crc kubenswrapper[5025]: I1007 08:30:03.339789 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/001b1491-5b33-47a0-a6ba-6d982c0df2da-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "001b1491-5b33-47a0-a6ba-6d982c0df2da" (UID: "001b1491-5b33-47a0-a6ba-6d982c0df2da"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:30:03 crc kubenswrapper[5025]: I1007 08:30:03.341241 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/001b1491-5b33-47a0-a6ba-6d982c0df2da-kube-api-access-82x68" (OuterVolumeSpecName: "kube-api-access-82x68") pod "001b1491-5b33-47a0-a6ba-6d982c0df2da" (UID: "001b1491-5b33-47a0-a6ba-6d982c0df2da"). InnerVolumeSpecName "kube-api-access-82x68". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:30:03 crc kubenswrapper[5025]: I1007 08:30:03.437755 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82x68\" (UniqueName: \"kubernetes.io/projected/001b1491-5b33-47a0-a6ba-6d982c0df2da-kube-api-access-82x68\") on node \"crc\" DevicePath \"\"" Oct 07 08:30:03 crc kubenswrapper[5025]: I1007 08:30:03.437803 5025 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/001b1491-5b33-47a0-a6ba-6d982c0df2da-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 08:30:03 crc kubenswrapper[5025]: I1007 08:30:03.437824 5025 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/001b1491-5b33-47a0-a6ba-6d982c0df2da-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 07 08:30:03 crc kubenswrapper[5025]: I1007 08:30:03.924697 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dbc6271-bcbb-41ab-958e-a06f39a52c85" path="/var/lib/kubelet/pods/4dbc6271-bcbb-41ab-958e-a06f39a52c85/volumes" Oct 07 08:30:03 crc kubenswrapper[5025]: I1007 08:30:03.965643 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330430-2f22g" event={"ID":"001b1491-5b33-47a0-a6ba-6d982c0df2da","Type":"ContainerDied","Data":"38341b903d7ebafc1a7fbd2f1e049af0037c7a32df8d1d5ed211a850c50adeb7"} Oct 07 08:30:03 crc kubenswrapper[5025]: I1007 08:30:03.965686 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38341b903d7ebafc1a7fbd2f1e049af0037c7a32df8d1d5ed211a850c50adeb7" Oct 07 08:30:03 crc kubenswrapper[5025]: I1007 08:30:03.965711 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330430-2f22g" Oct 07 08:30:06 crc kubenswrapper[5025]: I1007 08:30:06.414403 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pk9z4"] Oct 07 08:30:06 crc kubenswrapper[5025]: E1007 08:30:06.415354 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="001b1491-5b33-47a0-a6ba-6d982c0df2da" containerName="collect-profiles" Oct 07 08:30:06 crc kubenswrapper[5025]: I1007 08:30:06.415378 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="001b1491-5b33-47a0-a6ba-6d982c0df2da" containerName="collect-profiles" Oct 07 08:30:06 crc kubenswrapper[5025]: E1007 08:30:06.415418 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dbc6271-bcbb-41ab-958e-a06f39a52c85" containerName="extract-utilities" Oct 07 08:30:06 crc kubenswrapper[5025]: I1007 08:30:06.415430 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dbc6271-bcbb-41ab-958e-a06f39a52c85" containerName="extract-utilities" Oct 07 08:30:06 crc kubenswrapper[5025]: E1007 08:30:06.415446 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dbc6271-bcbb-41ab-958e-a06f39a52c85" containerName="extract-content" Oct 07 08:30:06 crc kubenswrapper[5025]: I1007 08:30:06.415459 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dbc6271-bcbb-41ab-958e-a06f39a52c85" containerName="extract-content" Oct 07 08:30:06 crc kubenswrapper[5025]: E1007 08:30:06.415475 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dbc6271-bcbb-41ab-958e-a06f39a52c85" containerName="registry-server" Oct 07 08:30:06 crc kubenswrapper[5025]: I1007 08:30:06.415487 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dbc6271-bcbb-41ab-958e-a06f39a52c85" containerName="registry-server" Oct 07 08:30:06 crc kubenswrapper[5025]: I1007 08:30:06.415695 5025 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4dbc6271-bcbb-41ab-958e-a06f39a52c85" containerName="registry-server" Oct 07 08:30:06 crc kubenswrapper[5025]: I1007 08:30:06.415714 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="001b1491-5b33-47a0-a6ba-6d982c0df2da" containerName="collect-profiles" Oct 07 08:30:06 crc kubenswrapper[5025]: I1007 08:30:06.417235 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pk9z4" Oct 07 08:30:06 crc kubenswrapper[5025]: I1007 08:30:06.432401 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pk9z4"] Oct 07 08:30:06 crc kubenswrapper[5025]: I1007 08:30:06.484111 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73118700-850f-4703-aca2-bc3b208b6f46-catalog-content\") pod \"redhat-marketplace-pk9z4\" (UID: \"73118700-850f-4703-aca2-bc3b208b6f46\") " pod="openshift-marketplace/redhat-marketplace-pk9z4" Oct 07 08:30:06 crc kubenswrapper[5025]: I1007 08:30:06.484434 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73118700-850f-4703-aca2-bc3b208b6f46-utilities\") pod \"redhat-marketplace-pk9z4\" (UID: \"73118700-850f-4703-aca2-bc3b208b6f46\") " pod="openshift-marketplace/redhat-marketplace-pk9z4" Oct 07 08:30:06 crc kubenswrapper[5025]: I1007 08:30:06.484520 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfm7c\" (UniqueName: \"kubernetes.io/projected/73118700-850f-4703-aca2-bc3b208b6f46-kube-api-access-kfm7c\") pod \"redhat-marketplace-pk9z4\" (UID: \"73118700-850f-4703-aca2-bc3b208b6f46\") " pod="openshift-marketplace/redhat-marketplace-pk9z4" Oct 07 08:30:06 crc kubenswrapper[5025]: I1007 08:30:06.586596 5025 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73118700-850f-4703-aca2-bc3b208b6f46-utilities\") pod \"redhat-marketplace-pk9z4\" (UID: \"73118700-850f-4703-aca2-bc3b208b6f46\") " pod="openshift-marketplace/redhat-marketplace-pk9z4" Oct 07 08:30:06 crc kubenswrapper[5025]: I1007 08:30:06.586664 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfm7c\" (UniqueName: \"kubernetes.io/projected/73118700-850f-4703-aca2-bc3b208b6f46-kube-api-access-kfm7c\") pod \"redhat-marketplace-pk9z4\" (UID: \"73118700-850f-4703-aca2-bc3b208b6f46\") " pod="openshift-marketplace/redhat-marketplace-pk9z4" Oct 07 08:30:06 crc kubenswrapper[5025]: I1007 08:30:06.586692 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73118700-850f-4703-aca2-bc3b208b6f46-catalog-content\") pod \"redhat-marketplace-pk9z4\" (UID: \"73118700-850f-4703-aca2-bc3b208b6f46\") " pod="openshift-marketplace/redhat-marketplace-pk9z4" Oct 07 08:30:06 crc kubenswrapper[5025]: I1007 08:30:06.587121 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73118700-850f-4703-aca2-bc3b208b6f46-catalog-content\") pod \"redhat-marketplace-pk9z4\" (UID: \"73118700-850f-4703-aca2-bc3b208b6f46\") " pod="openshift-marketplace/redhat-marketplace-pk9z4" Oct 07 08:30:06 crc kubenswrapper[5025]: I1007 08:30:06.587252 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73118700-850f-4703-aca2-bc3b208b6f46-utilities\") pod \"redhat-marketplace-pk9z4\" (UID: \"73118700-850f-4703-aca2-bc3b208b6f46\") " pod="openshift-marketplace/redhat-marketplace-pk9z4" Oct 07 08:30:06 crc kubenswrapper[5025]: I1007 08:30:06.611810 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-kfm7c\" (UniqueName: \"kubernetes.io/projected/73118700-850f-4703-aca2-bc3b208b6f46-kube-api-access-kfm7c\") pod \"redhat-marketplace-pk9z4\" (UID: \"73118700-850f-4703-aca2-bc3b208b6f46\") " pod="openshift-marketplace/redhat-marketplace-pk9z4" Oct 07 08:30:06 crc kubenswrapper[5025]: I1007 08:30:06.774262 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pk9z4" Oct 07 08:30:06 crc kubenswrapper[5025]: I1007 08:30:06.975409 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pk9z4"] Oct 07 08:30:07 crc kubenswrapper[5025]: I1007 08:30:07.989693 5025 generic.go:334] "Generic (PLEG): container finished" podID="73118700-850f-4703-aca2-bc3b208b6f46" containerID="050ffdbf11a5b7ea5676352eeba318b3f0b6999a133c04346d71d4270ee947dc" exitCode=0 Oct 07 08:30:07 crc kubenswrapper[5025]: I1007 08:30:07.990014 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pk9z4" event={"ID":"73118700-850f-4703-aca2-bc3b208b6f46","Type":"ContainerDied","Data":"050ffdbf11a5b7ea5676352eeba318b3f0b6999a133c04346d71d4270ee947dc"} Oct 07 08:30:07 crc kubenswrapper[5025]: I1007 08:30:07.990045 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pk9z4" event={"ID":"73118700-850f-4703-aca2-bc3b208b6f46","Type":"ContainerStarted","Data":"74d4aeb870f643f1f5683edb7c18b1fdb5e4304fe75cc40f64c0b39c2c14c9b2"} Oct 07 08:30:10 crc kubenswrapper[5025]: I1007 08:30:10.007925 5025 generic.go:334] "Generic (PLEG): container finished" podID="73118700-850f-4703-aca2-bc3b208b6f46" containerID="4c0b77ce4f33311505c8f55ccf8a38e8c88f1b15b29f29dde3deafbe66822323" exitCode=0 Oct 07 08:30:10 crc kubenswrapper[5025]: I1007 08:30:10.007990 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pk9z4" 
event={"ID":"73118700-850f-4703-aca2-bc3b208b6f46","Type":"ContainerDied","Data":"4c0b77ce4f33311505c8f55ccf8a38e8c88f1b15b29f29dde3deafbe66822323"} Oct 07 08:30:11 crc kubenswrapper[5025]: I1007 08:30:11.015425 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pk9z4" event={"ID":"73118700-850f-4703-aca2-bc3b208b6f46","Type":"ContainerStarted","Data":"4968ca38d8b02e0567a1a17c0a348708dfcced17f8c663fee7c687337d58e493"} Oct 07 08:30:11 crc kubenswrapper[5025]: I1007 08:30:11.034362 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pk9z4" podStartSLOduration=2.513431132 podStartE2EDuration="5.034341128s" podCreationTimestamp="2025-10-07 08:30:06 +0000 UTC" firstStartedPulling="2025-10-07 08:30:07.991369961 +0000 UTC m=+814.800684115" lastFinishedPulling="2025-10-07 08:30:10.512279967 +0000 UTC m=+817.321594111" observedRunningTime="2025-10-07 08:30:11.031830628 +0000 UTC m=+817.841144802" watchObservedRunningTime="2025-10-07 08:30:11.034341128 +0000 UTC m=+817.843655262" Oct 07 08:30:16 crc kubenswrapper[5025]: I1007 08:30:16.774655 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pk9z4" Oct 07 08:30:16 crc kubenswrapper[5025]: I1007 08:30:16.775211 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pk9z4" Oct 07 08:30:16 crc kubenswrapper[5025]: I1007 08:30:16.834517 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pk9z4" Oct 07 08:30:17 crc kubenswrapper[5025]: I1007 08:30:17.094659 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pk9z4" Oct 07 08:30:17 crc kubenswrapper[5025]: I1007 08:30:17.136246 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-pk9z4"] Oct 07 08:30:18 crc kubenswrapper[5025]: I1007 08:30:18.371974 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-668569d786-l8897" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.044650 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-48bhp"] Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.045929 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-48bhp" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.047411 5025 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-gg6pw" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.047587 5025 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.047956 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-6z6lt"] Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.050278 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-6z6lt" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.053115 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.053251 5025 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.061554 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-48bhp"] Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.066138 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pk9z4" podUID="73118700-850f-4703-aca2-bc3b208b6f46" containerName="registry-server" containerID="cri-o://4968ca38d8b02e0567a1a17c0a348708dfcced17f8c663fee7c687337d58e493" gracePeriod=2 Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.076091 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6dkq\" (UniqueName: \"kubernetes.io/projected/027820a1-f099-4452-960d-b9d33d3eb48f-kube-api-access-p6dkq\") pod \"frr-k8s-webhook-server-64bf5d555-48bhp\" (UID: \"027820a1-f099-4452-960d-b9d33d3eb48f\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-48bhp" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.076412 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/027820a1-f099-4452-960d-b9d33d3eb48f-cert\") pod \"frr-k8s-webhook-server-64bf5d555-48bhp\" (UID: \"027820a1-f099-4452-960d-b9d33d3eb48f\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-48bhp" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.076524 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/5a2e702e-d565-4f63-a7cf-21465cf8d4fa-metrics-certs\") pod \"frr-k8s-6z6lt\" (UID: \"5a2e702e-d565-4f63-a7cf-21465cf8d4fa\") " pod="metallb-system/frr-k8s-6z6lt" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.076627 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2cjh\" (UniqueName: \"kubernetes.io/projected/5a2e702e-d565-4f63-a7cf-21465cf8d4fa-kube-api-access-v2cjh\") pod \"frr-k8s-6z6lt\" (UID: \"5a2e702e-d565-4f63-a7cf-21465cf8d4fa\") " pod="metallb-system/frr-k8s-6z6lt" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.076727 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/5a2e702e-d565-4f63-a7cf-21465cf8d4fa-frr-sockets\") pod \"frr-k8s-6z6lt\" (UID: \"5a2e702e-d565-4f63-a7cf-21465cf8d4fa\") " pod="metallb-system/frr-k8s-6z6lt" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.076836 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/5a2e702e-d565-4f63-a7cf-21465cf8d4fa-reloader\") pod \"frr-k8s-6z6lt\" (UID: \"5a2e702e-d565-4f63-a7cf-21465cf8d4fa\") " pod="metallb-system/frr-k8s-6z6lt" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.076919 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/5a2e702e-d565-4f63-a7cf-21465cf8d4fa-frr-conf\") pod \"frr-k8s-6z6lt\" (UID: \"5a2e702e-d565-4f63-a7cf-21465cf8d4fa\") " pod="metallb-system/frr-k8s-6z6lt" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.076996 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/5a2e702e-d565-4f63-a7cf-21465cf8d4fa-metrics\") pod \"frr-k8s-6z6lt\" (UID: 
\"5a2e702e-d565-4f63-a7cf-21465cf8d4fa\") " pod="metallb-system/frr-k8s-6z6lt" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.077110 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/5a2e702e-d565-4f63-a7cf-21465cf8d4fa-frr-startup\") pod \"frr-k8s-6z6lt\" (UID: \"5a2e702e-d565-4f63-a7cf-21465cf8d4fa\") " pod="metallb-system/frr-k8s-6z6lt" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.137520 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-mmddh"] Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.138590 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-mmddh" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.141083 5025 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.141308 5025 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.141607 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.141642 5025 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-4pz4m" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.154227 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-7h8g2"] Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.155104 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-7h8g2" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.156852 5025 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.170624 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-7h8g2"] Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.185716 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6dkq\" (UniqueName: \"kubernetes.io/projected/027820a1-f099-4452-960d-b9d33d3eb48f-kube-api-access-p6dkq\") pod \"frr-k8s-webhook-server-64bf5d555-48bhp\" (UID: \"027820a1-f099-4452-960d-b9d33d3eb48f\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-48bhp" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.185774 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/027820a1-f099-4452-960d-b9d33d3eb48f-cert\") pod \"frr-k8s-webhook-server-64bf5d555-48bhp\" (UID: \"027820a1-f099-4452-960d-b9d33d3eb48f\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-48bhp" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.185811 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5a2e702e-d565-4f63-a7cf-21465cf8d4fa-metrics-certs\") pod \"frr-k8s-6z6lt\" (UID: \"5a2e702e-d565-4f63-a7cf-21465cf8d4fa\") " pod="metallb-system/frr-k8s-6z6lt" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.185837 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2cjh\" (UniqueName: \"kubernetes.io/projected/5a2e702e-d565-4f63-a7cf-21465cf8d4fa-kube-api-access-v2cjh\") pod \"frr-k8s-6z6lt\" (UID: \"5a2e702e-d565-4f63-a7cf-21465cf8d4fa\") " pod="metallb-system/frr-k8s-6z6lt" Oct 07 08:30:19 crc 
kubenswrapper[5025]: I1007 08:30:19.185875 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/5a2e702e-d565-4f63-a7cf-21465cf8d4fa-frr-sockets\") pod \"frr-k8s-6z6lt\" (UID: \"5a2e702e-d565-4f63-a7cf-21465cf8d4fa\") " pod="metallb-system/frr-k8s-6z6lt" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.185894 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/5a2e702e-d565-4f63-a7cf-21465cf8d4fa-reloader\") pod \"frr-k8s-6z6lt\" (UID: \"5a2e702e-d565-4f63-a7cf-21465cf8d4fa\") " pod="metallb-system/frr-k8s-6z6lt" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.185920 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/5a2e702e-d565-4f63-a7cf-21465cf8d4fa-frr-conf\") pod \"frr-k8s-6z6lt\" (UID: \"5a2e702e-d565-4f63-a7cf-21465cf8d4fa\") " pod="metallb-system/frr-k8s-6z6lt" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.185939 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/5a2e702e-d565-4f63-a7cf-21465cf8d4fa-metrics\") pod \"frr-k8s-6z6lt\" (UID: \"5a2e702e-d565-4f63-a7cf-21465cf8d4fa\") " pod="metallb-system/frr-k8s-6z6lt" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.185981 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/5a2e702e-d565-4f63-a7cf-21465cf8d4fa-frr-startup\") pod \"frr-k8s-6z6lt\" (UID: \"5a2e702e-d565-4f63-a7cf-21465cf8d4fa\") " pod="metallb-system/frr-k8s-6z6lt" Oct 07 08:30:19 crc kubenswrapper[5025]: E1007 08:30:19.186172 5025 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Oct 07 08:30:19 crc kubenswrapper[5025]: E1007 
08:30:19.186221 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/027820a1-f099-4452-960d-b9d33d3eb48f-cert podName:027820a1-f099-4452-960d-b9d33d3eb48f nodeName:}" failed. No retries permitted until 2025-10-07 08:30:19.686202821 +0000 UTC m=+826.495516965 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/027820a1-f099-4452-960d-b9d33d3eb48f-cert") pod "frr-k8s-webhook-server-64bf5d555-48bhp" (UID: "027820a1-f099-4452-960d-b9d33d3eb48f") : secret "frr-k8s-webhook-server-cert" not found Oct 07 08:30:19 crc kubenswrapper[5025]: E1007 08:30:19.186347 5025 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Oct 07 08:30:19 crc kubenswrapper[5025]: E1007 08:30:19.186426 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a2e702e-d565-4f63-a7cf-21465cf8d4fa-metrics-certs podName:5a2e702e-d565-4f63-a7cf-21465cf8d4fa nodeName:}" failed. No retries permitted until 2025-10-07 08:30:19.686402308 +0000 UTC m=+826.495716522 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5a2e702e-d565-4f63-a7cf-21465cf8d4fa-metrics-certs") pod "frr-k8s-6z6lt" (UID: "5a2e702e-d565-4f63-a7cf-21465cf8d4fa") : secret "frr-k8s-certs-secret" not found Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.186677 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/5a2e702e-d565-4f63-a7cf-21465cf8d4fa-reloader\") pod \"frr-k8s-6z6lt\" (UID: \"5a2e702e-d565-4f63-a7cf-21465cf8d4fa\") " pod="metallb-system/frr-k8s-6z6lt" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.186949 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/5a2e702e-d565-4f63-a7cf-21465cf8d4fa-frr-startup\") pod \"frr-k8s-6z6lt\" (UID: \"5a2e702e-d565-4f63-a7cf-21465cf8d4fa\") " pod="metallb-system/frr-k8s-6z6lt" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.187016 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/5a2e702e-d565-4f63-a7cf-21465cf8d4fa-metrics\") pod \"frr-k8s-6z6lt\" (UID: \"5a2e702e-d565-4f63-a7cf-21465cf8d4fa\") " pod="metallb-system/frr-k8s-6z6lt" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.187190 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/5a2e702e-d565-4f63-a7cf-21465cf8d4fa-frr-sockets\") pod \"frr-k8s-6z6lt\" (UID: \"5a2e702e-d565-4f63-a7cf-21465cf8d4fa\") " pod="metallb-system/frr-k8s-6z6lt" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.187840 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/5a2e702e-d565-4f63-a7cf-21465cf8d4fa-frr-conf\") pod \"frr-k8s-6z6lt\" (UID: \"5a2e702e-d565-4f63-a7cf-21465cf8d4fa\") " pod="metallb-system/frr-k8s-6z6lt" Oct 07 
08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.206739 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2cjh\" (UniqueName: \"kubernetes.io/projected/5a2e702e-d565-4f63-a7cf-21465cf8d4fa-kube-api-access-v2cjh\") pod \"frr-k8s-6z6lt\" (UID: \"5a2e702e-d565-4f63-a7cf-21465cf8d4fa\") " pod="metallb-system/frr-k8s-6z6lt" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.216410 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6dkq\" (UniqueName: \"kubernetes.io/projected/027820a1-f099-4452-960d-b9d33d3eb48f-kube-api-access-p6dkq\") pod \"frr-k8s-webhook-server-64bf5d555-48bhp\" (UID: \"027820a1-f099-4452-960d-b9d33d3eb48f\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-48bhp" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.287344 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxrjx\" (UniqueName: \"kubernetes.io/projected/28a4236b-41fc-4aba-8592-0b055eff1685-kube-api-access-lxrjx\") pod \"speaker-mmddh\" (UID: \"28a4236b-41fc-4aba-8592-0b055eff1685\") " pod="metallb-system/speaker-mmddh" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.287487 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/28a4236b-41fc-4aba-8592-0b055eff1685-metallb-excludel2\") pod \"speaker-mmddh\" (UID: \"28a4236b-41fc-4aba-8592-0b055eff1685\") " pod="metallb-system/speaker-mmddh" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.287871 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a67364e-1ca0-4727-8650-1d1fdcfd0259-cert\") pod \"controller-68d546b9d8-7h8g2\" (UID: \"2a67364e-1ca0-4727-8650-1d1fdcfd0259\") " pod="metallb-system/controller-68d546b9d8-7h8g2" Oct 07 08:30:19 crc kubenswrapper[5025]: 
I1007 08:30:19.288194 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/28a4236b-41fc-4aba-8592-0b055eff1685-metrics-certs\") pod \"speaker-mmddh\" (UID: \"28a4236b-41fc-4aba-8592-0b055eff1685\") " pod="metallb-system/speaker-mmddh" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.288249 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/28a4236b-41fc-4aba-8592-0b055eff1685-memberlist\") pod \"speaker-mmddh\" (UID: \"28a4236b-41fc-4aba-8592-0b055eff1685\") " pod="metallb-system/speaker-mmddh" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.288484 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2a67364e-1ca0-4727-8650-1d1fdcfd0259-metrics-certs\") pod \"controller-68d546b9d8-7h8g2\" (UID: \"2a67364e-1ca0-4727-8650-1d1fdcfd0259\") " pod="metallb-system/controller-68d546b9d8-7h8g2" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.288523 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n94fn\" (UniqueName: \"kubernetes.io/projected/2a67364e-1ca0-4727-8650-1d1fdcfd0259-kube-api-access-n94fn\") pod \"controller-68d546b9d8-7h8g2\" (UID: \"2a67364e-1ca0-4727-8650-1d1fdcfd0259\") " pod="metallb-system/controller-68d546b9d8-7h8g2" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.389857 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxrjx\" (UniqueName: \"kubernetes.io/projected/28a4236b-41fc-4aba-8592-0b055eff1685-kube-api-access-lxrjx\") pod \"speaker-mmddh\" (UID: \"28a4236b-41fc-4aba-8592-0b055eff1685\") " pod="metallb-system/speaker-mmddh" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.390150 5025 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/28a4236b-41fc-4aba-8592-0b055eff1685-metallb-excludel2\") pod \"speaker-mmddh\" (UID: \"28a4236b-41fc-4aba-8592-0b055eff1685\") " pod="metallb-system/speaker-mmddh" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.390173 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a67364e-1ca0-4727-8650-1d1fdcfd0259-cert\") pod \"controller-68d546b9d8-7h8g2\" (UID: \"2a67364e-1ca0-4727-8650-1d1fdcfd0259\") " pod="metallb-system/controller-68d546b9d8-7h8g2" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.390195 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/28a4236b-41fc-4aba-8592-0b055eff1685-metrics-certs\") pod \"speaker-mmddh\" (UID: \"28a4236b-41fc-4aba-8592-0b055eff1685\") " pod="metallb-system/speaker-mmddh" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.390227 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/28a4236b-41fc-4aba-8592-0b055eff1685-memberlist\") pod \"speaker-mmddh\" (UID: \"28a4236b-41fc-4aba-8592-0b055eff1685\") " pod="metallb-system/speaker-mmddh" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.390254 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2a67364e-1ca0-4727-8650-1d1fdcfd0259-metrics-certs\") pod \"controller-68d546b9d8-7h8g2\" (UID: \"2a67364e-1ca0-4727-8650-1d1fdcfd0259\") " pod="metallb-system/controller-68d546b9d8-7h8g2" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.390276 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n94fn\" (UniqueName: 
\"kubernetes.io/projected/2a67364e-1ca0-4727-8650-1d1fdcfd0259-kube-api-access-n94fn\") pod \"controller-68d546b9d8-7h8g2\" (UID: \"2a67364e-1ca0-4727-8650-1d1fdcfd0259\") " pod="metallb-system/controller-68d546b9d8-7h8g2" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.390744 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/28a4236b-41fc-4aba-8592-0b055eff1685-metallb-excludel2\") pod \"speaker-mmddh\" (UID: \"28a4236b-41fc-4aba-8592-0b055eff1685\") " pod="metallb-system/speaker-mmddh" Oct 07 08:30:19 crc kubenswrapper[5025]: E1007 08:30:19.390807 5025 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 07 08:30:19 crc kubenswrapper[5025]: E1007 08:30:19.390842 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28a4236b-41fc-4aba-8592-0b055eff1685-memberlist podName:28a4236b-41fc-4aba-8592-0b055eff1685 nodeName:}" failed. No retries permitted until 2025-10-07 08:30:19.890830139 +0000 UTC m=+826.700144273 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/28a4236b-41fc-4aba-8592-0b055eff1685-memberlist") pod "speaker-mmddh" (UID: "28a4236b-41fc-4aba-8592-0b055eff1685") : secret "metallb-memberlist" not found Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.397915 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/28a4236b-41fc-4aba-8592-0b055eff1685-metrics-certs\") pod \"speaker-mmddh\" (UID: \"28a4236b-41fc-4aba-8592-0b055eff1685\") " pod="metallb-system/speaker-mmddh" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.397992 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a67364e-1ca0-4727-8650-1d1fdcfd0259-cert\") pod \"controller-68d546b9d8-7h8g2\" (UID: \"2a67364e-1ca0-4727-8650-1d1fdcfd0259\") " pod="metallb-system/controller-68d546b9d8-7h8g2" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.398034 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2a67364e-1ca0-4727-8650-1d1fdcfd0259-metrics-certs\") pod \"controller-68d546b9d8-7h8g2\" (UID: \"2a67364e-1ca0-4727-8650-1d1fdcfd0259\") " pod="metallb-system/controller-68d546b9d8-7h8g2" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.411191 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n94fn\" (UniqueName: \"kubernetes.io/projected/2a67364e-1ca0-4727-8650-1d1fdcfd0259-kube-api-access-n94fn\") pod \"controller-68d546b9d8-7h8g2\" (UID: \"2a67364e-1ca0-4727-8650-1d1fdcfd0259\") " pod="metallb-system/controller-68d546b9d8-7h8g2" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.415121 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxrjx\" (UniqueName: \"kubernetes.io/projected/28a4236b-41fc-4aba-8592-0b055eff1685-kube-api-access-lxrjx\") 
pod \"speaker-mmddh\" (UID: \"28a4236b-41fc-4aba-8592-0b055eff1685\") " pod="metallb-system/speaker-mmddh" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.473103 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pk9z4" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.557152 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-68d546b9d8-7h8g2" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.592392 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73118700-850f-4703-aca2-bc3b208b6f46-utilities\") pod \"73118700-850f-4703-aca2-bc3b208b6f46\" (UID: \"73118700-850f-4703-aca2-bc3b208b6f46\") " Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.592490 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73118700-850f-4703-aca2-bc3b208b6f46-catalog-content\") pod \"73118700-850f-4703-aca2-bc3b208b6f46\" (UID: \"73118700-850f-4703-aca2-bc3b208b6f46\") " Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.592592 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfm7c\" (UniqueName: \"kubernetes.io/projected/73118700-850f-4703-aca2-bc3b208b6f46-kube-api-access-kfm7c\") pod \"73118700-850f-4703-aca2-bc3b208b6f46\" (UID: \"73118700-850f-4703-aca2-bc3b208b6f46\") " Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.593333 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73118700-850f-4703-aca2-bc3b208b6f46-utilities" (OuterVolumeSpecName: "utilities") pod "73118700-850f-4703-aca2-bc3b208b6f46" (UID: "73118700-850f-4703-aca2-bc3b208b6f46"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.596680 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73118700-850f-4703-aca2-bc3b208b6f46-kube-api-access-kfm7c" (OuterVolumeSpecName: "kube-api-access-kfm7c") pod "73118700-850f-4703-aca2-bc3b208b6f46" (UID: "73118700-850f-4703-aca2-bc3b208b6f46"). InnerVolumeSpecName "kube-api-access-kfm7c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.608117 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73118700-850f-4703-aca2-bc3b208b6f46-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "73118700-850f-4703-aca2-bc3b208b6f46" (UID: "73118700-850f-4703-aca2-bc3b208b6f46"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.693862 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/027820a1-f099-4452-960d-b9d33d3eb48f-cert\") pod \"frr-k8s-webhook-server-64bf5d555-48bhp\" (UID: \"027820a1-f099-4452-960d-b9d33d3eb48f\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-48bhp" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.693941 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5a2e702e-d565-4f63-a7cf-21465cf8d4fa-metrics-certs\") pod \"frr-k8s-6z6lt\" (UID: \"5a2e702e-d565-4f63-a7cf-21465cf8d4fa\") " pod="metallb-system/frr-k8s-6z6lt" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.694002 5025 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73118700-850f-4703-aca2-bc3b208b6f46-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 08:30:19 crc 
kubenswrapper[5025]: I1007 08:30:19.694013 5025 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73118700-850f-4703-aca2-bc3b208b6f46-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.694024 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfm7c\" (UniqueName: \"kubernetes.io/projected/73118700-850f-4703-aca2-bc3b208b6f46-kube-api-access-kfm7c\") on node \"crc\" DevicePath \"\"" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.698108 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/027820a1-f099-4452-960d-b9d33d3eb48f-cert\") pod \"frr-k8s-webhook-server-64bf5d555-48bhp\" (UID: \"027820a1-f099-4452-960d-b9d33d3eb48f\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-48bhp" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.699796 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5a2e702e-d565-4f63-a7cf-21465cf8d4fa-metrics-certs\") pod \"frr-k8s-6z6lt\" (UID: \"5a2e702e-d565-4f63-a7cf-21465cf8d4fa\") " pod="metallb-system/frr-k8s-6z6lt" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.897244 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/28a4236b-41fc-4aba-8592-0b055eff1685-memberlist\") pod \"speaker-mmddh\" (UID: \"28a4236b-41fc-4aba-8592-0b055eff1685\") " pod="metallb-system/speaker-mmddh" Oct 07 08:30:19 crc kubenswrapper[5025]: E1007 08:30:19.897441 5025 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 07 08:30:19 crc kubenswrapper[5025]: E1007 08:30:19.897530 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28a4236b-41fc-4aba-8592-0b055eff1685-memberlist 
podName:28a4236b-41fc-4aba-8592-0b055eff1685 nodeName:}" failed. No retries permitted until 2025-10-07 08:30:20.897506293 +0000 UTC m=+827.706820437 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/28a4236b-41fc-4aba-8592-0b055eff1685-memberlist") pod "speaker-mmddh" (UID: "28a4236b-41fc-4aba-8592-0b055eff1685") : secret "metallb-memberlist" not found Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.936662 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-7h8g2"] Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.973268 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-48bhp" Oct 07 08:30:19 crc kubenswrapper[5025]: I1007 08:30:19.995363 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-6z6lt" Oct 07 08:30:20 crc kubenswrapper[5025]: I1007 08:30:20.070180 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-7h8g2" event={"ID":"2a67364e-1ca0-4727-8650-1d1fdcfd0259","Type":"ContainerStarted","Data":"1e720cdc7fe2a48891157101f331c6346b9d194e6376738187f203b3a5422e24"} Oct 07 08:30:20 crc kubenswrapper[5025]: I1007 08:30:20.072126 5025 generic.go:334] "Generic (PLEG): container finished" podID="73118700-850f-4703-aca2-bc3b208b6f46" containerID="4968ca38d8b02e0567a1a17c0a348708dfcced17f8c663fee7c687337d58e493" exitCode=0 Oct 07 08:30:20 crc kubenswrapper[5025]: I1007 08:30:20.072153 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pk9z4" event={"ID":"73118700-850f-4703-aca2-bc3b208b6f46","Type":"ContainerDied","Data":"4968ca38d8b02e0567a1a17c0a348708dfcced17f8c663fee7c687337d58e493"} Oct 07 08:30:20 crc kubenswrapper[5025]: I1007 08:30:20.072174 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-pk9z4" event={"ID":"73118700-850f-4703-aca2-bc3b208b6f46","Type":"ContainerDied","Data":"74d4aeb870f643f1f5683edb7c18b1fdb5e4304fe75cc40f64c0b39c2c14c9b2"} Oct 07 08:30:20 crc kubenswrapper[5025]: I1007 08:30:20.072194 5025 scope.go:117] "RemoveContainer" containerID="4968ca38d8b02e0567a1a17c0a348708dfcced17f8c663fee7c687337d58e493" Oct 07 08:30:20 crc kubenswrapper[5025]: I1007 08:30:20.072302 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pk9z4" Oct 07 08:30:20 crc kubenswrapper[5025]: I1007 08:30:20.125786 5025 scope.go:117] "RemoveContainer" containerID="4c0b77ce4f33311505c8f55ccf8a38e8c88f1b15b29f29dde3deafbe66822323" Oct 07 08:30:20 crc kubenswrapper[5025]: I1007 08:30:20.127002 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pk9z4"] Oct 07 08:30:20 crc kubenswrapper[5025]: I1007 08:30:20.130173 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pk9z4"] Oct 07 08:30:20 crc kubenswrapper[5025]: I1007 08:30:20.146852 5025 scope.go:117] "RemoveContainer" containerID="050ffdbf11a5b7ea5676352eeba318b3f0b6999a133c04346d71d4270ee947dc" Oct 07 08:30:20 crc kubenswrapper[5025]: I1007 08:30:20.169955 5025 scope.go:117] "RemoveContainer" containerID="4968ca38d8b02e0567a1a17c0a348708dfcced17f8c663fee7c687337d58e493" Oct 07 08:30:20 crc kubenswrapper[5025]: E1007 08:30:20.170435 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4968ca38d8b02e0567a1a17c0a348708dfcced17f8c663fee7c687337d58e493\": container with ID starting with 4968ca38d8b02e0567a1a17c0a348708dfcced17f8c663fee7c687337d58e493 not found: ID does not exist" containerID="4968ca38d8b02e0567a1a17c0a348708dfcced17f8c663fee7c687337d58e493" Oct 07 08:30:20 crc kubenswrapper[5025]: I1007 08:30:20.170474 5025 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4968ca38d8b02e0567a1a17c0a348708dfcced17f8c663fee7c687337d58e493"} err="failed to get container status \"4968ca38d8b02e0567a1a17c0a348708dfcced17f8c663fee7c687337d58e493\": rpc error: code = NotFound desc = could not find container \"4968ca38d8b02e0567a1a17c0a348708dfcced17f8c663fee7c687337d58e493\": container with ID starting with 4968ca38d8b02e0567a1a17c0a348708dfcced17f8c663fee7c687337d58e493 not found: ID does not exist" Oct 07 08:30:20 crc kubenswrapper[5025]: I1007 08:30:20.170498 5025 scope.go:117] "RemoveContainer" containerID="4c0b77ce4f33311505c8f55ccf8a38e8c88f1b15b29f29dde3deafbe66822323" Oct 07 08:30:20 crc kubenswrapper[5025]: E1007 08:30:20.171714 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c0b77ce4f33311505c8f55ccf8a38e8c88f1b15b29f29dde3deafbe66822323\": container with ID starting with 4c0b77ce4f33311505c8f55ccf8a38e8c88f1b15b29f29dde3deafbe66822323 not found: ID does not exist" containerID="4c0b77ce4f33311505c8f55ccf8a38e8c88f1b15b29f29dde3deafbe66822323" Oct 07 08:30:20 crc kubenswrapper[5025]: I1007 08:30:20.171912 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c0b77ce4f33311505c8f55ccf8a38e8c88f1b15b29f29dde3deafbe66822323"} err="failed to get container status \"4c0b77ce4f33311505c8f55ccf8a38e8c88f1b15b29f29dde3deafbe66822323\": rpc error: code = NotFound desc = could not find container \"4c0b77ce4f33311505c8f55ccf8a38e8c88f1b15b29f29dde3deafbe66822323\": container with ID starting with 4c0b77ce4f33311505c8f55ccf8a38e8c88f1b15b29f29dde3deafbe66822323 not found: ID does not exist" Oct 07 08:30:20 crc kubenswrapper[5025]: I1007 08:30:20.171928 5025 scope.go:117] "RemoveContainer" containerID="050ffdbf11a5b7ea5676352eeba318b3f0b6999a133c04346d71d4270ee947dc" Oct 07 08:30:20 crc kubenswrapper[5025]: E1007 
08:30:20.172634 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"050ffdbf11a5b7ea5676352eeba318b3f0b6999a133c04346d71d4270ee947dc\": container with ID starting with 050ffdbf11a5b7ea5676352eeba318b3f0b6999a133c04346d71d4270ee947dc not found: ID does not exist" containerID="050ffdbf11a5b7ea5676352eeba318b3f0b6999a133c04346d71d4270ee947dc" Oct 07 08:30:20 crc kubenswrapper[5025]: I1007 08:30:20.172662 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"050ffdbf11a5b7ea5676352eeba318b3f0b6999a133c04346d71d4270ee947dc"} err="failed to get container status \"050ffdbf11a5b7ea5676352eeba318b3f0b6999a133c04346d71d4270ee947dc\": rpc error: code = NotFound desc = could not find container \"050ffdbf11a5b7ea5676352eeba318b3f0b6999a133c04346d71d4270ee947dc\": container with ID starting with 050ffdbf11a5b7ea5676352eeba318b3f0b6999a133c04346d71d4270ee947dc not found: ID does not exist" Oct 07 08:30:20 crc kubenswrapper[5025]: I1007 08:30:20.272162 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-89wwr"] Oct 07 08:30:20 crc kubenswrapper[5025]: E1007 08:30:20.272377 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73118700-850f-4703-aca2-bc3b208b6f46" containerName="extract-content" Oct 07 08:30:20 crc kubenswrapper[5025]: I1007 08:30:20.272391 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="73118700-850f-4703-aca2-bc3b208b6f46" containerName="extract-content" Oct 07 08:30:20 crc kubenswrapper[5025]: E1007 08:30:20.272407 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73118700-850f-4703-aca2-bc3b208b6f46" containerName="extract-utilities" Oct 07 08:30:20 crc kubenswrapper[5025]: I1007 08:30:20.272414 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="73118700-850f-4703-aca2-bc3b208b6f46" containerName="extract-utilities" Oct 07 08:30:20 crc 
kubenswrapper[5025]: E1007 08:30:20.272424 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73118700-850f-4703-aca2-bc3b208b6f46" containerName="registry-server" Oct 07 08:30:20 crc kubenswrapper[5025]: I1007 08:30:20.272430 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="73118700-850f-4703-aca2-bc3b208b6f46" containerName="registry-server" Oct 07 08:30:20 crc kubenswrapper[5025]: I1007 08:30:20.272520 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="73118700-850f-4703-aca2-bc3b208b6f46" containerName="registry-server" Oct 07 08:30:20 crc kubenswrapper[5025]: I1007 08:30:20.273199 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-89wwr" Oct 07 08:30:20 crc kubenswrapper[5025]: I1007 08:30:20.283215 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-89wwr"] Oct 07 08:30:20 crc kubenswrapper[5025]: I1007 08:30:20.403045 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccb408bb-4a59-4fc2-8060-e535c53bcc70-catalog-content\") pod \"community-operators-89wwr\" (UID: \"ccb408bb-4a59-4fc2-8060-e535c53bcc70\") " pod="openshift-marketplace/community-operators-89wwr" Oct 07 08:30:20 crc kubenswrapper[5025]: I1007 08:30:20.403127 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slzkm\" (UniqueName: \"kubernetes.io/projected/ccb408bb-4a59-4fc2-8060-e535c53bcc70-kube-api-access-slzkm\") pod \"community-operators-89wwr\" (UID: \"ccb408bb-4a59-4fc2-8060-e535c53bcc70\") " pod="openshift-marketplace/community-operators-89wwr" Oct 07 08:30:20 crc kubenswrapper[5025]: I1007 08:30:20.403166 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/ccb408bb-4a59-4fc2-8060-e535c53bcc70-utilities\") pod \"community-operators-89wwr\" (UID: \"ccb408bb-4a59-4fc2-8060-e535c53bcc70\") " pod="openshift-marketplace/community-operators-89wwr" Oct 07 08:30:20 crc kubenswrapper[5025]: I1007 08:30:20.407435 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-48bhp"] Oct 07 08:30:20 crc kubenswrapper[5025]: I1007 08:30:20.504414 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccb408bb-4a59-4fc2-8060-e535c53bcc70-utilities\") pod \"community-operators-89wwr\" (UID: \"ccb408bb-4a59-4fc2-8060-e535c53bcc70\") " pod="openshift-marketplace/community-operators-89wwr" Oct 07 08:30:20 crc kubenswrapper[5025]: I1007 08:30:20.504495 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccb408bb-4a59-4fc2-8060-e535c53bcc70-catalog-content\") pod \"community-operators-89wwr\" (UID: \"ccb408bb-4a59-4fc2-8060-e535c53bcc70\") " pod="openshift-marketplace/community-operators-89wwr" Oct 07 08:30:20 crc kubenswrapper[5025]: I1007 08:30:20.504587 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slzkm\" (UniqueName: \"kubernetes.io/projected/ccb408bb-4a59-4fc2-8060-e535c53bcc70-kube-api-access-slzkm\") pod \"community-operators-89wwr\" (UID: \"ccb408bb-4a59-4fc2-8060-e535c53bcc70\") " pod="openshift-marketplace/community-operators-89wwr" Oct 07 08:30:20 crc kubenswrapper[5025]: I1007 08:30:20.504876 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccb408bb-4a59-4fc2-8060-e535c53bcc70-utilities\") pod \"community-operators-89wwr\" (UID: \"ccb408bb-4a59-4fc2-8060-e535c53bcc70\") " pod="openshift-marketplace/community-operators-89wwr" Oct 07 08:30:20 crc 
kubenswrapper[5025]: I1007 08:30:20.505063 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccb408bb-4a59-4fc2-8060-e535c53bcc70-catalog-content\") pod \"community-operators-89wwr\" (UID: \"ccb408bb-4a59-4fc2-8060-e535c53bcc70\") " pod="openshift-marketplace/community-operators-89wwr" Oct 07 08:30:20 crc kubenswrapper[5025]: I1007 08:30:20.524314 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slzkm\" (UniqueName: \"kubernetes.io/projected/ccb408bb-4a59-4fc2-8060-e535c53bcc70-kube-api-access-slzkm\") pod \"community-operators-89wwr\" (UID: \"ccb408bb-4a59-4fc2-8060-e535c53bcc70\") " pod="openshift-marketplace/community-operators-89wwr" Oct 07 08:30:20 crc kubenswrapper[5025]: I1007 08:30:20.596087 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-89wwr" Oct 07 08:30:20 crc kubenswrapper[5025]: I1007 08:30:20.909327 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/28a4236b-41fc-4aba-8592-0b055eff1685-memberlist\") pod \"speaker-mmddh\" (UID: \"28a4236b-41fc-4aba-8592-0b055eff1685\") " pod="metallb-system/speaker-mmddh" Oct 07 08:30:20 crc kubenswrapper[5025]: I1007 08:30:20.914533 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/28a4236b-41fc-4aba-8592-0b055eff1685-memberlist\") pod \"speaker-mmddh\" (UID: \"28a4236b-41fc-4aba-8592-0b055eff1685\") " pod="metallb-system/speaker-mmddh" Oct 07 08:30:21 crc kubenswrapper[5025]: I1007 08:30:21.038553 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-89wwr"] Oct 07 08:30:21 crc kubenswrapper[5025]: I1007 08:30:21.046216 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-mmddh" Oct 07 08:30:21 crc kubenswrapper[5025]: W1007 08:30:21.075110 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28a4236b_41fc_4aba_8592_0b055eff1685.slice/crio-d55bd93785438c155ee22cd766ca6bd3fac111be6937cb64b65e4f75c51725fd WatchSource:0}: Error finding container d55bd93785438c155ee22cd766ca6bd3fac111be6937cb64b65e4f75c51725fd: Status 404 returned error can't find the container with id d55bd93785438c155ee22cd766ca6bd3fac111be6937cb64b65e4f75c51725fd Oct 07 08:30:21 crc kubenswrapper[5025]: I1007 08:30:21.085165 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-89wwr" event={"ID":"ccb408bb-4a59-4fc2-8060-e535c53bcc70","Type":"ContainerStarted","Data":"8e5695a7997465a8b7a8a9f6767021d05fe95f8659411077f9a3626fd77911ee"} Oct 07 08:30:21 crc kubenswrapper[5025]: I1007 08:30:21.086320 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-48bhp" event={"ID":"027820a1-f099-4452-960d-b9d33d3eb48f","Type":"ContainerStarted","Data":"e0b51f5729bdcdd45b0993b57f8dd2192c03bb9d4a63d6be141858c009e9f8ed"} Oct 07 08:30:21 crc kubenswrapper[5025]: I1007 08:30:21.090076 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-7h8g2" event={"ID":"2a67364e-1ca0-4727-8650-1d1fdcfd0259","Type":"ContainerStarted","Data":"086b4ec85eec6ca71014a830a34a8e1aa42b0a4af747d387672f478062f89436"} Oct 07 08:30:21 crc kubenswrapper[5025]: I1007 08:30:21.090138 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-7h8g2" event={"ID":"2a67364e-1ca0-4727-8650-1d1fdcfd0259","Type":"ContainerStarted","Data":"b082f07ad9d65d233b35ab24a703e7647422fb9df70a7e78153b7c88f146a1d4"} Oct 07 08:30:21 crc kubenswrapper[5025]: I1007 08:30:21.090196 5025 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="metallb-system/controller-68d546b9d8-7h8g2" Oct 07 08:30:21 crc kubenswrapper[5025]: I1007 08:30:21.092116 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6z6lt" event={"ID":"5a2e702e-d565-4f63-a7cf-21465cf8d4fa","Type":"ContainerStarted","Data":"4bd8cd3a4821006c510050872412d0f76cac39f74f6ffab5c57014a3cbbd28c5"} Oct 07 08:30:21 crc kubenswrapper[5025]: I1007 08:30:21.113018 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-7h8g2" podStartSLOduration=2.112994151 podStartE2EDuration="2.112994151s" podCreationTimestamp="2025-10-07 08:30:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:30:21.109337795 +0000 UTC m=+827.918651959" watchObservedRunningTime="2025-10-07 08:30:21.112994151 +0000 UTC m=+827.922308315" Oct 07 08:30:21 crc kubenswrapper[5025]: I1007 08:30:21.925489 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73118700-850f-4703-aca2-bc3b208b6f46" path="/var/lib/kubelet/pods/73118700-850f-4703-aca2-bc3b208b6f46/volumes" Oct 07 08:30:22 crc kubenswrapper[5025]: I1007 08:30:22.108422 5025 generic.go:334] "Generic (PLEG): container finished" podID="ccb408bb-4a59-4fc2-8060-e535c53bcc70" containerID="1bfea52255576555ae9b3861c33addc4bb2ea134f59cc67b4229b93c4f4a6446" exitCode=0 Oct 07 08:30:22 crc kubenswrapper[5025]: I1007 08:30:22.108496 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-89wwr" event={"ID":"ccb408bb-4a59-4fc2-8060-e535c53bcc70","Type":"ContainerDied","Data":"1bfea52255576555ae9b3861c33addc4bb2ea134f59cc67b4229b93c4f4a6446"} Oct 07 08:30:22 crc kubenswrapper[5025]: I1007 08:30:22.114658 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-mmddh" 
event={"ID":"28a4236b-41fc-4aba-8592-0b055eff1685","Type":"ContainerStarted","Data":"642616d9b03c88aa2842bc4aa979f3c29b4a75f5725b10f3888f2aa846a3463f"} Oct 07 08:30:22 crc kubenswrapper[5025]: I1007 08:30:22.114733 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-mmddh" event={"ID":"28a4236b-41fc-4aba-8592-0b055eff1685","Type":"ContainerStarted","Data":"39e9761f9d124770d552395b2bdf1d83c0e16ef20a8e5b0cf3aa6e32b25a7921"} Oct 07 08:30:22 crc kubenswrapper[5025]: I1007 08:30:22.114746 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-mmddh" event={"ID":"28a4236b-41fc-4aba-8592-0b055eff1685","Type":"ContainerStarted","Data":"d55bd93785438c155ee22cd766ca6bd3fac111be6937cb64b65e4f75c51725fd"} Oct 07 08:30:22 crc kubenswrapper[5025]: I1007 08:30:22.114986 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-mmddh" Oct 07 08:30:22 crc kubenswrapper[5025]: I1007 08:30:22.151306 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-mmddh" podStartSLOduration=3.151290238 podStartE2EDuration="3.151290238s" podCreationTimestamp="2025-10-07 08:30:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:30:22.146464606 +0000 UTC m=+828.955778750" watchObservedRunningTime="2025-10-07 08:30:22.151290238 +0000 UTC m=+828.960604382" Oct 07 08:30:23 crc kubenswrapper[5025]: I1007 08:30:23.123606 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-89wwr" event={"ID":"ccb408bb-4a59-4fc2-8060-e535c53bcc70","Type":"ContainerStarted","Data":"0eb2ab453a6fd8246a1623e076d111674226e7e3a6171c9ebda389f5d291034f"} Oct 07 08:30:24 crc kubenswrapper[5025]: I1007 08:30:24.132271 5025 generic.go:334] "Generic (PLEG): container finished" podID="ccb408bb-4a59-4fc2-8060-e535c53bcc70" 
containerID="0eb2ab453a6fd8246a1623e076d111674226e7e3a6171c9ebda389f5d291034f" exitCode=0 Oct 07 08:30:24 crc kubenswrapper[5025]: I1007 08:30:24.132322 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-89wwr" event={"ID":"ccb408bb-4a59-4fc2-8060-e535c53bcc70","Type":"ContainerDied","Data":"0eb2ab453a6fd8246a1623e076d111674226e7e3a6171c9ebda389f5d291034f"} Oct 07 08:30:25 crc kubenswrapper[5025]: I1007 08:30:25.144959 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-89wwr" event={"ID":"ccb408bb-4a59-4fc2-8060-e535c53bcc70","Type":"ContainerStarted","Data":"36115b61844f1b10faa093ce17f3508ffb71f880145fe925ee73970914052a42"} Oct 07 08:30:25 crc kubenswrapper[5025]: I1007 08:30:25.167964 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-89wwr" podStartSLOduration=2.720647423 podStartE2EDuration="5.167947185s" podCreationTimestamp="2025-10-07 08:30:20 +0000 UTC" firstStartedPulling="2025-10-07 08:30:22.110138397 +0000 UTC m=+828.919452541" lastFinishedPulling="2025-10-07 08:30:24.557438159 +0000 UTC m=+831.366752303" observedRunningTime="2025-10-07 08:30:25.163770813 +0000 UTC m=+831.973084987" watchObservedRunningTime="2025-10-07 08:30:25.167947185 +0000 UTC m=+831.977261349" Oct 07 08:30:25 crc kubenswrapper[5025]: I1007 08:30:25.934478 5025 patch_prober.go:28] interesting pod/machine-config-daemon-2dj2t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 08:30:25 crc kubenswrapper[5025]: I1007 08:30:25.934565 5025 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 08:30:25 crc kubenswrapper[5025]: I1007 08:30:25.934613 5025 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" Oct 07 08:30:25 crc kubenswrapper[5025]: I1007 08:30:25.935242 5025 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e73f6c9680aced0c67b362a72f57135c1bc51cd7c5425767d6e7d7481054ac9b"} pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 08:30:25 crc kubenswrapper[5025]: I1007 08:30:25.935309 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" containerID="cri-o://e73f6c9680aced0c67b362a72f57135c1bc51cd7c5425767d6e7d7481054ac9b" gracePeriod=600 Oct 07 08:30:26 crc kubenswrapper[5025]: I1007 08:30:26.151698 5025 generic.go:334] "Generic (PLEG): container finished" podID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerID="e73f6c9680aced0c67b362a72f57135c1bc51cd7c5425767d6e7d7481054ac9b" exitCode=0 Oct 07 08:30:26 crc kubenswrapper[5025]: I1007 08:30:26.152386 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" event={"ID":"b4849c41-22e1-400e-8e11-096da49ef1b2","Type":"ContainerDied","Data":"e73f6c9680aced0c67b362a72f57135c1bc51cd7c5425767d6e7d7481054ac9b"} Oct 07 08:30:26 crc kubenswrapper[5025]: I1007 08:30:26.152416 5025 scope.go:117] "RemoveContainer" containerID="4bbebcd2746f0ec67146ae8e43e7aa013e115cc6aa41a0609e38fed295949dae" Oct 07 08:30:28 crc kubenswrapper[5025]: I1007 08:30:28.171013 5025 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" event={"ID":"b4849c41-22e1-400e-8e11-096da49ef1b2","Type":"ContainerStarted","Data":"c91f90218018dc6579e30a1d0a59af84fe0152b7bc9f23fa2787bc6ffa336d31"} Oct 07 08:30:28 crc kubenswrapper[5025]: I1007 08:30:28.173349 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-48bhp" event={"ID":"027820a1-f099-4452-960d-b9d33d3eb48f","Type":"ContainerStarted","Data":"308a75972ebc6c869ea232405e073069fcfebcab3b2b9d8950987a1ed6e8781a"} Oct 07 08:30:28 crc kubenswrapper[5025]: I1007 08:30:28.173580 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-48bhp" Oct 07 08:30:28 crc kubenswrapper[5025]: I1007 08:30:28.175469 5025 generic.go:334] "Generic (PLEG): container finished" podID="5a2e702e-d565-4f63-a7cf-21465cf8d4fa" containerID="c5abdc599f8d455a90e63bcc9bc23bea93e36afdb8da3b88c9c2b5847d2e9d9d" exitCode=0 Oct 07 08:30:28 crc kubenswrapper[5025]: I1007 08:30:28.175516 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6z6lt" event={"ID":"5a2e702e-d565-4f63-a7cf-21465cf8d4fa","Type":"ContainerDied","Data":"c5abdc599f8d455a90e63bcc9bc23bea93e36afdb8da3b88c9c2b5847d2e9d9d"} Oct 07 08:30:28 crc kubenswrapper[5025]: I1007 08:30:28.237753 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-48bhp" podStartSLOduration=2.49609398 podStartE2EDuration="9.237735181s" podCreationTimestamp="2025-10-07 08:30:19 +0000 UTC" firstStartedPulling="2025-10-07 08:30:20.425506002 +0000 UTC m=+827.234820146" lastFinishedPulling="2025-10-07 08:30:27.167147203 +0000 UTC m=+833.976461347" observedRunningTime="2025-10-07 08:30:28.235953275 +0000 UTC m=+835.045267419" watchObservedRunningTime="2025-10-07 08:30:28.237735181 +0000 UTC m=+835.047049325" Oct 07 
08:30:29 crc kubenswrapper[5025]: I1007 08:30:29.182663 5025 generic.go:334] "Generic (PLEG): container finished" podID="5a2e702e-d565-4f63-a7cf-21465cf8d4fa" containerID="5646f823285d74fdfff03f0a683c605a66760c8387b7007a95e2cbc423e85307" exitCode=0 Oct 07 08:30:29 crc kubenswrapper[5025]: I1007 08:30:29.182723 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6z6lt" event={"ID":"5a2e702e-d565-4f63-a7cf-21465cf8d4fa","Type":"ContainerDied","Data":"5646f823285d74fdfff03f0a683c605a66760c8387b7007a95e2cbc423e85307"} Oct 07 08:30:30 crc kubenswrapper[5025]: I1007 08:30:30.192828 5025 generic.go:334] "Generic (PLEG): container finished" podID="5a2e702e-d565-4f63-a7cf-21465cf8d4fa" containerID="0d4d45b0aa181cf75952f59cf28c0c9532be78481327bc5f35d77617c55c128e" exitCode=0 Oct 07 08:30:30 crc kubenswrapper[5025]: I1007 08:30:30.192877 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6z6lt" event={"ID":"5a2e702e-d565-4f63-a7cf-21465cf8d4fa","Type":"ContainerDied","Data":"0d4d45b0aa181cf75952f59cf28c0c9532be78481327bc5f35d77617c55c128e"} Oct 07 08:30:30 crc kubenswrapper[5025]: I1007 08:30:30.596649 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-89wwr" Oct 07 08:30:30 crc kubenswrapper[5025]: I1007 08:30:30.596965 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-89wwr" Oct 07 08:30:30 crc kubenswrapper[5025]: I1007 08:30:30.641853 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-89wwr" Oct 07 08:30:31 crc kubenswrapper[5025]: I1007 08:30:31.054934 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-mmddh" Oct 07 08:30:31 crc kubenswrapper[5025]: I1007 08:30:31.211944 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-6z6lt" event={"ID":"5a2e702e-d565-4f63-a7cf-21465cf8d4fa","Type":"ContainerStarted","Data":"34684b4242cb08bfb4804a5767c51686f0583072d383665d16f7cfc7d26955a3"} Oct 07 08:30:31 crc kubenswrapper[5025]: I1007 08:30:31.211993 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6z6lt" event={"ID":"5a2e702e-d565-4f63-a7cf-21465cf8d4fa","Type":"ContainerStarted","Data":"05435f3ad6f65d597e464a5354c0127a3d05b5f626db24c342b484cd5a9aa2e0"} Oct 07 08:30:31 crc kubenswrapper[5025]: I1007 08:30:31.212008 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6z6lt" event={"ID":"5a2e702e-d565-4f63-a7cf-21465cf8d4fa","Type":"ContainerStarted","Data":"aabda687c6c3025f87d3820ca7973c80191a20b6dd4ba843c1caf598498c1c51"} Oct 07 08:30:31 crc kubenswrapper[5025]: I1007 08:30:31.212020 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6z6lt" event={"ID":"5a2e702e-d565-4f63-a7cf-21465cf8d4fa","Type":"ContainerStarted","Data":"d4a0fb4e226e8b561ee4034eda0c063d1ab2f7599d918820416f6a65c1a3c130"} Oct 07 08:30:31 crc kubenswrapper[5025]: I1007 08:30:31.212033 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6z6lt" event={"ID":"5a2e702e-d565-4f63-a7cf-21465cf8d4fa","Type":"ContainerStarted","Data":"c3104fa7dc99b30a91abcc3f735659f87037def53c74758119fc03f1ce599571"} Oct 07 08:30:31 crc kubenswrapper[5025]: I1007 08:30:31.212045 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6z6lt" event={"ID":"5a2e702e-d565-4f63-a7cf-21465cf8d4fa","Type":"ContainerStarted","Data":"51f580ca2e8226cab8b185c8dc9bde118e408befe37e75e3f1f4bf223ab78695"} Oct 07 08:30:31 crc kubenswrapper[5025]: I1007 08:30:31.238118 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-6z6lt" podStartSLOduration=5.25788954 podStartE2EDuration="12.238102752s" podCreationTimestamp="2025-10-07 08:30:19 +0000 
UTC" firstStartedPulling="2025-10-07 08:30:20.169739858 +0000 UTC m=+826.979054002" lastFinishedPulling="2025-10-07 08:30:27.14995307 +0000 UTC m=+833.959267214" observedRunningTime="2025-10-07 08:30:31.229639285 +0000 UTC m=+838.038953429" watchObservedRunningTime="2025-10-07 08:30:31.238102752 +0000 UTC m=+838.047416896" Oct 07 08:30:31 crc kubenswrapper[5025]: I1007 08:30:31.259417 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-89wwr" Oct 07 08:30:31 crc kubenswrapper[5025]: I1007 08:30:31.296660 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-89wwr"] Oct 07 08:30:32 crc kubenswrapper[5025]: I1007 08:30:32.216893 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-6z6lt" Oct 07 08:30:32 crc kubenswrapper[5025]: I1007 08:30:32.528607 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69jgvp7"] Oct 07 08:30:32 crc kubenswrapper[5025]: I1007 08:30:32.530059 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69jgvp7" Oct 07 08:30:32 crc kubenswrapper[5025]: I1007 08:30:32.532800 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 07 08:30:32 crc kubenswrapper[5025]: I1007 08:30:32.541206 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69jgvp7"] Oct 07 08:30:32 crc kubenswrapper[5025]: I1007 08:30:32.721865 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fd984570-be38-4ee8-b94d-be13506a255c-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69jgvp7\" (UID: \"fd984570-be38-4ee8-b94d-be13506a255c\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69jgvp7" Oct 07 08:30:32 crc kubenswrapper[5025]: I1007 08:30:32.722299 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hppdv\" (UniqueName: \"kubernetes.io/projected/fd984570-be38-4ee8-b94d-be13506a255c-kube-api-access-hppdv\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69jgvp7\" (UID: \"fd984570-be38-4ee8-b94d-be13506a255c\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69jgvp7" Oct 07 08:30:32 crc kubenswrapper[5025]: I1007 08:30:32.722492 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fd984570-be38-4ee8-b94d-be13506a255c-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69jgvp7\" (UID: \"fd984570-be38-4ee8-b94d-be13506a255c\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69jgvp7" Oct 07 08:30:32 crc kubenswrapper[5025]: 
I1007 08:30:32.823894 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fd984570-be38-4ee8-b94d-be13506a255c-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69jgvp7\" (UID: \"fd984570-be38-4ee8-b94d-be13506a255c\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69jgvp7" Oct 07 08:30:32 crc kubenswrapper[5025]: I1007 08:30:32.824087 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hppdv\" (UniqueName: \"kubernetes.io/projected/fd984570-be38-4ee8-b94d-be13506a255c-kube-api-access-hppdv\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69jgvp7\" (UID: \"fd984570-be38-4ee8-b94d-be13506a255c\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69jgvp7" Oct 07 08:30:32 crc kubenswrapper[5025]: I1007 08:30:32.824222 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fd984570-be38-4ee8-b94d-be13506a255c-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69jgvp7\" (UID: \"fd984570-be38-4ee8-b94d-be13506a255c\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69jgvp7" Oct 07 08:30:32 crc kubenswrapper[5025]: I1007 08:30:32.825020 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fd984570-be38-4ee8-b94d-be13506a255c-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69jgvp7\" (UID: \"fd984570-be38-4ee8-b94d-be13506a255c\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69jgvp7" Oct 07 08:30:32 crc kubenswrapper[5025]: I1007 08:30:32.825069 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/fd984570-be38-4ee8-b94d-be13506a255c-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69jgvp7\" (UID: \"fd984570-be38-4ee8-b94d-be13506a255c\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69jgvp7" Oct 07 08:30:32 crc kubenswrapper[5025]: I1007 08:30:32.852416 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hppdv\" (UniqueName: \"kubernetes.io/projected/fd984570-be38-4ee8-b94d-be13506a255c-kube-api-access-hppdv\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69jgvp7\" (UID: \"fd984570-be38-4ee8-b94d-be13506a255c\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69jgvp7" Oct 07 08:30:33 crc kubenswrapper[5025]: I1007 08:30:33.149414 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69jgvp7" Oct 07 08:30:33 crc kubenswrapper[5025]: I1007 08:30:33.228361 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-89wwr" podUID="ccb408bb-4a59-4fc2-8060-e535c53bcc70" containerName="registry-server" containerID="cri-o://36115b61844f1b10faa093ce17f3508ffb71f880145fe925ee73970914052a42" gracePeriod=2 Oct 07 08:30:33 crc kubenswrapper[5025]: I1007 08:30:33.548281 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69jgvp7"] Oct 07 08:30:33 crc kubenswrapper[5025]: W1007 08:30:33.554009 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd984570_be38_4ee8_b94d_be13506a255c.slice/crio-1bf1465d8220cb0c1d83918302be9f45a94e16db6ac313710141bf62f8db7508 WatchSource:0}: Error finding container 1bf1465d8220cb0c1d83918302be9f45a94e16db6ac313710141bf62f8db7508: Status 
404 returned error can't find the container with id 1bf1465d8220cb0c1d83918302be9f45a94e16db6ac313710141bf62f8db7508 Oct 07 08:30:33 crc kubenswrapper[5025]: I1007 08:30:33.585159 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-89wwr" Oct 07 08:30:33 crc kubenswrapper[5025]: I1007 08:30:33.734754 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccb408bb-4a59-4fc2-8060-e535c53bcc70-utilities\") pod \"ccb408bb-4a59-4fc2-8060-e535c53bcc70\" (UID: \"ccb408bb-4a59-4fc2-8060-e535c53bcc70\") " Oct 07 08:30:33 crc kubenswrapper[5025]: I1007 08:30:33.735061 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slzkm\" (UniqueName: \"kubernetes.io/projected/ccb408bb-4a59-4fc2-8060-e535c53bcc70-kube-api-access-slzkm\") pod \"ccb408bb-4a59-4fc2-8060-e535c53bcc70\" (UID: \"ccb408bb-4a59-4fc2-8060-e535c53bcc70\") " Oct 07 08:30:33 crc kubenswrapper[5025]: I1007 08:30:33.735124 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccb408bb-4a59-4fc2-8060-e535c53bcc70-catalog-content\") pod \"ccb408bb-4a59-4fc2-8060-e535c53bcc70\" (UID: \"ccb408bb-4a59-4fc2-8060-e535c53bcc70\") " Oct 07 08:30:33 crc kubenswrapper[5025]: I1007 08:30:33.735881 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccb408bb-4a59-4fc2-8060-e535c53bcc70-utilities" (OuterVolumeSpecName: "utilities") pod "ccb408bb-4a59-4fc2-8060-e535c53bcc70" (UID: "ccb408bb-4a59-4fc2-8060-e535c53bcc70"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:30:33 crc kubenswrapper[5025]: I1007 08:30:33.742896 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccb408bb-4a59-4fc2-8060-e535c53bcc70-kube-api-access-slzkm" (OuterVolumeSpecName: "kube-api-access-slzkm") pod "ccb408bb-4a59-4fc2-8060-e535c53bcc70" (UID: "ccb408bb-4a59-4fc2-8060-e535c53bcc70"). InnerVolumeSpecName "kube-api-access-slzkm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:30:33 crc kubenswrapper[5025]: I1007 08:30:33.793881 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccb408bb-4a59-4fc2-8060-e535c53bcc70-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ccb408bb-4a59-4fc2-8060-e535c53bcc70" (UID: "ccb408bb-4a59-4fc2-8060-e535c53bcc70"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:30:33 crc kubenswrapper[5025]: I1007 08:30:33.836960 5025 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccb408bb-4a59-4fc2-8060-e535c53bcc70-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 08:30:33 crc kubenswrapper[5025]: I1007 08:30:33.837004 5025 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccb408bb-4a59-4fc2-8060-e535c53bcc70-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 08:30:33 crc kubenswrapper[5025]: I1007 08:30:33.837019 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slzkm\" (UniqueName: \"kubernetes.io/projected/ccb408bb-4a59-4fc2-8060-e535c53bcc70-kube-api-access-slzkm\") on node \"crc\" DevicePath \"\"" Oct 07 08:30:34 crc kubenswrapper[5025]: I1007 08:30:34.235821 5025 generic.go:334] "Generic (PLEG): container finished" podID="ccb408bb-4a59-4fc2-8060-e535c53bcc70" 
containerID="36115b61844f1b10faa093ce17f3508ffb71f880145fe925ee73970914052a42" exitCode=0 Oct 07 08:30:34 crc kubenswrapper[5025]: I1007 08:30:34.236628 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-89wwr" event={"ID":"ccb408bb-4a59-4fc2-8060-e535c53bcc70","Type":"ContainerDied","Data":"36115b61844f1b10faa093ce17f3508ffb71f880145fe925ee73970914052a42"} Oct 07 08:30:34 crc kubenswrapper[5025]: I1007 08:30:34.236658 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-89wwr" event={"ID":"ccb408bb-4a59-4fc2-8060-e535c53bcc70","Type":"ContainerDied","Data":"8e5695a7997465a8b7a8a9f6767021d05fe95f8659411077f9a3626fd77911ee"} Oct 07 08:30:34 crc kubenswrapper[5025]: I1007 08:30:34.236675 5025 scope.go:117] "RemoveContainer" containerID="36115b61844f1b10faa093ce17f3508ffb71f880145fe925ee73970914052a42" Oct 07 08:30:34 crc kubenswrapper[5025]: I1007 08:30:34.236746 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-89wwr" Oct 07 08:30:34 crc kubenswrapper[5025]: I1007 08:30:34.241323 5025 generic.go:334] "Generic (PLEG): container finished" podID="fd984570-be38-4ee8-b94d-be13506a255c" containerID="de9a8c13455d8d9ba0eba4e23401ecba311cb46794a6c9d868aefe33aa12f9fa" exitCode=0 Oct 07 08:30:34 crc kubenswrapper[5025]: I1007 08:30:34.241355 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69jgvp7" event={"ID":"fd984570-be38-4ee8-b94d-be13506a255c","Type":"ContainerDied","Data":"de9a8c13455d8d9ba0eba4e23401ecba311cb46794a6c9d868aefe33aa12f9fa"} Oct 07 08:30:34 crc kubenswrapper[5025]: I1007 08:30:34.241378 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69jgvp7" event={"ID":"fd984570-be38-4ee8-b94d-be13506a255c","Type":"ContainerStarted","Data":"1bf1465d8220cb0c1d83918302be9f45a94e16db6ac313710141bf62f8db7508"} Oct 07 08:30:34 crc kubenswrapper[5025]: I1007 08:30:34.255892 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-89wwr"] Oct 07 08:30:34 crc kubenswrapper[5025]: I1007 08:30:34.260030 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-89wwr"] Oct 07 08:30:34 crc kubenswrapper[5025]: I1007 08:30:34.267705 5025 scope.go:117] "RemoveContainer" containerID="0eb2ab453a6fd8246a1623e076d111674226e7e3a6171c9ebda389f5d291034f" Oct 07 08:30:34 crc kubenswrapper[5025]: I1007 08:30:34.289370 5025 scope.go:117] "RemoveContainer" containerID="1bfea52255576555ae9b3861c33addc4bb2ea134f59cc67b4229b93c4f4a6446" Oct 07 08:30:34 crc kubenswrapper[5025]: I1007 08:30:34.323066 5025 scope.go:117] "RemoveContainer" containerID="36115b61844f1b10faa093ce17f3508ffb71f880145fe925ee73970914052a42" Oct 07 08:30:34 crc kubenswrapper[5025]: E1007 
08:30:34.323506 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36115b61844f1b10faa093ce17f3508ffb71f880145fe925ee73970914052a42\": container with ID starting with 36115b61844f1b10faa093ce17f3508ffb71f880145fe925ee73970914052a42 not found: ID does not exist" containerID="36115b61844f1b10faa093ce17f3508ffb71f880145fe925ee73970914052a42" Oct 07 08:30:34 crc kubenswrapper[5025]: I1007 08:30:34.323567 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36115b61844f1b10faa093ce17f3508ffb71f880145fe925ee73970914052a42"} err="failed to get container status \"36115b61844f1b10faa093ce17f3508ffb71f880145fe925ee73970914052a42\": rpc error: code = NotFound desc = could not find container \"36115b61844f1b10faa093ce17f3508ffb71f880145fe925ee73970914052a42\": container with ID starting with 36115b61844f1b10faa093ce17f3508ffb71f880145fe925ee73970914052a42 not found: ID does not exist" Oct 07 08:30:34 crc kubenswrapper[5025]: I1007 08:30:34.323588 5025 scope.go:117] "RemoveContainer" containerID="0eb2ab453a6fd8246a1623e076d111674226e7e3a6171c9ebda389f5d291034f" Oct 07 08:30:34 crc kubenswrapper[5025]: E1007 08:30:34.323857 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0eb2ab453a6fd8246a1623e076d111674226e7e3a6171c9ebda389f5d291034f\": container with ID starting with 0eb2ab453a6fd8246a1623e076d111674226e7e3a6171c9ebda389f5d291034f not found: ID does not exist" containerID="0eb2ab453a6fd8246a1623e076d111674226e7e3a6171c9ebda389f5d291034f" Oct 07 08:30:34 crc kubenswrapper[5025]: I1007 08:30:34.323902 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0eb2ab453a6fd8246a1623e076d111674226e7e3a6171c9ebda389f5d291034f"} err="failed to get container status \"0eb2ab453a6fd8246a1623e076d111674226e7e3a6171c9ebda389f5d291034f\": rpc 
error: code = NotFound desc = could not find container \"0eb2ab453a6fd8246a1623e076d111674226e7e3a6171c9ebda389f5d291034f\": container with ID starting with 0eb2ab453a6fd8246a1623e076d111674226e7e3a6171c9ebda389f5d291034f not found: ID does not exist" Oct 07 08:30:34 crc kubenswrapper[5025]: I1007 08:30:34.323929 5025 scope.go:117] "RemoveContainer" containerID="1bfea52255576555ae9b3861c33addc4bb2ea134f59cc67b4229b93c4f4a6446" Oct 07 08:30:34 crc kubenswrapper[5025]: E1007 08:30:34.324238 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bfea52255576555ae9b3861c33addc4bb2ea134f59cc67b4229b93c4f4a6446\": container with ID starting with 1bfea52255576555ae9b3861c33addc4bb2ea134f59cc67b4229b93c4f4a6446 not found: ID does not exist" containerID="1bfea52255576555ae9b3861c33addc4bb2ea134f59cc67b4229b93c4f4a6446" Oct 07 08:30:34 crc kubenswrapper[5025]: I1007 08:30:34.324266 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bfea52255576555ae9b3861c33addc4bb2ea134f59cc67b4229b93c4f4a6446"} err="failed to get container status \"1bfea52255576555ae9b3861c33addc4bb2ea134f59cc67b4229b93c4f4a6446\": rpc error: code = NotFound desc = could not find container \"1bfea52255576555ae9b3861c33addc4bb2ea134f59cc67b4229b93c4f4a6446\": container with ID starting with 1bfea52255576555ae9b3861c33addc4bb2ea134f59cc67b4229b93c4f4a6446 not found: ID does not exist" Oct 07 08:30:34 crc kubenswrapper[5025]: I1007 08:30:34.996452 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-6z6lt" Oct 07 08:30:35 crc kubenswrapper[5025]: I1007 08:30:35.045571 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-6z6lt" Oct 07 08:30:35 crc kubenswrapper[5025]: I1007 08:30:35.923302 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="ccb408bb-4a59-4fc2-8060-e535c53bcc70" path="/var/lib/kubelet/pods/ccb408bb-4a59-4fc2-8060-e535c53bcc70/volumes" Oct 07 08:30:39 crc kubenswrapper[5025]: I1007 08:30:39.272917 5025 generic.go:334] "Generic (PLEG): container finished" podID="fd984570-be38-4ee8-b94d-be13506a255c" containerID="e87816898d6f55bac654c9b92a685c36a6e2c16a02eeefcda4b059253fd7cf6c" exitCode=0 Oct 07 08:30:39 crc kubenswrapper[5025]: I1007 08:30:39.272977 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69jgvp7" event={"ID":"fd984570-be38-4ee8-b94d-be13506a255c","Type":"ContainerDied","Data":"e87816898d6f55bac654c9b92a685c36a6e2c16a02eeefcda4b059253fd7cf6c"} Oct 07 08:30:39 crc kubenswrapper[5025]: I1007 08:30:39.561238 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-68d546b9d8-7h8g2" Oct 07 08:30:39 crc kubenswrapper[5025]: I1007 08:30:39.979113 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-48bhp" Oct 07 08:30:40 crc kubenswrapper[5025]: I1007 08:30:40.280634 5025 generic.go:334] "Generic (PLEG): container finished" podID="fd984570-be38-4ee8-b94d-be13506a255c" containerID="a81d3c452cd08032c8376df2f1cd58f90f29032ddb506c8300283ff773fe87e6" exitCode=0 Oct 07 08:30:40 crc kubenswrapper[5025]: I1007 08:30:40.280723 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69jgvp7" event={"ID":"fd984570-be38-4ee8-b94d-be13506a255c","Type":"ContainerDied","Data":"a81d3c452cd08032c8376df2f1cd58f90f29032ddb506c8300283ff773fe87e6"} Oct 07 08:30:41 crc kubenswrapper[5025]: I1007 08:30:41.565048 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69jgvp7" Oct 07 08:30:41 crc kubenswrapper[5025]: I1007 08:30:41.731324 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fd984570-be38-4ee8-b94d-be13506a255c-bundle\") pod \"fd984570-be38-4ee8-b94d-be13506a255c\" (UID: \"fd984570-be38-4ee8-b94d-be13506a255c\") " Oct 07 08:30:41 crc kubenswrapper[5025]: I1007 08:30:41.731430 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hppdv\" (UniqueName: \"kubernetes.io/projected/fd984570-be38-4ee8-b94d-be13506a255c-kube-api-access-hppdv\") pod \"fd984570-be38-4ee8-b94d-be13506a255c\" (UID: \"fd984570-be38-4ee8-b94d-be13506a255c\") " Oct 07 08:30:41 crc kubenswrapper[5025]: I1007 08:30:41.731558 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fd984570-be38-4ee8-b94d-be13506a255c-util\") pod \"fd984570-be38-4ee8-b94d-be13506a255c\" (UID: \"fd984570-be38-4ee8-b94d-be13506a255c\") " Oct 07 08:30:41 crc kubenswrapper[5025]: I1007 08:30:41.732973 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd984570-be38-4ee8-b94d-be13506a255c-bundle" (OuterVolumeSpecName: "bundle") pod "fd984570-be38-4ee8-b94d-be13506a255c" (UID: "fd984570-be38-4ee8-b94d-be13506a255c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:30:41 crc kubenswrapper[5025]: I1007 08:30:41.741136 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd984570-be38-4ee8-b94d-be13506a255c-kube-api-access-hppdv" (OuterVolumeSpecName: "kube-api-access-hppdv") pod "fd984570-be38-4ee8-b94d-be13506a255c" (UID: "fd984570-be38-4ee8-b94d-be13506a255c"). InnerVolumeSpecName "kube-api-access-hppdv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:30:41 crc kubenswrapper[5025]: I1007 08:30:41.747654 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd984570-be38-4ee8-b94d-be13506a255c-util" (OuterVolumeSpecName: "util") pod "fd984570-be38-4ee8-b94d-be13506a255c" (UID: "fd984570-be38-4ee8-b94d-be13506a255c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:30:41 crc kubenswrapper[5025]: I1007 08:30:41.833118 5025 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fd984570-be38-4ee8-b94d-be13506a255c-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:30:41 crc kubenswrapper[5025]: I1007 08:30:41.833168 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hppdv\" (UniqueName: \"kubernetes.io/projected/fd984570-be38-4ee8-b94d-be13506a255c-kube-api-access-hppdv\") on node \"crc\" DevicePath \"\"" Oct 07 08:30:41 crc kubenswrapper[5025]: I1007 08:30:41.833184 5025 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fd984570-be38-4ee8-b94d-be13506a255c-util\") on node \"crc\" DevicePath \"\"" Oct 07 08:30:42 crc kubenswrapper[5025]: I1007 08:30:42.295134 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69jgvp7" event={"ID":"fd984570-be38-4ee8-b94d-be13506a255c","Type":"ContainerDied","Data":"1bf1465d8220cb0c1d83918302be9f45a94e16db6ac313710141bf62f8db7508"} Oct 07 08:30:42 crc kubenswrapper[5025]: I1007 08:30:42.295173 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1bf1465d8220cb0c1d83918302be9f45a94e16db6ac313710141bf62f8db7508" Oct 07 08:30:42 crc kubenswrapper[5025]: I1007 08:30:42.295206 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69jgvp7" Oct 07 08:30:44 crc kubenswrapper[5025]: I1007 08:30:44.829401 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-pmkhj"] Oct 07 08:30:44 crc kubenswrapper[5025]: E1007 08:30:44.830346 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd984570-be38-4ee8-b94d-be13506a255c" containerName="pull" Oct 07 08:30:44 crc kubenswrapper[5025]: I1007 08:30:44.830377 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd984570-be38-4ee8-b94d-be13506a255c" containerName="pull" Oct 07 08:30:44 crc kubenswrapper[5025]: E1007 08:30:44.830416 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd984570-be38-4ee8-b94d-be13506a255c" containerName="extract" Oct 07 08:30:44 crc kubenswrapper[5025]: I1007 08:30:44.830434 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd984570-be38-4ee8-b94d-be13506a255c" containerName="extract" Oct 07 08:30:44 crc kubenswrapper[5025]: E1007 08:30:44.830460 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccb408bb-4a59-4fc2-8060-e535c53bcc70" containerName="extract-content" Oct 07 08:30:44 crc kubenswrapper[5025]: I1007 08:30:44.830480 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccb408bb-4a59-4fc2-8060-e535c53bcc70" containerName="extract-content" Oct 07 08:30:44 crc kubenswrapper[5025]: E1007 08:30:44.830503 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccb408bb-4a59-4fc2-8060-e535c53bcc70" containerName="registry-server" Oct 07 08:30:44 crc kubenswrapper[5025]: I1007 08:30:44.830519 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccb408bb-4a59-4fc2-8060-e535c53bcc70" containerName="registry-server" Oct 07 08:30:44 crc kubenswrapper[5025]: E1007 08:30:44.830582 5025 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ccb408bb-4a59-4fc2-8060-e535c53bcc70" containerName="extract-utilities" Oct 07 08:30:44 crc kubenswrapper[5025]: I1007 08:30:44.830599 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccb408bb-4a59-4fc2-8060-e535c53bcc70" containerName="extract-utilities" Oct 07 08:30:44 crc kubenswrapper[5025]: E1007 08:30:44.830638 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd984570-be38-4ee8-b94d-be13506a255c" containerName="util" Oct 07 08:30:44 crc kubenswrapper[5025]: I1007 08:30:44.830654 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd984570-be38-4ee8-b94d-be13506a255c" containerName="util" Oct 07 08:30:44 crc kubenswrapper[5025]: I1007 08:30:44.830926 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccb408bb-4a59-4fc2-8060-e535c53bcc70" containerName="registry-server" Oct 07 08:30:44 crc kubenswrapper[5025]: I1007 08:30:44.830979 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd984570-be38-4ee8-b94d-be13506a255c" containerName="extract" Oct 07 08:30:44 crc kubenswrapper[5025]: I1007 08:30:44.831952 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-pmkhj" Oct 07 08:30:44 crc kubenswrapper[5025]: I1007 08:30:44.839264 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Oct 07 08:30:44 crc kubenswrapper[5025]: I1007 08:30:44.839648 5025 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-h4bx7" Oct 07 08:30:44 crc kubenswrapper[5025]: I1007 08:30:44.839878 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Oct 07 08:30:44 crc kubenswrapper[5025]: I1007 08:30:44.852535 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-pmkhj"] Oct 07 08:30:44 crc kubenswrapper[5025]: I1007 08:30:44.980818 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgncx\" (UniqueName: \"kubernetes.io/projected/e2911c52-ba3c-4b4a-9603-6fd0d012781b-kube-api-access-dgncx\") pod \"cert-manager-operator-controller-manager-57cd46d6d-pmkhj\" (UID: \"e2911c52-ba3c-4b4a-9603-6fd0d012781b\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-pmkhj" Oct 07 08:30:45 crc kubenswrapper[5025]: I1007 08:30:45.082343 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgncx\" (UniqueName: \"kubernetes.io/projected/e2911c52-ba3c-4b4a-9603-6fd0d012781b-kube-api-access-dgncx\") pod \"cert-manager-operator-controller-manager-57cd46d6d-pmkhj\" (UID: \"e2911c52-ba3c-4b4a-9603-6fd0d012781b\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-pmkhj" Oct 07 08:30:45 crc kubenswrapper[5025]: I1007 08:30:45.103288 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-dgncx\" (UniqueName: \"kubernetes.io/projected/e2911c52-ba3c-4b4a-9603-6fd0d012781b-kube-api-access-dgncx\") pod \"cert-manager-operator-controller-manager-57cd46d6d-pmkhj\" (UID: \"e2911c52-ba3c-4b4a-9603-6fd0d012781b\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-pmkhj" Oct 07 08:30:45 crc kubenswrapper[5025]: I1007 08:30:45.183364 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-pmkhj" Oct 07 08:30:45 crc kubenswrapper[5025]: I1007 08:30:45.720281 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-pmkhj"] Oct 07 08:30:46 crc kubenswrapper[5025]: I1007 08:30:46.320458 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-pmkhj" event={"ID":"e2911c52-ba3c-4b4a-9603-6fd0d012781b","Type":"ContainerStarted","Data":"ab278e8df60be68bf8159ae14ba0f80fe7de669bf61a1e1bf97a461ba5e9932c"} Oct 07 08:30:49 crc kubenswrapper[5025]: I1007 08:30:49.999665 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-6z6lt" Oct 07 08:30:53 crc kubenswrapper[5025]: I1007 08:30:53.365877 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-pmkhj" event={"ID":"e2911c52-ba3c-4b4a-9603-6fd0d012781b","Type":"ContainerStarted","Data":"b16e1ec617311adee493bfa56329463140bd28932dc9f9b35b20ea9f8d401cc3"} Oct 07 08:30:53 crc kubenswrapper[5025]: I1007 08:30:53.391469 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-pmkhj" podStartSLOduration=2.046875306 podStartE2EDuration="9.391451104s" podCreationTimestamp="2025-10-07 08:30:44 +0000 UTC" 
firstStartedPulling="2025-10-07 08:30:45.733007449 +0000 UTC m=+852.542321593" lastFinishedPulling="2025-10-07 08:30:53.077583257 +0000 UTC m=+859.886897391" observedRunningTime="2025-10-07 08:30:53.385717744 +0000 UTC m=+860.195031888" watchObservedRunningTime="2025-10-07 08:30:53.391451104 +0000 UTC m=+860.200765258" Oct 07 08:30:56 crc kubenswrapper[5025]: I1007 08:30:56.080443 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-rbg4h"] Oct 07 08:30:56 crc kubenswrapper[5025]: I1007 08:30:56.081458 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-d969966f-rbg4h" Oct 07 08:30:56 crc kubenswrapper[5025]: I1007 08:30:56.083363 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 07 08:30:56 crc kubenswrapper[5025]: I1007 08:30:56.083924 5025 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-pf846" Oct 07 08:30:56 crc kubenswrapper[5025]: I1007 08:30:56.083998 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 07 08:30:56 crc kubenswrapper[5025]: I1007 08:30:56.089002 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-rbg4h"] Oct 07 08:30:56 crc kubenswrapper[5025]: I1007 08:30:56.228683 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpmqh\" (UniqueName: \"kubernetes.io/projected/56c29b08-1f19-4338-a43e-fc49b860b93b-kube-api-access-bpmqh\") pod \"cert-manager-webhook-d969966f-rbg4h\" (UID: \"56c29b08-1f19-4338-a43e-fc49b860b93b\") " pod="cert-manager/cert-manager-webhook-d969966f-rbg4h" Oct 07 08:30:56 crc kubenswrapper[5025]: I1007 08:30:56.228756 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/56c29b08-1f19-4338-a43e-fc49b860b93b-bound-sa-token\") pod \"cert-manager-webhook-d969966f-rbg4h\" (UID: \"56c29b08-1f19-4338-a43e-fc49b860b93b\") " pod="cert-manager/cert-manager-webhook-d969966f-rbg4h" Oct 07 08:30:56 crc kubenswrapper[5025]: I1007 08:30:56.329505 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/56c29b08-1f19-4338-a43e-fc49b860b93b-bound-sa-token\") pod \"cert-manager-webhook-d969966f-rbg4h\" (UID: \"56c29b08-1f19-4338-a43e-fc49b860b93b\") " pod="cert-manager/cert-manager-webhook-d969966f-rbg4h" Oct 07 08:30:56 crc kubenswrapper[5025]: I1007 08:30:56.329652 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpmqh\" (UniqueName: \"kubernetes.io/projected/56c29b08-1f19-4338-a43e-fc49b860b93b-kube-api-access-bpmqh\") pod \"cert-manager-webhook-d969966f-rbg4h\" (UID: \"56c29b08-1f19-4338-a43e-fc49b860b93b\") " pod="cert-manager/cert-manager-webhook-d969966f-rbg4h" Oct 07 08:30:56 crc kubenswrapper[5025]: I1007 08:30:56.349555 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpmqh\" (UniqueName: \"kubernetes.io/projected/56c29b08-1f19-4338-a43e-fc49b860b93b-kube-api-access-bpmqh\") pod \"cert-manager-webhook-d969966f-rbg4h\" (UID: \"56c29b08-1f19-4338-a43e-fc49b860b93b\") " pod="cert-manager/cert-manager-webhook-d969966f-rbg4h" Oct 07 08:30:56 crc kubenswrapper[5025]: I1007 08:30:56.353131 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/56c29b08-1f19-4338-a43e-fc49b860b93b-bound-sa-token\") pod \"cert-manager-webhook-d969966f-rbg4h\" (UID: \"56c29b08-1f19-4338-a43e-fc49b860b93b\") " pod="cert-manager/cert-manager-webhook-d969966f-rbg4h" Oct 07 08:30:56 crc kubenswrapper[5025]: I1007 08:30:56.398437 5025 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-d969966f-rbg4h" Oct 07 08:30:56 crc kubenswrapper[5025]: I1007 08:30:56.854147 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-rbg4h"] Oct 07 08:30:57 crc kubenswrapper[5025]: I1007 08:30:57.388885 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-d969966f-rbg4h" event={"ID":"56c29b08-1f19-4338-a43e-fc49b860b93b","Type":"ContainerStarted","Data":"64c9b70e81a686c049df5d202a25fe0d6465022b6cd0abcc1868775533ac98d5"} Oct 07 08:30:59 crc kubenswrapper[5025]: I1007 08:30:59.768662 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-nxrz7"] Oct 07 08:30:59 crc kubenswrapper[5025]: I1007 08:30:59.770817 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-nxrz7" Oct 07 08:30:59 crc kubenswrapper[5025]: I1007 08:30:59.774141 5025 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-wdrmg" Oct 07 08:30:59 crc kubenswrapper[5025]: I1007 08:30:59.782435 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-nxrz7"] Oct 07 08:30:59 crc kubenswrapper[5025]: I1007 08:30:59.786984 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjh7k\" (UniqueName: \"kubernetes.io/projected/dd36c502-6bcb-4bc0-9761-b0cd2f2f6aa2-kube-api-access-tjh7k\") pod \"cert-manager-cainjector-7d9f95dbf-nxrz7\" (UID: \"dd36c502-6bcb-4bc0-9761-b0cd2f2f6aa2\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-nxrz7" Oct 07 08:30:59 crc kubenswrapper[5025]: I1007 08:30:59.787097 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/dd36c502-6bcb-4bc0-9761-b0cd2f2f6aa2-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-nxrz7\" (UID: \"dd36c502-6bcb-4bc0-9761-b0cd2f2f6aa2\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-nxrz7" Oct 07 08:30:59 crc kubenswrapper[5025]: I1007 08:30:59.888235 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjh7k\" (UniqueName: \"kubernetes.io/projected/dd36c502-6bcb-4bc0-9761-b0cd2f2f6aa2-kube-api-access-tjh7k\") pod \"cert-manager-cainjector-7d9f95dbf-nxrz7\" (UID: \"dd36c502-6bcb-4bc0-9761-b0cd2f2f6aa2\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-nxrz7" Oct 07 08:30:59 crc kubenswrapper[5025]: I1007 08:30:59.888348 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dd36c502-6bcb-4bc0-9761-b0cd2f2f6aa2-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-nxrz7\" (UID: \"dd36c502-6bcb-4bc0-9761-b0cd2f2f6aa2\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-nxrz7" Oct 07 08:30:59 crc kubenswrapper[5025]: I1007 08:30:59.911093 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjh7k\" (UniqueName: \"kubernetes.io/projected/dd36c502-6bcb-4bc0-9761-b0cd2f2f6aa2-kube-api-access-tjh7k\") pod \"cert-manager-cainjector-7d9f95dbf-nxrz7\" (UID: \"dd36c502-6bcb-4bc0-9761-b0cd2f2f6aa2\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-nxrz7" Oct 07 08:30:59 crc kubenswrapper[5025]: I1007 08:30:59.912682 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dd36c502-6bcb-4bc0-9761-b0cd2f2f6aa2-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-nxrz7\" (UID: \"dd36c502-6bcb-4bc0-9761-b0cd2f2f6aa2\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-nxrz7" Oct 07 08:31:00 crc kubenswrapper[5025]: I1007 08:31:00.129289 5025 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-nxrz7" Oct 07 08:31:01 crc kubenswrapper[5025]: I1007 08:31:01.339268 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-nxrz7"] Oct 07 08:31:01 crc kubenswrapper[5025]: W1007 08:31:01.345054 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd36c502_6bcb_4bc0_9761_b0cd2f2f6aa2.slice/crio-e822211e2064c56aa3c2d8cf94b41c70092494f4e926ff8752e5b64c0550b891 WatchSource:0}: Error finding container e822211e2064c56aa3c2d8cf94b41c70092494f4e926ff8752e5b64c0550b891: Status 404 returned error can't find the container with id e822211e2064c56aa3c2d8cf94b41c70092494f4e926ff8752e5b64c0550b891 Oct 07 08:31:01 crc kubenswrapper[5025]: I1007 08:31:01.426094 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-nxrz7" event={"ID":"dd36c502-6bcb-4bc0-9761-b0cd2f2f6aa2","Type":"ContainerStarted","Data":"e822211e2064c56aa3c2d8cf94b41c70092494f4e926ff8752e5b64c0550b891"} Oct 07 08:31:01 crc kubenswrapper[5025]: I1007 08:31:01.427562 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-d969966f-rbg4h" event={"ID":"56c29b08-1f19-4338-a43e-fc49b860b93b","Type":"ContainerStarted","Data":"a8a8cb162ea1413f94730e0134169b2c9ceab64e7fee7d0ed09185c9d9e37caf"} Oct 07 08:31:01 crc kubenswrapper[5025]: I1007 08:31:01.427690 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-d969966f-rbg4h" Oct 07 08:31:01 crc kubenswrapper[5025]: I1007 08:31:01.448262 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-d969966f-rbg4h" podStartSLOduration=1.3247258149999999 podStartE2EDuration="5.44824275s" podCreationTimestamp="2025-10-07 08:30:56 +0000 UTC" 
firstStartedPulling="2025-10-07 08:30:56.865692608 +0000 UTC m=+863.675006762" lastFinishedPulling="2025-10-07 08:31:00.989209553 +0000 UTC m=+867.798523697" observedRunningTime="2025-10-07 08:31:01.442866051 +0000 UTC m=+868.252180215" watchObservedRunningTime="2025-10-07 08:31:01.44824275 +0000 UTC m=+868.257556894" Oct 07 08:31:02 crc kubenswrapper[5025]: I1007 08:31:02.437207 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-nxrz7" event={"ID":"dd36c502-6bcb-4bc0-9761-b0cd2f2f6aa2","Type":"ContainerStarted","Data":"59b4ae912003955855b9baefd5f8d3b2f16536f18cf8a5014a377e4867d424b6"} Oct 07 08:31:02 crc kubenswrapper[5025]: I1007 08:31:02.453466 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-nxrz7" podStartSLOduration=3.45344649 podStartE2EDuration="3.45344649s" podCreationTimestamp="2025-10-07 08:30:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:31:02.452251603 +0000 UTC m=+869.261565747" watchObservedRunningTime="2025-10-07 08:31:02.45344649 +0000 UTC m=+869.262760634" Oct 07 08:31:06 crc kubenswrapper[5025]: I1007 08:31:06.403097 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-d969966f-rbg4h" Oct 07 08:31:06 crc kubenswrapper[5025]: I1007 08:31:06.723163 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-2rl67"] Oct 07 08:31:06 crc kubenswrapper[5025]: I1007 08:31:06.723964 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-7d4cc89fcb-2rl67" Oct 07 08:31:06 crc kubenswrapper[5025]: I1007 08:31:06.727405 5025 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-h7c47" Oct 07 08:31:06 crc kubenswrapper[5025]: I1007 08:31:06.745654 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-2rl67"] Oct 07 08:31:06 crc kubenswrapper[5025]: I1007 08:31:06.787920 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9bcb693f-b7c9-4e81-be45-ba0196498e60-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-2rl67\" (UID: \"9bcb693f-b7c9-4e81-be45-ba0196498e60\") " pod="cert-manager/cert-manager-7d4cc89fcb-2rl67" Oct 07 08:31:06 crc kubenswrapper[5025]: I1007 08:31:06.788028 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wlls\" (UniqueName: \"kubernetes.io/projected/9bcb693f-b7c9-4e81-be45-ba0196498e60-kube-api-access-2wlls\") pod \"cert-manager-7d4cc89fcb-2rl67\" (UID: \"9bcb693f-b7c9-4e81-be45-ba0196498e60\") " pod="cert-manager/cert-manager-7d4cc89fcb-2rl67" Oct 07 08:31:06 crc kubenswrapper[5025]: I1007 08:31:06.889109 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9bcb693f-b7c9-4e81-be45-ba0196498e60-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-2rl67\" (UID: \"9bcb693f-b7c9-4e81-be45-ba0196498e60\") " pod="cert-manager/cert-manager-7d4cc89fcb-2rl67" Oct 07 08:31:06 crc kubenswrapper[5025]: I1007 08:31:06.889187 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wlls\" (UniqueName: \"kubernetes.io/projected/9bcb693f-b7c9-4e81-be45-ba0196498e60-kube-api-access-2wlls\") pod \"cert-manager-7d4cc89fcb-2rl67\" (UID: 
\"9bcb693f-b7c9-4e81-be45-ba0196498e60\") " pod="cert-manager/cert-manager-7d4cc89fcb-2rl67" Oct 07 08:31:06 crc kubenswrapper[5025]: I1007 08:31:06.913694 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wlls\" (UniqueName: \"kubernetes.io/projected/9bcb693f-b7c9-4e81-be45-ba0196498e60-kube-api-access-2wlls\") pod \"cert-manager-7d4cc89fcb-2rl67\" (UID: \"9bcb693f-b7c9-4e81-be45-ba0196498e60\") " pod="cert-manager/cert-manager-7d4cc89fcb-2rl67" Oct 07 08:31:06 crc kubenswrapper[5025]: I1007 08:31:06.915293 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9bcb693f-b7c9-4e81-be45-ba0196498e60-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-2rl67\" (UID: \"9bcb693f-b7c9-4e81-be45-ba0196498e60\") " pod="cert-manager/cert-manager-7d4cc89fcb-2rl67" Oct 07 08:31:07 crc kubenswrapper[5025]: I1007 08:31:07.043197 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-7d4cc89fcb-2rl67" Oct 07 08:31:07 crc kubenswrapper[5025]: I1007 08:31:07.477328 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-2rl67"] Oct 07 08:31:08 crc kubenswrapper[5025]: I1007 08:31:08.486098 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-7d4cc89fcb-2rl67" event={"ID":"9bcb693f-b7c9-4e81-be45-ba0196498e60","Type":"ContainerStarted","Data":"1e8be1d5206318dca4a7297189489a587f632971e6d11cf6e9a4eb04bbf22ea1"} Oct 07 08:31:08 crc kubenswrapper[5025]: I1007 08:31:08.486409 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-7d4cc89fcb-2rl67" event={"ID":"9bcb693f-b7c9-4e81-be45-ba0196498e60","Type":"ContainerStarted","Data":"048359361f37849510c3ab976ad883d73d7c7acf0959d68318d6cd11a152e527"} Oct 07 08:31:08 crc kubenswrapper[5025]: I1007 08:31:08.507762 5025 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="cert-manager/cert-manager-7d4cc89fcb-2rl67" podStartSLOduration=2.507734234 podStartE2EDuration="2.507734234s" podCreationTimestamp="2025-10-07 08:31:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:31:08.501177318 +0000 UTC m=+875.310491472" watchObservedRunningTime="2025-10-07 08:31:08.507734234 +0000 UTC m=+875.317048388" Oct 07 08:31:20 crc kubenswrapper[5025]: I1007 08:31:20.016878 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-pwqhb"] Oct 07 08:31:20 crc kubenswrapper[5025]: I1007 08:31:20.022238 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-pwqhb" Oct 07 08:31:20 crc kubenswrapper[5025]: I1007 08:31:20.034167 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 07 08:31:20 crc kubenswrapper[5025]: I1007 08:31:20.034657 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-8kn4r" Oct 07 08:31:20 crc kubenswrapper[5025]: I1007 08:31:20.034936 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 07 08:31:20 crc kubenswrapper[5025]: I1007 08:31:20.073585 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-pwqhb"] Oct 07 08:31:20 crc kubenswrapper[5025]: I1007 08:31:20.176500 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tljnn\" (UniqueName: \"kubernetes.io/projected/62a29229-3a4e-4e1e-8ecd-13535f8154e2-kube-api-access-tljnn\") pod \"openstack-operator-index-pwqhb\" (UID: \"62a29229-3a4e-4e1e-8ecd-13535f8154e2\") " pod="openstack-operators/openstack-operator-index-pwqhb" Oct 07 08:31:20 crc 
kubenswrapper[5025]: I1007 08:31:20.278595 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tljnn\" (UniqueName: \"kubernetes.io/projected/62a29229-3a4e-4e1e-8ecd-13535f8154e2-kube-api-access-tljnn\") pod \"openstack-operator-index-pwqhb\" (UID: \"62a29229-3a4e-4e1e-8ecd-13535f8154e2\") " pod="openstack-operators/openstack-operator-index-pwqhb" Oct 07 08:31:20 crc kubenswrapper[5025]: I1007 08:31:20.297821 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tljnn\" (UniqueName: \"kubernetes.io/projected/62a29229-3a4e-4e1e-8ecd-13535f8154e2-kube-api-access-tljnn\") pod \"openstack-operator-index-pwqhb\" (UID: \"62a29229-3a4e-4e1e-8ecd-13535f8154e2\") " pod="openstack-operators/openstack-operator-index-pwqhb" Oct 07 08:31:20 crc kubenswrapper[5025]: I1007 08:31:20.356003 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-pwqhb" Oct 07 08:31:20 crc kubenswrapper[5025]: I1007 08:31:20.782080 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-pwqhb"] Oct 07 08:31:20 crc kubenswrapper[5025]: W1007 08:31:20.799761 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62a29229_3a4e_4e1e_8ecd_13535f8154e2.slice/crio-615cf48635a496cbcb209296ffaf28444415f914f1467ed9f8ba43b5c88b0d75 WatchSource:0}: Error finding container 615cf48635a496cbcb209296ffaf28444415f914f1467ed9f8ba43b5c88b0d75: Status 404 returned error can't find the container with id 615cf48635a496cbcb209296ffaf28444415f914f1467ed9f8ba43b5c88b0d75 Oct 07 08:31:21 crc kubenswrapper[5025]: I1007 08:31:21.578660 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-pwqhb" 
event={"ID":"62a29229-3a4e-4e1e-8ecd-13535f8154e2","Type":"ContainerStarted","Data":"615cf48635a496cbcb209296ffaf28444415f914f1467ed9f8ba43b5c88b0d75"} Oct 07 08:31:23 crc kubenswrapper[5025]: I1007 08:31:23.189406 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-pwqhb"] Oct 07 08:31:23 crc kubenswrapper[5025]: I1007 08:31:23.593703 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-pwqhb" event={"ID":"62a29229-3a4e-4e1e-8ecd-13535f8154e2","Type":"ContainerStarted","Data":"01502eb4ef1684f151dab4a03760c3cf2d4b7b375df1c9ed7a247a75345fd0f8"} Oct 07 08:31:23 crc kubenswrapper[5025]: I1007 08:31:23.614746 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-pwqhb" podStartSLOduration=2.726094236 podStartE2EDuration="4.614722881s" podCreationTimestamp="2025-10-07 08:31:19 +0000 UTC" firstStartedPulling="2025-10-07 08:31:20.802122647 +0000 UTC m=+887.611436811" lastFinishedPulling="2025-10-07 08:31:22.690751312 +0000 UTC m=+889.500065456" observedRunningTime="2025-10-07 08:31:23.614484053 +0000 UTC m=+890.423798237" watchObservedRunningTime="2025-10-07 08:31:23.614722881 +0000 UTC m=+890.424037065" Oct 07 08:31:23 crc kubenswrapper[5025]: I1007 08:31:23.794619 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-4p5nd"] Oct 07 08:31:23 crc kubenswrapper[5025]: I1007 08:31:23.796240 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-4p5nd" Oct 07 08:31:23 crc kubenswrapper[5025]: I1007 08:31:23.804377 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-4p5nd"] Oct 07 08:31:23 crc kubenswrapper[5025]: I1007 08:31:23.928990 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dmrj\" (UniqueName: \"kubernetes.io/projected/882d12c5-6ce7-4f45-8703-48feed573896-kube-api-access-9dmrj\") pod \"openstack-operator-index-4p5nd\" (UID: \"882d12c5-6ce7-4f45-8703-48feed573896\") " pod="openstack-operators/openstack-operator-index-4p5nd" Oct 07 08:31:24 crc kubenswrapper[5025]: I1007 08:31:24.030335 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dmrj\" (UniqueName: \"kubernetes.io/projected/882d12c5-6ce7-4f45-8703-48feed573896-kube-api-access-9dmrj\") pod \"openstack-operator-index-4p5nd\" (UID: \"882d12c5-6ce7-4f45-8703-48feed573896\") " pod="openstack-operators/openstack-operator-index-4p5nd" Oct 07 08:31:24 crc kubenswrapper[5025]: I1007 08:31:24.067445 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dmrj\" (UniqueName: \"kubernetes.io/projected/882d12c5-6ce7-4f45-8703-48feed573896-kube-api-access-9dmrj\") pod \"openstack-operator-index-4p5nd\" (UID: \"882d12c5-6ce7-4f45-8703-48feed573896\") " pod="openstack-operators/openstack-operator-index-4p5nd" Oct 07 08:31:24 crc kubenswrapper[5025]: I1007 08:31:24.129405 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-4p5nd" Oct 07 08:31:24 crc kubenswrapper[5025]: I1007 08:31:24.606157 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-4p5nd"] Oct 07 08:31:24 crc kubenswrapper[5025]: I1007 08:31:24.608048 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-pwqhb" podUID="62a29229-3a4e-4e1e-8ecd-13535f8154e2" containerName="registry-server" containerID="cri-o://01502eb4ef1684f151dab4a03760c3cf2d4b7b375df1c9ed7a247a75345fd0f8" gracePeriod=2 Oct 07 08:31:24 crc kubenswrapper[5025]: W1007 08:31:24.612420 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod882d12c5_6ce7_4f45_8703_48feed573896.slice/crio-28bf7ac3ad39c3586bb62d3d2dcb27a1eb1cafdd7b4247e9a666bdab63ea0d85 WatchSource:0}: Error finding container 28bf7ac3ad39c3586bb62d3d2dcb27a1eb1cafdd7b4247e9a666bdab63ea0d85: Status 404 returned error can't find the container with id 28bf7ac3ad39c3586bb62d3d2dcb27a1eb1cafdd7b4247e9a666bdab63ea0d85 Oct 07 08:31:24 crc kubenswrapper[5025]: I1007 08:31:24.956661 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-pwqhb" Oct 07 08:31:25 crc kubenswrapper[5025]: I1007 08:31:25.146695 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tljnn\" (UniqueName: \"kubernetes.io/projected/62a29229-3a4e-4e1e-8ecd-13535f8154e2-kube-api-access-tljnn\") pod \"62a29229-3a4e-4e1e-8ecd-13535f8154e2\" (UID: \"62a29229-3a4e-4e1e-8ecd-13535f8154e2\") " Oct 07 08:31:25 crc kubenswrapper[5025]: I1007 08:31:25.154327 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62a29229-3a4e-4e1e-8ecd-13535f8154e2-kube-api-access-tljnn" (OuterVolumeSpecName: "kube-api-access-tljnn") pod "62a29229-3a4e-4e1e-8ecd-13535f8154e2" (UID: "62a29229-3a4e-4e1e-8ecd-13535f8154e2"). InnerVolumeSpecName "kube-api-access-tljnn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:31:25 crc kubenswrapper[5025]: I1007 08:31:25.249257 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tljnn\" (UniqueName: \"kubernetes.io/projected/62a29229-3a4e-4e1e-8ecd-13535f8154e2-kube-api-access-tljnn\") on node \"crc\" DevicePath \"\"" Oct 07 08:31:25 crc kubenswrapper[5025]: I1007 08:31:25.618198 5025 generic.go:334] "Generic (PLEG): container finished" podID="62a29229-3a4e-4e1e-8ecd-13535f8154e2" containerID="01502eb4ef1684f151dab4a03760c3cf2d4b7b375df1c9ed7a247a75345fd0f8" exitCode=0 Oct 07 08:31:25 crc kubenswrapper[5025]: I1007 08:31:25.618272 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-pwqhb" Oct 07 08:31:25 crc kubenswrapper[5025]: I1007 08:31:25.618265 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-pwqhb" event={"ID":"62a29229-3a4e-4e1e-8ecd-13535f8154e2","Type":"ContainerDied","Data":"01502eb4ef1684f151dab4a03760c3cf2d4b7b375df1c9ed7a247a75345fd0f8"} Oct 07 08:31:25 crc kubenswrapper[5025]: I1007 08:31:25.618444 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-pwqhb" event={"ID":"62a29229-3a4e-4e1e-8ecd-13535f8154e2","Type":"ContainerDied","Data":"615cf48635a496cbcb209296ffaf28444415f914f1467ed9f8ba43b5c88b0d75"} Oct 07 08:31:25 crc kubenswrapper[5025]: I1007 08:31:25.618469 5025 scope.go:117] "RemoveContainer" containerID="01502eb4ef1684f151dab4a03760c3cf2d4b7b375df1c9ed7a247a75345fd0f8" Oct 07 08:31:25 crc kubenswrapper[5025]: I1007 08:31:25.624047 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-4p5nd" event={"ID":"882d12c5-6ce7-4f45-8703-48feed573896","Type":"ContainerStarted","Data":"7816c758c5e5c0b9d77cc9c0a0cde6aa00c8dcba143955f58c40c31fa204641a"} Oct 07 08:31:25 crc kubenswrapper[5025]: I1007 08:31:25.624111 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-4p5nd" event={"ID":"882d12c5-6ce7-4f45-8703-48feed573896","Type":"ContainerStarted","Data":"28bf7ac3ad39c3586bb62d3d2dcb27a1eb1cafdd7b4247e9a666bdab63ea0d85"} Oct 07 08:31:25 crc kubenswrapper[5025]: I1007 08:31:25.653874 5025 scope.go:117] "RemoveContainer" containerID="01502eb4ef1684f151dab4a03760c3cf2d4b7b375df1c9ed7a247a75345fd0f8" Oct 07 08:31:25 crc kubenswrapper[5025]: E1007 08:31:25.658291 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"01502eb4ef1684f151dab4a03760c3cf2d4b7b375df1c9ed7a247a75345fd0f8\": container with ID starting with 01502eb4ef1684f151dab4a03760c3cf2d4b7b375df1c9ed7a247a75345fd0f8 not found: ID does not exist" containerID="01502eb4ef1684f151dab4a03760c3cf2d4b7b375df1c9ed7a247a75345fd0f8" Oct 07 08:31:25 crc kubenswrapper[5025]: I1007 08:31:25.658373 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01502eb4ef1684f151dab4a03760c3cf2d4b7b375df1c9ed7a247a75345fd0f8"} err="failed to get container status \"01502eb4ef1684f151dab4a03760c3cf2d4b7b375df1c9ed7a247a75345fd0f8\": rpc error: code = NotFound desc = could not find container \"01502eb4ef1684f151dab4a03760c3cf2d4b7b375df1c9ed7a247a75345fd0f8\": container with ID starting with 01502eb4ef1684f151dab4a03760c3cf2d4b7b375df1c9ed7a247a75345fd0f8 not found: ID does not exist" Oct 07 08:31:25 crc kubenswrapper[5025]: I1007 08:31:25.660326 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-4p5nd" podStartSLOduration=2.61263594 podStartE2EDuration="2.660300306s" podCreationTimestamp="2025-10-07 08:31:23 +0000 UTC" firstStartedPulling="2025-10-07 08:31:24.617706112 +0000 UTC m=+891.427020256" lastFinishedPulling="2025-10-07 08:31:24.665370478 +0000 UTC m=+891.474684622" observedRunningTime="2025-10-07 08:31:25.649417684 +0000 UTC m=+892.458731858" watchObservedRunningTime="2025-10-07 08:31:25.660300306 +0000 UTC m=+892.469614490" Oct 07 08:31:25 crc kubenswrapper[5025]: I1007 08:31:25.678704 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-pwqhb"] Oct 07 08:31:25 crc kubenswrapper[5025]: I1007 08:31:25.685273 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-pwqhb"] Oct 07 08:31:25 crc kubenswrapper[5025]: I1007 08:31:25.927424 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="62a29229-3a4e-4e1e-8ecd-13535f8154e2" path="/var/lib/kubelet/pods/62a29229-3a4e-4e1e-8ecd-13535f8154e2/volumes" Oct 07 08:31:34 crc kubenswrapper[5025]: I1007 08:31:34.130695 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-4p5nd" Oct 07 08:31:34 crc kubenswrapper[5025]: I1007 08:31:34.131193 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-4p5nd" Oct 07 08:31:34 crc kubenswrapper[5025]: I1007 08:31:34.156313 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-4p5nd" Oct 07 08:31:34 crc kubenswrapper[5025]: I1007 08:31:34.734842 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-4p5nd" Oct 07 08:31:36 crc kubenswrapper[5025]: I1007 08:31:36.855206 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/dcef7349673c2fba947c5845438ccfb6153624e8a5d39e7e1718f94114b6vs7"] Oct 07 08:31:36 crc kubenswrapper[5025]: E1007 08:31:36.856031 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62a29229-3a4e-4e1e-8ecd-13535f8154e2" containerName="registry-server" Oct 07 08:31:36 crc kubenswrapper[5025]: I1007 08:31:36.856057 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="62a29229-3a4e-4e1e-8ecd-13535f8154e2" containerName="registry-server" Oct 07 08:31:36 crc kubenswrapper[5025]: I1007 08:31:36.856274 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="62a29229-3a4e-4e1e-8ecd-13535f8154e2" containerName="registry-server" Oct 07 08:31:36 crc kubenswrapper[5025]: I1007 08:31:36.857975 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/dcef7349673c2fba947c5845438ccfb6153624e8a5d39e7e1718f94114b6vs7" Oct 07 08:31:36 crc kubenswrapper[5025]: I1007 08:31:36.862670 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-lkxgr" Oct 07 08:31:36 crc kubenswrapper[5025]: I1007 08:31:36.870899 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/dcef7349673c2fba947c5845438ccfb6153624e8a5d39e7e1718f94114b6vs7"] Oct 07 08:31:36 crc kubenswrapper[5025]: I1007 08:31:36.913304 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/95c596b1-3c21-4e10-a7ee-c1b6c9220ddf-bundle\") pod \"dcef7349673c2fba947c5845438ccfb6153624e8a5d39e7e1718f94114b6vs7\" (UID: \"95c596b1-3c21-4e10-a7ee-c1b6c9220ddf\") " pod="openstack-operators/dcef7349673c2fba947c5845438ccfb6153624e8a5d39e7e1718f94114b6vs7" Oct 07 08:31:36 crc kubenswrapper[5025]: I1007 08:31:36.913371 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pbtx\" (UniqueName: \"kubernetes.io/projected/95c596b1-3c21-4e10-a7ee-c1b6c9220ddf-kube-api-access-8pbtx\") pod \"dcef7349673c2fba947c5845438ccfb6153624e8a5d39e7e1718f94114b6vs7\" (UID: \"95c596b1-3c21-4e10-a7ee-c1b6c9220ddf\") " pod="openstack-operators/dcef7349673c2fba947c5845438ccfb6153624e8a5d39e7e1718f94114b6vs7" Oct 07 08:31:36 crc kubenswrapper[5025]: I1007 08:31:36.913409 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/95c596b1-3c21-4e10-a7ee-c1b6c9220ddf-util\") pod \"dcef7349673c2fba947c5845438ccfb6153624e8a5d39e7e1718f94114b6vs7\" (UID: \"95c596b1-3c21-4e10-a7ee-c1b6c9220ddf\") " pod="openstack-operators/dcef7349673c2fba947c5845438ccfb6153624e8a5d39e7e1718f94114b6vs7" Oct 07 08:31:37 crc kubenswrapper[5025]: I1007 
08:31:37.014677 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pbtx\" (UniqueName: \"kubernetes.io/projected/95c596b1-3c21-4e10-a7ee-c1b6c9220ddf-kube-api-access-8pbtx\") pod \"dcef7349673c2fba947c5845438ccfb6153624e8a5d39e7e1718f94114b6vs7\" (UID: \"95c596b1-3c21-4e10-a7ee-c1b6c9220ddf\") " pod="openstack-operators/dcef7349673c2fba947c5845438ccfb6153624e8a5d39e7e1718f94114b6vs7" Oct 07 08:31:37 crc kubenswrapper[5025]: I1007 08:31:37.014745 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/95c596b1-3c21-4e10-a7ee-c1b6c9220ddf-util\") pod \"dcef7349673c2fba947c5845438ccfb6153624e8a5d39e7e1718f94114b6vs7\" (UID: \"95c596b1-3c21-4e10-a7ee-c1b6c9220ddf\") " pod="openstack-operators/dcef7349673c2fba947c5845438ccfb6153624e8a5d39e7e1718f94114b6vs7" Oct 07 08:31:37 crc kubenswrapper[5025]: I1007 08:31:37.015352 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/95c596b1-3c21-4e10-a7ee-c1b6c9220ddf-bundle\") pod \"dcef7349673c2fba947c5845438ccfb6153624e8a5d39e7e1718f94114b6vs7\" (UID: \"95c596b1-3c21-4e10-a7ee-c1b6c9220ddf\") " pod="openstack-operators/dcef7349673c2fba947c5845438ccfb6153624e8a5d39e7e1718f94114b6vs7" Oct 07 08:31:37 crc kubenswrapper[5025]: I1007 08:31:37.015654 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/95c596b1-3c21-4e10-a7ee-c1b6c9220ddf-util\") pod \"dcef7349673c2fba947c5845438ccfb6153624e8a5d39e7e1718f94114b6vs7\" (UID: \"95c596b1-3c21-4e10-a7ee-c1b6c9220ddf\") " pod="openstack-operators/dcef7349673c2fba947c5845438ccfb6153624e8a5d39e7e1718f94114b6vs7" Oct 07 08:31:37 crc kubenswrapper[5025]: I1007 08:31:37.016072 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/95c596b1-3c21-4e10-a7ee-c1b6c9220ddf-bundle\") pod \"dcef7349673c2fba947c5845438ccfb6153624e8a5d39e7e1718f94114b6vs7\" (UID: \"95c596b1-3c21-4e10-a7ee-c1b6c9220ddf\") " pod="openstack-operators/dcef7349673c2fba947c5845438ccfb6153624e8a5d39e7e1718f94114b6vs7" Oct 07 08:31:37 crc kubenswrapper[5025]: I1007 08:31:37.039423 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pbtx\" (UniqueName: \"kubernetes.io/projected/95c596b1-3c21-4e10-a7ee-c1b6c9220ddf-kube-api-access-8pbtx\") pod \"dcef7349673c2fba947c5845438ccfb6153624e8a5d39e7e1718f94114b6vs7\" (UID: \"95c596b1-3c21-4e10-a7ee-c1b6c9220ddf\") " pod="openstack-operators/dcef7349673c2fba947c5845438ccfb6153624e8a5d39e7e1718f94114b6vs7" Oct 07 08:31:37 crc kubenswrapper[5025]: I1007 08:31:37.187320 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/dcef7349673c2fba947c5845438ccfb6153624e8a5d39e7e1718f94114b6vs7" Oct 07 08:31:37 crc kubenswrapper[5025]: I1007 08:31:37.487588 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/dcef7349673c2fba947c5845438ccfb6153624e8a5d39e7e1718f94114b6vs7"] Oct 07 08:31:37 crc kubenswrapper[5025]: I1007 08:31:37.732350 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/dcef7349673c2fba947c5845438ccfb6153624e8a5d39e7e1718f94114b6vs7" event={"ID":"95c596b1-3c21-4e10-a7ee-c1b6c9220ddf","Type":"ContainerStarted","Data":"87d33f199de4cda60ea23e94855717ce10a063bc1ab15c785ed1740fe5c0f0db"} Oct 07 08:31:37 crc kubenswrapper[5025]: I1007 08:31:37.732994 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/dcef7349673c2fba947c5845438ccfb6153624e8a5d39e7e1718f94114b6vs7" event={"ID":"95c596b1-3c21-4e10-a7ee-c1b6c9220ddf","Type":"ContainerStarted","Data":"dcf66cddd8611c1f57652c70afc479bf78eafa72fe19088dcb02d0a9143ba808"} Oct 07 08:31:38 crc kubenswrapper[5025]: I1007 08:31:38.742784 5025 
generic.go:334] "Generic (PLEG): container finished" podID="95c596b1-3c21-4e10-a7ee-c1b6c9220ddf" containerID="87d33f199de4cda60ea23e94855717ce10a063bc1ab15c785ed1740fe5c0f0db" exitCode=0 Oct 07 08:31:38 crc kubenswrapper[5025]: I1007 08:31:38.742855 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/dcef7349673c2fba947c5845438ccfb6153624e8a5d39e7e1718f94114b6vs7" event={"ID":"95c596b1-3c21-4e10-a7ee-c1b6c9220ddf","Type":"ContainerDied","Data":"87d33f199de4cda60ea23e94855717ce10a063bc1ab15c785ed1740fe5c0f0db"} Oct 07 08:31:39 crc kubenswrapper[5025]: I1007 08:31:39.755089 5025 generic.go:334] "Generic (PLEG): container finished" podID="95c596b1-3c21-4e10-a7ee-c1b6c9220ddf" containerID="83128e2fd9137e67e9918196f192ba7420cfe4cf37ef18af17a465b6d6d2539d" exitCode=0 Oct 07 08:31:39 crc kubenswrapper[5025]: I1007 08:31:39.755172 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/dcef7349673c2fba947c5845438ccfb6153624e8a5d39e7e1718f94114b6vs7" event={"ID":"95c596b1-3c21-4e10-a7ee-c1b6c9220ddf","Type":"ContainerDied","Data":"83128e2fd9137e67e9918196f192ba7420cfe4cf37ef18af17a465b6d6d2539d"} Oct 07 08:31:40 crc kubenswrapper[5025]: I1007 08:31:40.765322 5025 generic.go:334] "Generic (PLEG): container finished" podID="95c596b1-3c21-4e10-a7ee-c1b6c9220ddf" containerID="16721e90fcfc54f1fcab97947b7cdd2cdf2cfd398078b4c029ea87c89a27efc1" exitCode=0 Oct 07 08:31:40 crc kubenswrapper[5025]: I1007 08:31:40.765374 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/dcef7349673c2fba947c5845438ccfb6153624e8a5d39e7e1718f94114b6vs7" event={"ID":"95c596b1-3c21-4e10-a7ee-c1b6c9220ddf","Type":"ContainerDied","Data":"16721e90fcfc54f1fcab97947b7cdd2cdf2cfd398078b4c029ea87c89a27efc1"} Oct 07 08:31:42 crc kubenswrapper[5025]: I1007 08:31:42.094242 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/dcef7349673c2fba947c5845438ccfb6153624e8a5d39e7e1718f94114b6vs7" Oct 07 08:31:42 crc kubenswrapper[5025]: I1007 08:31:42.104000 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/95c596b1-3c21-4e10-a7ee-c1b6c9220ddf-util\") pod \"95c596b1-3c21-4e10-a7ee-c1b6c9220ddf\" (UID: \"95c596b1-3c21-4e10-a7ee-c1b6c9220ddf\") " Oct 07 08:31:42 crc kubenswrapper[5025]: I1007 08:31:42.104100 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pbtx\" (UniqueName: \"kubernetes.io/projected/95c596b1-3c21-4e10-a7ee-c1b6c9220ddf-kube-api-access-8pbtx\") pod \"95c596b1-3c21-4e10-a7ee-c1b6c9220ddf\" (UID: \"95c596b1-3c21-4e10-a7ee-c1b6c9220ddf\") " Oct 07 08:31:42 crc kubenswrapper[5025]: I1007 08:31:42.104126 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/95c596b1-3c21-4e10-a7ee-c1b6c9220ddf-bundle\") pod \"95c596b1-3c21-4e10-a7ee-c1b6c9220ddf\" (UID: \"95c596b1-3c21-4e10-a7ee-c1b6c9220ddf\") " Oct 07 08:31:42 crc kubenswrapper[5025]: I1007 08:31:42.105470 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95c596b1-3c21-4e10-a7ee-c1b6c9220ddf-bundle" (OuterVolumeSpecName: "bundle") pod "95c596b1-3c21-4e10-a7ee-c1b6c9220ddf" (UID: "95c596b1-3c21-4e10-a7ee-c1b6c9220ddf"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:31:42 crc kubenswrapper[5025]: I1007 08:31:42.112700 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95c596b1-3c21-4e10-a7ee-c1b6c9220ddf-kube-api-access-8pbtx" (OuterVolumeSpecName: "kube-api-access-8pbtx") pod "95c596b1-3c21-4e10-a7ee-c1b6c9220ddf" (UID: "95c596b1-3c21-4e10-a7ee-c1b6c9220ddf"). InnerVolumeSpecName "kube-api-access-8pbtx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:31:42 crc kubenswrapper[5025]: I1007 08:31:42.136396 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95c596b1-3c21-4e10-a7ee-c1b6c9220ddf-util" (OuterVolumeSpecName: "util") pod "95c596b1-3c21-4e10-a7ee-c1b6c9220ddf" (UID: "95c596b1-3c21-4e10-a7ee-c1b6c9220ddf"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:31:42 crc kubenswrapper[5025]: I1007 08:31:42.205040 5025 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/95c596b1-3c21-4e10-a7ee-c1b6c9220ddf-util\") on node \"crc\" DevicePath \"\"" Oct 07 08:31:42 crc kubenswrapper[5025]: I1007 08:31:42.205088 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pbtx\" (UniqueName: \"kubernetes.io/projected/95c596b1-3c21-4e10-a7ee-c1b6c9220ddf-kube-api-access-8pbtx\") on node \"crc\" DevicePath \"\"" Oct 07 08:31:42 crc kubenswrapper[5025]: I1007 08:31:42.205101 5025 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/95c596b1-3c21-4e10-a7ee-c1b6c9220ddf-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:31:42 crc kubenswrapper[5025]: I1007 08:31:42.783832 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/dcef7349673c2fba947c5845438ccfb6153624e8a5d39e7e1718f94114b6vs7" event={"ID":"95c596b1-3c21-4e10-a7ee-c1b6c9220ddf","Type":"ContainerDied","Data":"dcf66cddd8611c1f57652c70afc479bf78eafa72fe19088dcb02d0a9143ba808"} Oct 07 08:31:42 crc kubenswrapper[5025]: I1007 08:31:42.783868 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dcf66cddd8611c1f57652c70afc479bf78eafa72fe19088dcb02d0a9143ba808" Oct 07 08:31:42 crc kubenswrapper[5025]: I1007 08:31:42.783932 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/dcef7349673c2fba947c5845438ccfb6153624e8a5d39e7e1718f94114b6vs7" Oct 07 08:31:49 crc kubenswrapper[5025]: I1007 08:31:49.499138 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7db4b69559-4wdwm"] Oct 07 08:31:49 crc kubenswrapper[5025]: E1007 08:31:49.499883 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95c596b1-3c21-4e10-a7ee-c1b6c9220ddf" containerName="pull" Oct 07 08:31:49 crc kubenswrapper[5025]: I1007 08:31:49.499896 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="95c596b1-3c21-4e10-a7ee-c1b6c9220ddf" containerName="pull" Oct 07 08:31:49 crc kubenswrapper[5025]: E1007 08:31:49.499905 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95c596b1-3c21-4e10-a7ee-c1b6c9220ddf" containerName="extract" Oct 07 08:31:49 crc kubenswrapper[5025]: I1007 08:31:49.499910 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="95c596b1-3c21-4e10-a7ee-c1b6c9220ddf" containerName="extract" Oct 07 08:31:49 crc kubenswrapper[5025]: E1007 08:31:49.499928 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95c596b1-3c21-4e10-a7ee-c1b6c9220ddf" containerName="util" Oct 07 08:31:49 crc kubenswrapper[5025]: I1007 08:31:49.499934 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="95c596b1-3c21-4e10-a7ee-c1b6c9220ddf" containerName="util" Oct 07 08:31:49 crc kubenswrapper[5025]: I1007 08:31:49.500037 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="95c596b1-3c21-4e10-a7ee-c1b6c9220ddf" containerName="extract" Oct 07 08:31:49 crc kubenswrapper[5025]: I1007 08:31:49.500622 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7db4b69559-4wdwm" Oct 07 08:31:49 crc kubenswrapper[5025]: W1007 08:31:49.508248 5025 reflector.go:561] object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-fkc82": failed to list *v1.Secret: secrets "openstack-operator-controller-operator-dockercfg-fkc82" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack-operators": no relationship found between node 'crc' and this object Oct 07 08:31:49 crc kubenswrapper[5025]: E1007 08:31:49.508309 5025 reflector.go:158] "Unhandled Error" err="object-\"openstack-operators\"/\"openstack-operator-controller-operator-dockercfg-fkc82\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openstack-operator-controller-operator-dockercfg-fkc82\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack-operators\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 07 08:31:49 crc kubenswrapper[5025]: I1007 08:31:49.563338 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7db4b69559-4wdwm"] Oct 07 08:31:49 crc kubenswrapper[5025]: I1007 08:31:49.607234 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhg8z\" (UniqueName: \"kubernetes.io/projected/ca0d8394-2e0b-438e-b6d9-700a52a6f339-kube-api-access-nhg8z\") pod \"openstack-operator-controller-operator-7db4b69559-4wdwm\" (UID: \"ca0d8394-2e0b-438e-b6d9-700a52a6f339\") " pod="openstack-operators/openstack-operator-controller-operator-7db4b69559-4wdwm" Oct 07 08:31:49 crc kubenswrapper[5025]: I1007 08:31:49.708426 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhg8z\" (UniqueName: 
\"kubernetes.io/projected/ca0d8394-2e0b-438e-b6d9-700a52a6f339-kube-api-access-nhg8z\") pod \"openstack-operator-controller-operator-7db4b69559-4wdwm\" (UID: \"ca0d8394-2e0b-438e-b6d9-700a52a6f339\") " pod="openstack-operators/openstack-operator-controller-operator-7db4b69559-4wdwm" Oct 07 08:31:49 crc kubenswrapper[5025]: I1007 08:31:49.726907 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhg8z\" (UniqueName: \"kubernetes.io/projected/ca0d8394-2e0b-438e-b6d9-700a52a6f339-kube-api-access-nhg8z\") pod \"openstack-operator-controller-operator-7db4b69559-4wdwm\" (UID: \"ca0d8394-2e0b-438e-b6d9-700a52a6f339\") " pod="openstack-operators/openstack-operator-controller-operator-7db4b69559-4wdwm" Oct 07 08:31:50 crc kubenswrapper[5025]: I1007 08:31:50.828525 5025 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack-operators/openstack-operator-controller-operator-7db4b69559-4wdwm" secret="" err="failed to sync secret cache: timed out waiting for the condition" Oct 07 08:31:50 crc kubenswrapper[5025]: I1007 08:31:50.828623 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7db4b69559-4wdwm" Oct 07 08:31:51 crc kubenswrapper[5025]: I1007 08:31:51.066904 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-fkc82" Oct 07 08:31:51 crc kubenswrapper[5025]: I1007 08:31:51.285422 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7db4b69559-4wdwm"] Oct 07 08:31:51 crc kubenswrapper[5025]: I1007 08:31:51.842155 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7db4b69559-4wdwm" event={"ID":"ca0d8394-2e0b-438e-b6d9-700a52a6f339","Type":"ContainerStarted","Data":"3c7dac480322478ca8e08b144c9d1a49d4286549db438ea1b19651534d4ed167"} Oct 07 08:31:54 crc kubenswrapper[5025]: I1007 08:31:54.868986 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7db4b69559-4wdwm" event={"ID":"ca0d8394-2e0b-438e-b6d9-700a52a6f339","Type":"ContainerStarted","Data":"605bd869cd7ffc7aec5b5ab641c646fc0286c0b683c8485b3145e867acf8185a"} Oct 07 08:31:57 crc kubenswrapper[5025]: I1007 08:31:57.901732 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7db4b69559-4wdwm" event={"ID":"ca0d8394-2e0b-438e-b6d9-700a52a6f339","Type":"ContainerStarted","Data":"6cf4775c7b9f8a3ae4001a446200fe4becab88c3f4ffefed4cf09e84b95e410f"} Oct 07 08:31:57 crc kubenswrapper[5025]: I1007 08:31:57.902329 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-7db4b69559-4wdwm" Oct 07 08:31:57 crc kubenswrapper[5025]: I1007 08:31:57.938392 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-operator-controller-operator-7db4b69559-4wdwm" podStartSLOduration=3.312457203 podStartE2EDuration="8.938372333s" podCreationTimestamp="2025-10-07 08:31:49 +0000 UTC" firstStartedPulling="2025-10-07 08:31:51.292244311 +0000 UTC m=+918.101558455" lastFinishedPulling="2025-10-07 08:31:56.918159441 +0000 UTC m=+923.727473585" observedRunningTime="2025-10-07 08:31:57.931865899 +0000 UTC m=+924.741180053" watchObservedRunningTime="2025-10-07 08:31:57.938372333 +0000 UTC m=+924.747686487" Oct 07 08:32:00 crc kubenswrapper[5025]: I1007 08:32:00.833158 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-7db4b69559-4wdwm" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.458832 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7d4d4f8d-7tfgz"] Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.460519 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-7tfgz" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.463149 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-58c4cd55f4-p2fpt"] Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.468056 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-p2fpt" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.471164 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-5wh45" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.478772 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-tbqsg" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.478963 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7d4d4f8d-7tfgz"] Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.482764 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-58c4cd55f4-p2fpt"] Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.493781 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-75k2s"] Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.494785 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-75k2s" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.497266 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-w65sd" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.519290 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5dc44df7d5-pfd8r"] Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.520266 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-pfd8r" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.527746 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-w2j4s" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.531697 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-75k2s"] Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.554289 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6dcb\" (UniqueName: \"kubernetes.io/projected/c956fa54-5b9c-4e25-ba8d-8e0b26edc9f1-kube-api-access-c6dcb\") pod \"cinder-operator-controller-manager-7d4d4f8d-7tfgz\" (UID: \"c956fa54-5b9c-4e25-ba8d-8e0b26edc9f1\") " pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-7tfgz" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.568176 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5dc44df7d5-pfd8r"] Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.581692 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-54b4974c45-hlvw6"] Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.582851 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-hlvw6" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.587015 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-rpscj" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.598215 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-54b4974c45-hlvw6"] Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.619820 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-76d5b87f47-crcmj"] Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.620740 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-crcmj" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.626472 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-xv6vz" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.638189 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-76d5b87f47-crcmj"] Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.643521 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-658588b8c9-66frw"] Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.644853 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-66frw" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.649624 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-649675d675-t9bsq"] Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.650595 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-649675d675-t9bsq" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.651318 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.651474 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-nkzlz" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.653877 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-lk42p" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.654019 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-658588b8c9-66frw"] Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.656337 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wmmr\" (UniqueName: \"kubernetes.io/projected/f22649b0-7cea-4fb0-bc66-d1708cfa5630-kube-api-access-7wmmr\") pod \"glance-operator-controller-manager-5dc44df7d5-pfd8r\" (UID: \"f22649b0-7cea-4fb0-bc66-d1708cfa5630\") " pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-pfd8r" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.656437 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6dcb\" (UniqueName: 
\"kubernetes.io/projected/c956fa54-5b9c-4e25-ba8d-8e0b26edc9f1-kube-api-access-c6dcb\") pod \"cinder-operator-controller-manager-7d4d4f8d-7tfgz\" (UID: \"c956fa54-5b9c-4e25-ba8d-8e0b26edc9f1\") " pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-7tfgz" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.656484 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vbk8\" (UniqueName: \"kubernetes.io/projected/988dc4fe-2f1a-481a-9954-3578f833387e-kube-api-access-6vbk8\") pod \"barbican-operator-controller-manager-58c4cd55f4-p2fpt\" (UID: \"988dc4fe-2f1a-481a-9954-3578f833387e\") " pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-p2fpt" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.656586 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qrf9\" (UniqueName: \"kubernetes.io/projected/ffa40450-8658-4d21-b4b1-1174c69e989f-kube-api-access-9qrf9\") pod \"designate-operator-controller-manager-75dfd9b554-75k2s\" (UID: \"ffa40450-8658-4d21-b4b1-1174c69e989f\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-75k2s" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.662713 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-649675d675-t9bsq"] Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.669412 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-65d89cfd9f-9ch9p"] Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.670506 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-9ch9p" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.682360 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-26nv8" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.682488 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-2tlcp"] Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.683659 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-2tlcp" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.684923 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-82s29" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.689456 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-65d89cfd9f-9ch9p"] Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.699371 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-2tlcp"] Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.702865 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-92n2s"] Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.704119 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-92n2s" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.706274 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-8d984cc4d-94knm"] Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.707395 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-94knm" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.710114 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-dfp5t" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.710415 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-t78zk" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.717299 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6dcb\" (UniqueName: \"kubernetes.io/projected/c956fa54-5b9c-4e25-ba8d-8e0b26edc9f1-kube-api-access-c6dcb\") pod \"cinder-operator-controller-manager-7d4d4f8d-7tfgz\" (UID: \"c956fa54-5b9c-4e25-ba8d-8e0b26edc9f1\") " pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-7tfgz" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.717376 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-7c7fc454ff-jdb47"] Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.718293 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-jdb47" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.719458 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-26z4v" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.719706 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-92n2s"] Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.747483 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-7c7fc454ff-jdb47"] Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.758514 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b79l\" (UniqueName: \"kubernetes.io/projected/d708eb23-1cca-4c1c-a1e5-7a68efa23a59-kube-api-access-7b79l\") pod \"infra-operator-controller-manager-658588b8c9-66frw\" (UID: \"d708eb23-1cca-4c1c-a1e5-7a68efa23a59\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-66frw" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.758605 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wmmr\" (UniqueName: \"kubernetes.io/projected/f22649b0-7cea-4fb0-bc66-d1708cfa5630-kube-api-access-7wmmr\") pod \"glance-operator-controller-manager-5dc44df7d5-pfd8r\" (UID: \"f22649b0-7cea-4fb0-bc66-d1708cfa5630\") " pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-pfd8r" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.758645 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96flb\" (UniqueName: \"kubernetes.io/projected/d9b1f3d1-a00f-45df-ae19-728a8716aaa3-kube-api-access-96flb\") pod \"horizon-operator-controller-manager-76d5b87f47-crcmj\" (UID: 
\"d9b1f3d1-a00f-45df-ae19-728a8716aaa3\") " pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-crcmj" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.758672 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vbk8\" (UniqueName: \"kubernetes.io/projected/988dc4fe-2f1a-481a-9954-3578f833387e-kube-api-access-6vbk8\") pod \"barbican-operator-controller-manager-58c4cd55f4-p2fpt\" (UID: \"988dc4fe-2f1a-481a-9954-3578f833387e\") " pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-p2fpt" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.758703 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfn4x\" (UniqueName: \"kubernetes.io/projected/ded88cfb-86e3-4bcf-875c-285b6b34776b-kube-api-access-gfn4x\") pod \"manila-operator-controller-manager-65d89cfd9f-9ch9p\" (UID: \"ded88cfb-86e3-4bcf-875c-285b6b34776b\") " pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-9ch9p" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.758732 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9zzs\" (UniqueName: \"kubernetes.io/projected/6dc83248-702c-40eb-92ce-99f686ea1bfc-kube-api-access-b9zzs\") pod \"heat-operator-controller-manager-54b4974c45-hlvw6\" (UID: \"6dc83248-702c-40eb-92ce-99f686ea1bfc\") " pod="openstack-operators/heat-operator-controller-manager-54b4974c45-hlvw6" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.758751 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d708eb23-1cca-4c1c-a1e5-7a68efa23a59-cert\") pod \"infra-operator-controller-manager-658588b8c9-66frw\" (UID: \"d708eb23-1cca-4c1c-a1e5-7a68efa23a59\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-66frw" Oct 07 08:32:15 crc 
kubenswrapper[5025]: I1007 08:32:15.758786 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tch44\" (UniqueName: \"kubernetes.io/projected/46dec033-4c2a-4fd6-87fb-a877d35e258d-kube-api-access-tch44\") pod \"ironic-operator-controller-manager-649675d675-t9bsq\" (UID: \"46dec033-4c2a-4fd6-87fb-a877d35e258d\") " pod="openstack-operators/ironic-operator-controller-manager-649675d675-t9bsq" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.759166 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qrf9\" (UniqueName: \"kubernetes.io/projected/ffa40450-8658-4d21-b4b1-1174c69e989f-kube-api-access-9qrf9\") pod \"designate-operator-controller-manager-75dfd9b554-75k2s\" (UID: \"ffa40450-8658-4d21-b4b1-1174c69e989f\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-75k2s" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.764915 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-8d984cc4d-94knm"] Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.784716 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-7tfgz" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.791096 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qrf9\" (UniqueName: \"kubernetes.io/projected/ffa40450-8658-4d21-b4b1-1174c69e989f-kube-api-access-9qrf9\") pod \"designate-operator-controller-manager-75dfd9b554-75k2s\" (UID: \"ffa40450-8658-4d21-b4b1-1174c69e989f\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-75k2s" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.791246 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wmmr\" (UniqueName: \"kubernetes.io/projected/f22649b0-7cea-4fb0-bc66-d1708cfa5630-kube-api-access-7wmmr\") pod \"glance-operator-controller-manager-5dc44df7d5-pfd8r\" (UID: \"f22649b0-7cea-4fb0-bc66-d1708cfa5630\") " pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-pfd8r" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.795904 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vbk8\" (UniqueName: \"kubernetes.io/projected/988dc4fe-2f1a-481a-9954-3578f833387e-kube-api-access-6vbk8\") pod \"barbican-operator-controller-manager-58c4cd55f4-p2fpt\" (UID: \"988dc4fe-2f1a-481a-9954-3578f833387e\") " pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-p2fpt" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.797683 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7468f855d8-mwl2b"] Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.801913 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-mwl2b" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.802973 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7468f855d8-mwl2b"] Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.804829 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-rllcb" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.809529 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-75k2s" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.813010 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-54689d9f88-7dj22"] Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.814111 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-7dj22" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.816004 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-mks7c" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.817651 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cbtkt2"] Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.818434 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cbtkt2" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.820968 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.821475 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-vqgnl" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.825180 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-rvd8k"] Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.828729 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-rvd8k" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.830803 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-sh85j" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.830920 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-hjkhm"] Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.836289 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-hjkhm" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.836786 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-54689d9f88-7dj22"] Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.848817 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-b5n85" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.849693 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-pfd8r" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.855124 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-rvd8k"] Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.860464 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjmz7\" (UniqueName: \"kubernetes.io/projected/254bc245-b889-4cb4-a787-a49298e93315-kube-api-access-jjmz7\") pod \"mariadb-operator-controller-manager-6cd6d7bdf5-92n2s\" (UID: \"254bc245-b889-4cb4-a787-a49298e93315\") " pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-92n2s" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.860505 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm4pn\" (UniqueName: \"kubernetes.io/projected/ecc025b9-0996-46f6-9ea5-024219d094b0-kube-api-access-cm4pn\") pod \"nova-operator-controller-manager-7c7fc454ff-jdb47\" (UID: \"ecc025b9-0996-46f6-9ea5-024219d094b0\") " pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-jdb47" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.860532 5025 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66qpd\" (UniqueName: \"kubernetes.io/projected/0dc6bebc-04e7-4d9f-bf07-007411e61c71-kube-api-access-66qpd\") pod \"keystone-operator-controller-manager-7b5ccf6d9c-2tlcp\" (UID: \"0dc6bebc-04e7-4d9f-bf07-007411e61c71\") " pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-2tlcp" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.860569 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96flb\" (UniqueName: \"kubernetes.io/projected/d9b1f3d1-a00f-45df-ae19-728a8716aaa3-kube-api-access-96flb\") pod \"horizon-operator-controller-manager-76d5b87f47-crcmj\" (UID: \"d9b1f3d1-a00f-45df-ae19-728a8716aaa3\") " pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-crcmj" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.860601 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfn4x\" (UniqueName: \"kubernetes.io/projected/ded88cfb-86e3-4bcf-875c-285b6b34776b-kube-api-access-gfn4x\") pod \"manila-operator-controller-manager-65d89cfd9f-9ch9p\" (UID: \"ded88cfb-86e3-4bcf-875c-285b6b34776b\") " pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-9ch9p" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.860636 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9zzs\" (UniqueName: \"kubernetes.io/projected/6dc83248-702c-40eb-92ce-99f686ea1bfc-kube-api-access-b9zzs\") pod \"heat-operator-controller-manager-54b4974c45-hlvw6\" (UID: \"6dc83248-702c-40eb-92ce-99f686ea1bfc\") " pod="openstack-operators/heat-operator-controller-manager-54b4974c45-hlvw6" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.860657 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d708eb23-1cca-4c1c-a1e5-7a68efa23a59-cert\") 
pod \"infra-operator-controller-manager-658588b8c9-66frw\" (UID: \"d708eb23-1cca-4c1c-a1e5-7a68efa23a59\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-66frw" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.860677 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4bxx\" (UniqueName: \"kubernetes.io/projected/a0a63794-3c14-4704-aca7-7f259b6e9292-kube-api-access-q4bxx\") pod \"neutron-operator-controller-manager-8d984cc4d-94knm\" (UID: \"a0a63794-3c14-4704-aca7-7f259b6e9292\") " pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-94knm" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.860706 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tch44\" (UniqueName: \"kubernetes.io/projected/46dec033-4c2a-4fd6-87fb-a877d35e258d-kube-api-access-tch44\") pod \"ironic-operator-controller-manager-649675d675-t9bsq\" (UID: \"46dec033-4c2a-4fd6-87fb-a877d35e258d\") " pod="openstack-operators/ironic-operator-controller-manager-649675d675-t9bsq" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.860734 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b79l\" (UniqueName: \"kubernetes.io/projected/d708eb23-1cca-4c1c-a1e5-7a68efa23a59-kube-api-access-7b79l\") pod \"infra-operator-controller-manager-658588b8c9-66frw\" (UID: \"d708eb23-1cca-4c1c-a1e5-7a68efa23a59\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-66frw" Oct 07 08:32:15 crc kubenswrapper[5025]: E1007 08:32:15.862760 5025 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 07 08:32:15 crc kubenswrapper[5025]: E1007 08:32:15.862804 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d708eb23-1cca-4c1c-a1e5-7a68efa23a59-cert 
podName:d708eb23-1cca-4c1c-a1e5-7a68efa23a59 nodeName:}" failed. No retries permitted until 2025-10-07 08:32:16.362790264 +0000 UTC m=+943.172104408 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d708eb23-1cca-4c1c-a1e5-7a68efa23a59-cert") pod "infra-operator-controller-manager-658588b8c9-66frw" (UID: "d708eb23-1cca-4c1c-a1e5-7a68efa23a59") : secret "infra-operator-webhook-server-cert" not found Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.865283 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cbtkt2"] Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.877574 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-hjkhm"] Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.880012 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tch44\" (UniqueName: \"kubernetes.io/projected/46dec033-4c2a-4fd6-87fb-a877d35e258d-kube-api-access-tch44\") pod \"ironic-operator-controller-manager-649675d675-t9bsq\" (UID: \"46dec033-4c2a-4fd6-87fb-a877d35e258d\") " pod="openstack-operators/ironic-operator-controller-manager-649675d675-t9bsq" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.880533 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfn4x\" (UniqueName: \"kubernetes.io/projected/ded88cfb-86e3-4bcf-875c-285b6b34776b-kube-api-access-gfn4x\") pod \"manila-operator-controller-manager-65d89cfd9f-9ch9p\" (UID: \"ded88cfb-86e3-4bcf-875c-285b6b34776b\") " pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-9ch9p" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.880717 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b79l\" (UniqueName: 
\"kubernetes.io/projected/d708eb23-1cca-4c1c-a1e5-7a68efa23a59-kube-api-access-7b79l\") pod \"infra-operator-controller-manager-658588b8c9-66frw\" (UID: \"d708eb23-1cca-4c1c-a1e5-7a68efa23a59\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-66frw" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.880889 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9zzs\" (UniqueName: \"kubernetes.io/projected/6dc83248-702c-40eb-92ce-99f686ea1bfc-kube-api-access-b9zzs\") pod \"heat-operator-controller-manager-54b4974c45-hlvw6\" (UID: \"6dc83248-702c-40eb-92ce-99f686ea1bfc\") " pod="openstack-operators/heat-operator-controller-manager-54b4974c45-hlvw6" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.881667 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96flb\" (UniqueName: \"kubernetes.io/projected/d9b1f3d1-a00f-45df-ae19-728a8716aaa3-kube-api-access-96flb\") pod \"horizon-operator-controller-manager-76d5b87f47-crcmj\" (UID: \"d9b1f3d1-a00f-45df-ae19-728a8716aaa3\") " pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-crcmj" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.901998 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-hlvw6" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.904994 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-2jc2q"] Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.906349 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-2jc2q" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.909872 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-rznn6" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.910708 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-2jc2q"] Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.956193 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-z52jx"] Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.960669 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-z52jx"] Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.960811 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-z52jx" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.962138 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jdmq\" (UniqueName: \"kubernetes.io/projected/6eab8249-8174-4cfd-ab17-de2ed309f0e5-kube-api-access-2jdmq\") pod \"ovn-operator-controller-manager-6d8b6f9b9-rvd8k\" (UID: \"6eab8249-8174-4cfd-ab17-de2ed309f0e5\") " pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-rvd8k" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.962176 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfswt\" (UniqueName: \"kubernetes.io/projected/d0493142-dcb7-4291-89e2-857772df4f54-kube-api-access-rfswt\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665cbtkt2\" (UID: \"d0493142-dcb7-4291-89e2-857772df4f54\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cbtkt2" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.962218 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2pbj\" (UniqueName: \"kubernetes.io/projected/43b5fc1d-13b9-4945-a93d-4c55036c69ca-kube-api-access-g2pbj\") pod \"octavia-operator-controller-manager-7468f855d8-mwl2b\" (UID: \"43b5fc1d-13b9-4945-a93d-4c55036c69ca\") " pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-mwl2b" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.962243 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjmz7\" (UniqueName: \"kubernetes.io/projected/254bc245-b889-4cb4-a787-a49298e93315-kube-api-access-jjmz7\") pod \"mariadb-operator-controller-manager-6cd6d7bdf5-92n2s\" (UID: \"254bc245-b889-4cb4-a787-a49298e93315\") " 
pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-92n2s" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.962265 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm4pn\" (UniqueName: \"kubernetes.io/projected/ecc025b9-0996-46f6-9ea5-024219d094b0-kube-api-access-cm4pn\") pod \"nova-operator-controller-manager-7c7fc454ff-jdb47\" (UID: \"ecc025b9-0996-46f6-9ea5-024219d094b0\") " pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-jdb47" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.962287 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66qpd\" (UniqueName: \"kubernetes.io/projected/0dc6bebc-04e7-4d9f-bf07-007411e61c71-kube-api-access-66qpd\") pod \"keystone-operator-controller-manager-7b5ccf6d9c-2tlcp\" (UID: \"0dc6bebc-04e7-4d9f-bf07-007411e61c71\") " pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-2tlcp" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.962317 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d0493142-dcb7-4291-89e2-857772df4f54-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665cbtkt2\" (UID: \"d0493142-dcb7-4291-89e2-857772df4f54\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cbtkt2" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.962344 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7nv5\" (UniqueName: \"kubernetes.io/projected/552d8865-34b1-41e2-b755-becdee67efef-kube-api-access-f7nv5\") pod \"swift-operator-controller-manager-6859f9b676-hjkhm\" (UID: \"552d8865-34b1-41e2-b755-becdee67efef\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-hjkhm" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 
08:32:15.962380 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnwdd\" (UniqueName: \"kubernetes.io/projected/c423925e-5372-4cf2-a8ce-5d864fde501e-kube-api-access-vnwdd\") pod \"placement-operator-controller-manager-54689d9f88-7dj22\" (UID: \"c423925e-5372-4cf2-a8ce-5d864fde501e\") " pod="openstack-operators/placement-operator-controller-manager-54689d9f88-7dj22" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.962400 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4bxx\" (UniqueName: \"kubernetes.io/projected/a0a63794-3c14-4704-aca7-7f259b6e9292-kube-api-access-q4bxx\") pod \"neutron-operator-controller-manager-8d984cc4d-94knm\" (UID: \"a0a63794-3c14-4704-aca7-7f259b6e9292\") " pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-94knm" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.963022 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-crcmj" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.964319 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-8nthr" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.988282 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm4pn\" (UniqueName: \"kubernetes.io/projected/ecc025b9-0996-46f6-9ea5-024219d094b0-kube-api-access-cm4pn\") pod \"nova-operator-controller-manager-7c7fc454ff-jdb47\" (UID: \"ecc025b9-0996-46f6-9ea5-024219d094b0\") " pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-jdb47" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.990049 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjmz7\" (UniqueName: \"kubernetes.io/projected/254bc245-b889-4cb4-a787-a49298e93315-kube-api-access-jjmz7\") pod \"mariadb-operator-controller-manager-6cd6d7bdf5-92n2s\" (UID: \"254bc245-b889-4cb4-a787-a49298e93315\") " pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-92n2s" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.991721 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66qpd\" (UniqueName: \"kubernetes.io/projected/0dc6bebc-04e7-4d9f-bf07-007411e61c71-kube-api-access-66qpd\") pod \"keystone-operator-controller-manager-7b5ccf6d9c-2tlcp\" (UID: \"0dc6bebc-04e7-4d9f-bf07-007411e61c71\") " pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-2tlcp" Oct 07 08:32:15 crc kubenswrapper[5025]: I1007 08:32:15.994996 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4bxx\" (UniqueName: \"kubernetes.io/projected/a0a63794-3c14-4704-aca7-7f259b6e9292-kube-api-access-q4bxx\") pod \"neutron-operator-controller-manager-8d984cc4d-94knm\" (UID: 
\"a0a63794-3c14-4704-aca7-7f259b6e9292\") " pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-94knm" Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.001693 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-649675d675-t9bsq" Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.050426 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6cbc6dd547-szrb6"] Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.053159 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-szrb6" Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.056873 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6cbc6dd547-szrb6"] Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.069322 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-b5kjg" Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.069598 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbgws\" (UniqueName: \"kubernetes.io/projected/df2c5800-14f3-4112-a09c-31b0b75792d6-kube-api-access-lbgws\") pod \"test-operator-controller-manager-5cd5cb47d7-z52jx\" (UID: \"df2c5800-14f3-4112-a09c-31b0b75792d6\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-z52jx" Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.069634 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzzbj\" (UniqueName: \"kubernetes.io/projected/9e7ad814-5207-4f51-b499-537c23a9d8b2-kube-api-access-tzzbj\") pod \"telemetry-operator-controller-manager-5d4d74dd89-2jc2q\" (UID: 
\"9e7ad814-5207-4f51-b499-537c23a9d8b2\") " pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-2jc2q" Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.069665 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d0493142-dcb7-4291-89e2-857772df4f54-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665cbtkt2\" (UID: \"d0493142-dcb7-4291-89e2-857772df4f54\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cbtkt2" Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.069691 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7nv5\" (UniqueName: \"kubernetes.io/projected/552d8865-34b1-41e2-b755-becdee67efef-kube-api-access-f7nv5\") pod \"swift-operator-controller-manager-6859f9b676-hjkhm\" (UID: \"552d8865-34b1-41e2-b755-becdee67efef\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-hjkhm" Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.069728 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnwdd\" (UniqueName: \"kubernetes.io/projected/c423925e-5372-4cf2-a8ce-5d864fde501e-kube-api-access-vnwdd\") pod \"placement-operator-controller-manager-54689d9f88-7dj22\" (UID: \"c423925e-5372-4cf2-a8ce-5d864fde501e\") " pod="openstack-operators/placement-operator-controller-manager-54689d9f88-7dj22" Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.069761 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jdmq\" (UniqueName: \"kubernetes.io/projected/6eab8249-8174-4cfd-ab17-de2ed309f0e5-kube-api-access-2jdmq\") pod \"ovn-operator-controller-manager-6d8b6f9b9-rvd8k\" (UID: \"6eab8249-8174-4cfd-ab17-de2ed309f0e5\") " pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-rvd8k" Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 
08:32:16.069790 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfswt\" (UniqueName: \"kubernetes.io/projected/d0493142-dcb7-4291-89e2-857772df4f54-kube-api-access-rfswt\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665cbtkt2\" (UID: \"d0493142-dcb7-4291-89e2-857772df4f54\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cbtkt2" Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.069823 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2pbj\" (UniqueName: \"kubernetes.io/projected/43b5fc1d-13b9-4945-a93d-4c55036c69ca-kube-api-access-g2pbj\") pod \"octavia-operator-controller-manager-7468f855d8-mwl2b\" (UID: \"43b5fc1d-13b9-4945-a93d-4c55036c69ca\") " pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-mwl2b" Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.070277 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-9ch9p" Oct 07 08:32:16 crc kubenswrapper[5025]: E1007 08:32:16.070667 5025 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 07 08:32:16 crc kubenswrapper[5025]: E1007 08:32:16.070702 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0493142-dcb7-4291-89e2-857772df4f54-cert podName:d0493142-dcb7-4291-89e2-857772df4f54 nodeName:}" failed. No retries permitted until 2025-10-07 08:32:16.570688833 +0000 UTC m=+943.380002977 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d0493142-dcb7-4291-89e2-857772df4f54-cert") pod "openstack-baremetal-operator-controller-manager-5dfbbd665cbtkt2" (UID: "d0493142-dcb7-4291-89e2-857772df4f54") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.084356 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-2tlcp" Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.093532 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2pbj\" (UniqueName: \"kubernetes.io/projected/43b5fc1d-13b9-4945-a93d-4c55036c69ca-kube-api-access-g2pbj\") pod \"octavia-operator-controller-manager-7468f855d8-mwl2b\" (UID: \"43b5fc1d-13b9-4945-a93d-4c55036c69ca\") " pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-mwl2b" Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.093870 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-p2fpt" Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.095419 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfswt\" (UniqueName: \"kubernetes.io/projected/d0493142-dcb7-4291-89e2-857772df4f54-kube-api-access-rfswt\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665cbtkt2\" (UID: \"d0493142-dcb7-4291-89e2-857772df4f54\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cbtkt2" Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.095763 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7nv5\" (UniqueName: \"kubernetes.io/projected/552d8865-34b1-41e2-b755-becdee67efef-kube-api-access-f7nv5\") pod \"swift-operator-controller-manager-6859f9b676-hjkhm\" (UID: \"552d8865-34b1-41e2-b755-becdee67efef\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-hjkhm" Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.097034 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jdmq\" (UniqueName: \"kubernetes.io/projected/6eab8249-8174-4cfd-ab17-de2ed309f0e5-kube-api-access-2jdmq\") pod \"ovn-operator-controller-manager-6d8b6f9b9-rvd8k\" (UID: \"6eab8249-8174-4cfd-ab17-de2ed309f0e5\") " pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-rvd8k" Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.099151 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnwdd\" (UniqueName: \"kubernetes.io/projected/c423925e-5372-4cf2-a8ce-5d864fde501e-kube-api-access-vnwdd\") pod \"placement-operator-controller-manager-54689d9f88-7dj22\" (UID: \"c423925e-5372-4cf2-a8ce-5d864fde501e\") " pod="openstack-operators/placement-operator-controller-manager-54689d9f88-7dj22" Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.171225 5025 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbgws\" (UniqueName: \"kubernetes.io/projected/df2c5800-14f3-4112-a09c-31b0b75792d6-kube-api-access-lbgws\") pod \"test-operator-controller-manager-5cd5cb47d7-z52jx\" (UID: \"df2c5800-14f3-4112-a09c-31b0b75792d6\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-z52jx" Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.171286 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzzbj\" (UniqueName: \"kubernetes.io/projected/9e7ad814-5207-4f51-b499-537c23a9d8b2-kube-api-access-tzzbj\") pod \"telemetry-operator-controller-manager-5d4d74dd89-2jc2q\" (UID: \"9e7ad814-5207-4f51-b499-537c23a9d8b2\") " pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-2jc2q" Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.171386 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hzrf\" (UniqueName: \"kubernetes.io/projected/f79855b6-658f-4526-9201-08d54f47c41d-kube-api-access-6hzrf\") pod \"watcher-operator-controller-manager-6cbc6dd547-szrb6\" (UID: \"f79855b6-658f-4526-9201-08d54f47c41d\") " pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-szrb6" Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.202060 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-8b6c49794-5wfpl"] Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.208448 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-92n2s" Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.215227 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-jdb47" Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.218228 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-94knm" Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.223963 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-mwl2b" Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.224344 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzzbj\" (UniqueName: \"kubernetes.io/projected/9e7ad814-5207-4f51-b499-537c23a9d8b2-kube-api-access-tzzbj\") pod \"telemetry-operator-controller-manager-5d4d74dd89-2jc2q\" (UID: \"9e7ad814-5207-4f51-b499-537c23a9d8b2\") " pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-2jc2q" Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.230085 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-7dj22" Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.250309 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-8b6c49794-5wfpl" Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.256569 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbgws\" (UniqueName: \"kubernetes.io/projected/df2c5800-14f3-4112-a09c-31b0b75792d6-kube-api-access-lbgws\") pod \"test-operator-controller-manager-5cd5cb47d7-z52jx\" (UID: \"df2c5800-14f3-4112-a09c-31b0b75792d6\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-z52jx" Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.257112 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-mfm6n" Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.257450 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.258602 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-rvd8k" Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.262501 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-hjkhm" Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.273746 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hzrf\" (UniqueName: \"kubernetes.io/projected/f79855b6-658f-4526-9201-08d54f47c41d-kube-api-access-6hzrf\") pod \"watcher-operator-controller-manager-6cbc6dd547-szrb6\" (UID: \"f79855b6-658f-4526-9201-08d54f47c41d\") " pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-szrb6" Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.277857 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-8b6c49794-5wfpl"] Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.287820 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-2jc2q" Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.292870 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-b4lx7"] Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.295289 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-b4lx7" Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.299075 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-57pqq" Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.300163 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-z52jx" Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.330623 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7d4d4f8d-7tfgz"] Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.346332 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hzrf\" (UniqueName: \"kubernetes.io/projected/f79855b6-658f-4526-9201-08d54f47c41d-kube-api-access-6hzrf\") pod \"watcher-operator-controller-manager-6cbc6dd547-szrb6\" (UID: \"f79855b6-658f-4526-9201-08d54f47c41d\") " pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-szrb6" Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.366514 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-b4lx7"] Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.379974 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rb5m\" (UniqueName: \"kubernetes.io/projected/c06c5dfe-d9b0-4d21-a132-79ca285655c6-kube-api-access-5rb5m\") pod \"openstack-operator-controller-manager-8b6c49794-5wfpl\" (UID: \"c06c5dfe-d9b0-4d21-a132-79ca285655c6\") " pod="openstack-operators/openstack-operator-controller-manager-8b6c49794-5wfpl" Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.380013 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d708eb23-1cca-4c1c-a1e5-7a68efa23a59-cert\") pod \"infra-operator-controller-manager-658588b8c9-66frw\" (UID: \"d708eb23-1cca-4c1c-a1e5-7a68efa23a59\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-66frw" Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.380144 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c06c5dfe-d9b0-4d21-a132-79ca285655c6-cert\") pod \"openstack-operator-controller-manager-8b6c49794-5wfpl\" (UID: \"c06c5dfe-d9b0-4d21-a132-79ca285655c6\") " pod="openstack-operators/openstack-operator-controller-manager-8b6c49794-5wfpl" Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.380828 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c24sx\" (UniqueName: \"kubernetes.io/projected/156efee0-2e44-4494-9c13-baef0c5e45b8-kube-api-access-c24sx\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-b4lx7\" (UID: \"156efee0-2e44-4494-9c13-baef0c5e45b8\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-b4lx7" Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.399725 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-szrb6" Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.401218 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d708eb23-1cca-4c1c-a1e5-7a68efa23a59-cert\") pod \"infra-operator-controller-manager-658588b8c9-66frw\" (UID: \"d708eb23-1cca-4c1c-a1e5-7a68efa23a59\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-66frw" Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.425304 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-75k2s"] Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.483437 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c06c5dfe-d9b0-4d21-a132-79ca285655c6-cert\") pod \"openstack-operator-controller-manager-8b6c49794-5wfpl\" (UID: \"c06c5dfe-d9b0-4d21-a132-79ca285655c6\") " 
pod="openstack-operators/openstack-operator-controller-manager-8b6c49794-5wfpl" Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.483503 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c24sx\" (UniqueName: \"kubernetes.io/projected/156efee0-2e44-4494-9c13-baef0c5e45b8-kube-api-access-c24sx\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-b4lx7\" (UID: \"156efee0-2e44-4494-9c13-baef0c5e45b8\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-b4lx7" Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.483576 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rb5m\" (UniqueName: \"kubernetes.io/projected/c06c5dfe-d9b0-4d21-a132-79ca285655c6-kube-api-access-5rb5m\") pod \"openstack-operator-controller-manager-8b6c49794-5wfpl\" (UID: \"c06c5dfe-d9b0-4d21-a132-79ca285655c6\") " pod="openstack-operators/openstack-operator-controller-manager-8b6c49794-5wfpl" Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.491631 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c06c5dfe-d9b0-4d21-a132-79ca285655c6-cert\") pod \"openstack-operator-controller-manager-8b6c49794-5wfpl\" (UID: \"c06c5dfe-d9b0-4d21-a132-79ca285655c6\") " pod="openstack-operators/openstack-operator-controller-manager-8b6c49794-5wfpl" Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.499378 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c24sx\" (UniqueName: \"kubernetes.io/projected/156efee0-2e44-4494-9c13-baef0c5e45b8-kube-api-access-c24sx\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-b4lx7\" (UID: \"156efee0-2e44-4494-9c13-baef0c5e45b8\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-b4lx7" Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.505744 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-5rb5m\" (UniqueName: \"kubernetes.io/projected/c06c5dfe-d9b0-4d21-a132-79ca285655c6-kube-api-access-5rb5m\") pod \"openstack-operator-controller-manager-8b6c49794-5wfpl\" (UID: \"c06c5dfe-d9b0-4d21-a132-79ca285655c6\") " pod="openstack-operators/openstack-operator-controller-manager-8b6c49794-5wfpl" Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.576598 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5dc44df7d5-pfd8r"] Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.583464 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-66frw" Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.584811 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d0493142-dcb7-4291-89e2-857772df4f54-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665cbtkt2\" (UID: \"d0493142-dcb7-4291-89e2-857772df4f54\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cbtkt2" Oct 07 08:32:16 crc kubenswrapper[5025]: E1007 08:32:16.584972 5025 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 07 08:32:16 crc kubenswrapper[5025]: E1007 08:32:16.585029 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0493142-dcb7-4291-89e2-857772df4f54-cert podName:d0493142-dcb7-4291-89e2-857772df4f54 nodeName:}" failed. No retries permitted until 2025-10-07 08:32:17.585013497 +0000 UTC m=+944.394327641 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d0493142-dcb7-4291-89e2-857772df4f54-cert") pod "openstack-baremetal-operator-controller-manager-5dfbbd665cbtkt2" (UID: "d0493142-dcb7-4291-89e2-857772df4f54") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.605949 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-8b6c49794-5wfpl" Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.658932 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-b4lx7" Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.863553 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-76d5b87f47-crcmj"] Oct 07 08:32:16 crc kubenswrapper[5025]: I1007 08:32:16.872593 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-54b4974c45-hlvw6"] Oct 07 08:32:16 crc kubenswrapper[5025]: W1007 08:32:16.955507 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6dc83248_702c_40eb_92ce_99f686ea1bfc.slice/crio-70da469aef96cd8333c558b563253bc1dff48f308710aa3604672ccf9bdb05af WatchSource:0}: Error finding container 70da469aef96cd8333c558b563253bc1dff48f308710aa3604672ccf9bdb05af: Status 404 returned error can't find the container with id 70da469aef96cd8333c558b563253bc1dff48f308710aa3604672ccf9bdb05af Oct 07 08:32:17 crc kubenswrapper[5025]: I1007 08:32:17.038166 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-hlvw6" 
event={"ID":"6dc83248-702c-40eb-92ce-99f686ea1bfc","Type":"ContainerStarted","Data":"70da469aef96cd8333c558b563253bc1dff48f308710aa3604672ccf9bdb05af"} Oct 07 08:32:17 crc kubenswrapper[5025]: I1007 08:32:17.043194 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-pfd8r" event={"ID":"f22649b0-7cea-4fb0-bc66-d1708cfa5630","Type":"ContainerStarted","Data":"b2a4e3ec42ae193cc297fffff53b83a1927d972f6302c78164c4592422b33d9a"} Oct 07 08:32:17 crc kubenswrapper[5025]: I1007 08:32:17.046429 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-7tfgz" event={"ID":"c956fa54-5b9c-4e25-ba8d-8e0b26edc9f1","Type":"ContainerStarted","Data":"492cdf78f36688df2f14e7e4e2fc399073690d8f3575fd5b0b4ba8c251d5f597"} Oct 07 08:32:17 crc kubenswrapper[5025]: I1007 08:32:17.047879 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-75k2s" event={"ID":"ffa40450-8658-4d21-b4b1-1174c69e989f","Type":"ContainerStarted","Data":"63925696a6ec6f9109246538f007ed42a533b7842ccf8df23f4c72edf76eead2"} Oct 07 08:32:17 crc kubenswrapper[5025]: I1007 08:32:17.049375 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-crcmj" event={"ID":"d9b1f3d1-a00f-45df-ae19-728a8716aaa3","Type":"ContainerStarted","Data":"2a34add80190d185fd6dd8d5c09ea2a2e0e446597435138c922a1df4325dd563"} Oct 07 08:32:17 crc kubenswrapper[5025]: I1007 08:32:17.214240 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-65d89cfd9f-9ch9p"] Oct 07 08:32:17 crc kubenswrapper[5025]: I1007 08:32:17.220779 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-649675d675-t9bsq"] Oct 07 08:32:17 crc kubenswrapper[5025]: W1007 
08:32:17.222768 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podded88cfb_86e3_4bcf_875c_285b6b34776b.slice/crio-532b9a4a26a06f2968eaf91968b330670f8a9aeb3df26d7b5c71c6ed5415207e WatchSource:0}: Error finding container 532b9a4a26a06f2968eaf91968b330670f8a9aeb3df26d7b5c71c6ed5415207e: Status 404 returned error can't find the container with id 532b9a4a26a06f2968eaf91968b330670f8a9aeb3df26d7b5c71c6ed5415207e Oct 07 08:32:17 crc kubenswrapper[5025]: I1007 08:32:17.225199 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-58c4cd55f4-p2fpt"] Oct 07 08:32:17 crc kubenswrapper[5025]: W1007 08:32:17.225737 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod988dc4fe_2f1a_481a_9954_3578f833387e.slice/crio-4bf5d3096eac43590fe6f27efb863d9811f2e64d1fb6fbb606d128a7a6fc1b87 WatchSource:0}: Error finding container 4bf5d3096eac43590fe6f27efb863d9811f2e64d1fb6fbb606d128a7a6fc1b87: Status 404 returned error can't find the container with id 4bf5d3096eac43590fe6f27efb863d9811f2e64d1fb6fbb606d128a7a6fc1b87 Oct 07 08:32:17 crc kubenswrapper[5025]: W1007 08:32:17.227864 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46dec033_4c2a_4fd6_87fb_a877d35e258d.slice/crio-a98380dae82a28e5725583a87d7b61e427c3a2a270eb0e34300bc045a9da4a9d WatchSource:0}: Error finding container a98380dae82a28e5725583a87d7b61e427c3a2a270eb0e34300bc045a9da4a9d: Status 404 returned error can't find the container with id a98380dae82a28e5725583a87d7b61e427c3a2a270eb0e34300bc045a9da4a9d Oct 07 08:32:17 crc kubenswrapper[5025]: I1007 08:32:17.598169 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-92n2s"] Oct 07 08:32:17 crc 
kubenswrapper[5025]: I1007 08:32:17.601682 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d0493142-dcb7-4291-89e2-857772df4f54-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665cbtkt2\" (UID: \"d0493142-dcb7-4291-89e2-857772df4f54\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cbtkt2" Oct 07 08:32:17 crc kubenswrapper[5025]: I1007 08:32:17.614741 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d0493142-dcb7-4291-89e2-857772df4f54-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665cbtkt2\" (UID: \"d0493142-dcb7-4291-89e2-857772df4f54\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cbtkt2" Oct 07 08:32:17 crc kubenswrapper[5025]: I1007 08:32:17.616965 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6cbc6dd547-szrb6"] Oct 07 08:32:17 crc kubenswrapper[5025]: I1007 08:32:17.647045 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-2tlcp"] Oct 07 08:32:17 crc kubenswrapper[5025]: I1007 08:32:17.649089 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-7c7fc454ff-jdb47"] Oct 07 08:32:17 crc kubenswrapper[5025]: I1007 08:32:17.657808 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-2jc2q"] Oct 07 08:32:17 crc kubenswrapper[5025]: I1007 08:32:17.678314 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-rvd8k"] Oct 07 08:32:17 crc kubenswrapper[5025]: I1007 08:32:17.698485 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-hjkhm"] Oct 07 08:32:17 crc kubenswrapper[5025]: E1007 08:32:17.702946 5025 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:5f96b563a63494082323bfced089d6589e0c89db43c6a39a2e912c79b1a278fe,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jjmz7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-6cd6d7bdf5-92n2s_openstack-operators(254bc245-b889-4cb4-a787-a49298e93315): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 07 08:32:17 crc kubenswrapper[5025]: E1007 08:32:17.704421 5025 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 
5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-c24sx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-5f97d8c699-b4lx7_openstack-operators(156efee0-2e44-4494-9c13-baef0c5e45b8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 07 08:32:17 crc kubenswrapper[5025]: I1007 08:32:17.704470 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-z52jx"] Oct 07 08:32:17 crc kubenswrapper[5025]: E1007 08:32:17.705614 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-b4lx7" podUID="156efee0-2e44-4494-9c13-baef0c5e45b8" Oct 07 08:32:17 crc kubenswrapper[5025]: I1007 08:32:17.711659 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7468f855d8-mwl2b"] Oct 07 08:32:17 crc kubenswrapper[5025]: I1007 08:32:17.712841 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/infra-operator-controller-manager-658588b8c9-66frw"] Oct 07 08:32:17 crc kubenswrapper[5025]: E1007 08:32:17.721186 5025 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:b6cef68bfaacdf992a9fa1a6b03a848a48c18cbb6ed12d95561b4b37d858b99f,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7b79l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-658588b8c9-66frw_openstack-operators(d708eb23-1cca-4c1c-a1e5-7a68efa23a59): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 07 08:32:17 crc kubenswrapper[5025]: I1007 08:32:17.724352 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-8b6c49794-5wfpl"] Oct 07 08:32:17 crc kubenswrapper[5025]: E1007 08:32:17.731429 5025 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:adc23c5fd1aece2b16dc8e22ceed628f9a719455e39d3f98c77544665c6749e1,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m 
DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vnwdd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-54689d9f88-7dj22_openstack-operators(c423925e-5372-4cf2-a8ce-5d864fde501e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 07 08:32:17 crc kubenswrapper[5025]: I1007 08:32:17.739074 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-b4lx7"] Oct 07 08:32:17 crc kubenswrapper[5025]: I1007 08:32:17.763879 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cbtkt2" Oct 07 08:32:17 crc kubenswrapper[5025]: I1007 08:32:17.791585 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-8d984cc4d-94knm"] Oct 07 08:32:17 crc kubenswrapper[5025]: I1007 08:32:17.797466 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-54689d9f88-7dj22"] Oct 07 08:32:18 crc kubenswrapper[5025]: I1007 08:32:18.058359 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-b4lx7" event={"ID":"156efee0-2e44-4494-9c13-baef0c5e45b8","Type":"ContainerStarted","Data":"a077bd7f4e0fb4aa5fca521d86e941843effdf1c2b96a71c8b11c7bacaa6a966"} Oct 07 08:32:18 crc kubenswrapper[5025]: E1007 08:32:18.059912 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-b4lx7" podUID="156efee0-2e44-4494-9c13-baef0c5e45b8" Oct 07 08:32:18 crc kubenswrapper[5025]: I1007 08:32:18.060370 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-hjkhm" event={"ID":"552d8865-34b1-41e2-b755-becdee67efef","Type":"ContainerStarted","Data":"da50af660e11c3216b81ac0e13dd3f3a174095e50b490baf17db5b95a5778929"} Oct 07 08:32:18 crc kubenswrapper[5025]: I1007 08:32:18.061345 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-94knm" 
event={"ID":"a0a63794-3c14-4704-aca7-7f259b6e9292","Type":"ContainerStarted","Data":"2ef2c12a5c578500255ce5cb069713ececd35bf4ce2b27fa44f486d63f158f75"} Oct 07 08:32:18 crc kubenswrapper[5025]: I1007 08:32:18.070981 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-mwl2b" event={"ID":"43b5fc1d-13b9-4945-a93d-4c55036c69ca","Type":"ContainerStarted","Data":"ec85f22dfc3061a31a1da186df350a0f6013eba2df4105419c692db14c96db61"} Oct 07 08:32:18 crc kubenswrapper[5025]: I1007 08:32:18.074091 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-z52jx" event={"ID":"df2c5800-14f3-4112-a09c-31b0b75792d6","Type":"ContainerStarted","Data":"7b016c71c4203c3e8839b918cec02f1ab7e3e2708985b4a0050c09f5ea44d61b"} Oct 07 08:32:18 crc kubenswrapper[5025]: I1007 08:32:18.077256 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-649675d675-t9bsq" event={"ID":"46dec033-4c2a-4fd6-87fb-a877d35e258d","Type":"ContainerStarted","Data":"a98380dae82a28e5725583a87d7b61e427c3a2a270eb0e34300bc045a9da4a9d"} Oct 07 08:32:18 crc kubenswrapper[5025]: I1007 08:32:18.082732 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-rvd8k" event={"ID":"6eab8249-8174-4cfd-ab17-de2ed309f0e5","Type":"ContainerStarted","Data":"abec865d0a35bf4f72cc267871ef0bc21b77d73a9a8ea2625c39b79c4fa0df95"} Oct 07 08:32:18 crc kubenswrapper[5025]: I1007 08:32:18.085437 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-szrb6" event={"ID":"f79855b6-658f-4526-9201-08d54f47c41d","Type":"ContainerStarted","Data":"489b830f2add4ae67bd6dc014909e63f4c863cbf4b406bef5d7ec57ed45d914c"} Oct 07 08:32:18 crc kubenswrapper[5025]: I1007 08:32:18.088624 5025 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-66frw" event={"ID":"d708eb23-1cca-4c1c-a1e5-7a68efa23a59","Type":"ContainerStarted","Data":"db00a8246b4ba0786898bf921d6d2ed55632e993a4f37dd3f92c82fecb39b5f8"} Oct 07 08:32:18 crc kubenswrapper[5025]: I1007 08:32:18.090514 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-9ch9p" event={"ID":"ded88cfb-86e3-4bcf-875c-285b6b34776b","Type":"ContainerStarted","Data":"532b9a4a26a06f2968eaf91968b330670f8a9aeb3df26d7b5c71c6ed5415207e"} Oct 07 08:32:18 crc kubenswrapper[5025]: I1007 08:32:18.091814 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-8b6c49794-5wfpl" event={"ID":"c06c5dfe-d9b0-4d21-a132-79ca285655c6","Type":"ContainerStarted","Data":"9077e2741ec901f158b77df6ee7c9dbb95cee026ac3ca4caf92ad073534b503a"} Oct 07 08:32:18 crc kubenswrapper[5025]: I1007 08:32:18.095060 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-7dj22" event={"ID":"c423925e-5372-4cf2-a8ce-5d864fde501e","Type":"ContainerStarted","Data":"f307b5807c36bb2b401d2a90cb64c1cdb56e5f81f557c213dac68b860864b0cc"} Oct 07 08:32:18 crc kubenswrapper[5025]: I1007 08:32:18.096999 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-2tlcp" event={"ID":"0dc6bebc-04e7-4d9f-bf07-007411e61c71","Type":"ContainerStarted","Data":"c96492913ff147c4f23a6b695b4dae4c58fd69108d4c5739a34a5d3782733be9"} Oct 07 08:32:18 crc kubenswrapper[5025]: I1007 08:32:18.098246 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-p2fpt" event={"ID":"988dc4fe-2f1a-481a-9954-3578f833387e","Type":"ContainerStarted","Data":"4bf5d3096eac43590fe6f27efb863d9811f2e64d1fb6fbb606d128a7a6fc1b87"} Oct 07 
08:32:18 crc kubenswrapper[5025]: I1007 08:32:18.100017 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-2jc2q" event={"ID":"9e7ad814-5207-4f51-b499-537c23a9d8b2","Type":"ContainerStarted","Data":"870a44c6aa564002bf147b0608b317764586846548ca95b8743556c0f2a8fef6"} Oct 07 08:32:18 crc kubenswrapper[5025]: I1007 08:32:18.101797 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-jdb47" event={"ID":"ecc025b9-0996-46f6-9ea5-024219d094b0","Type":"ContainerStarted","Data":"a2cd5227816ba9ade6ac8e1aefb8a611d94050fec4832bc8fe008a5298d4b263"} Oct 07 08:32:18 crc kubenswrapper[5025]: I1007 08:32:18.104121 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-92n2s" event={"ID":"254bc245-b889-4cb4-a787-a49298e93315","Type":"ContainerStarted","Data":"4d44f6c2393d85a418eb104d9b6311ddc2b387740c4d79d2dcf8d095ca05f374"} Oct 07 08:32:18 crc kubenswrapper[5025]: E1007 08:32:18.110480 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-66frw" podUID="d708eb23-1cca-4c1c-a1e5-7a68efa23a59" Oct 07 08:32:18 crc kubenswrapper[5025]: E1007 08:32:18.114083 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-92n2s" podUID="254bc245-b889-4cb4-a787-a49298e93315" Oct 07 08:32:18 crc kubenswrapper[5025]: E1007 08:32:18.148198 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" 
pod="openstack-operators/placement-operator-controller-manager-54689d9f88-7dj22" podUID="c423925e-5372-4cf2-a8ce-5d864fde501e" Oct 07 08:32:18 crc kubenswrapper[5025]: I1007 08:32:18.608027 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cbtkt2"] Oct 07 08:32:19 crc kubenswrapper[5025]: I1007 08:32:19.144774 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-92n2s" event={"ID":"254bc245-b889-4cb4-a787-a49298e93315","Type":"ContainerStarted","Data":"adc79b3322c15f8c13f9815f04f471959f72a8854706a760e3bcff97ff29bc14"} Oct 07 08:32:19 crc kubenswrapper[5025]: E1007 08:32:19.147253 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:5f96b563a63494082323bfced089d6589e0c89db43c6a39a2e912c79b1a278fe\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-92n2s" podUID="254bc245-b889-4cb4-a787-a49298e93315" Oct 07 08:32:19 crc kubenswrapper[5025]: I1007 08:32:19.150812 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-7dj22" event={"ID":"c423925e-5372-4cf2-a8ce-5d864fde501e","Type":"ContainerStarted","Data":"376958f601609a530a4d91abf86a5cfa6510906685eeec04149596fc6f3ee0bd"} Oct 07 08:32:19 crc kubenswrapper[5025]: E1007 08:32:19.152738 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:adc23c5fd1aece2b16dc8e22ceed628f9a719455e39d3f98c77544665c6749e1\\\"\"" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-7dj22" podUID="c423925e-5372-4cf2-a8ce-5d864fde501e" Oct 07 
08:32:19 crc kubenswrapper[5025]: I1007 08:32:19.152923 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cbtkt2" event={"ID":"d0493142-dcb7-4291-89e2-857772df4f54","Type":"ContainerStarted","Data":"6ec5213087a5f3f45a93901634bb1628b41f1acc51ce94eff7013f7b9747d531"} Oct 07 08:32:19 crc kubenswrapper[5025]: I1007 08:32:19.159168 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-66frw" event={"ID":"d708eb23-1cca-4c1c-a1e5-7a68efa23a59","Type":"ContainerStarted","Data":"85c3e9670bce4564a53501f7cc655556a7e4f123e111fb344524125515387542"} Oct 07 08:32:19 crc kubenswrapper[5025]: E1007 08:32:19.161023 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:b6cef68bfaacdf992a9fa1a6b03a848a48c18cbb6ed12d95561b4b37d858b99f\\\"\"" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-66frw" podUID="d708eb23-1cca-4c1c-a1e5-7a68efa23a59" Oct 07 08:32:19 crc kubenswrapper[5025]: I1007 08:32:19.171164 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-8b6c49794-5wfpl" event={"ID":"c06c5dfe-d9b0-4d21-a132-79ca285655c6","Type":"ContainerStarted","Data":"505b550845d20b107c2a24f96e233fbd36b18fee63650a3806ce523eee5d5e6e"} Oct 07 08:32:19 crc kubenswrapper[5025]: I1007 08:32:19.171225 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-8b6c49794-5wfpl" event={"ID":"c06c5dfe-d9b0-4d21-a132-79ca285655c6","Type":"ContainerStarted","Data":"f1f2e958cfaed033c0172c5116469bef2ab2d5d53b751ba7f8d1de475fa77dfa"} Oct 07 08:32:19 crc kubenswrapper[5025]: I1007 08:32:19.171355 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack-operators/openstack-operator-controller-manager-8b6c49794-5wfpl" Oct 07 08:32:19 crc kubenswrapper[5025]: E1007 08:32:19.180917 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-b4lx7" podUID="156efee0-2e44-4494-9c13-baef0c5e45b8" Oct 07 08:32:19 crc kubenswrapper[5025]: I1007 08:32:19.259513 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-8b6c49794-5wfpl" podStartSLOduration=3.259444452 podStartE2EDuration="3.259444452s" podCreationTimestamp="2025-10-07 08:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:32:19.258995607 +0000 UTC m=+946.068309751" watchObservedRunningTime="2025-10-07 08:32:19.259444452 +0000 UTC m=+946.068758596" Oct 07 08:32:20 crc kubenswrapper[5025]: E1007 08:32:20.182401 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:b6cef68bfaacdf992a9fa1a6b03a848a48c18cbb6ed12d95561b4b37d858b99f\\\"\"" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-66frw" podUID="d708eb23-1cca-4c1c-a1e5-7a68efa23a59" Oct 07 08:32:20 crc kubenswrapper[5025]: E1007 08:32:20.183220 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:adc23c5fd1aece2b16dc8e22ceed628f9a719455e39d3f98c77544665c6749e1\\\"\"" 
pod="openstack-operators/placement-operator-controller-manager-54689d9f88-7dj22" podUID="c423925e-5372-4cf2-a8ce-5d864fde501e" Oct 07 08:32:20 crc kubenswrapper[5025]: E1007 08:32:20.183677 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:5f96b563a63494082323bfced089d6589e0c89db43c6a39a2e912c79b1a278fe\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-92n2s" podUID="254bc245-b889-4cb4-a787-a49298e93315" Oct 07 08:32:26 crc kubenswrapper[5025]: I1007 08:32:26.613424 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-8b6c49794-5wfpl" Oct 07 08:32:31 crc kubenswrapper[5025]: I1007 08:32:31.276093 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-94knm" event={"ID":"a0a63794-3c14-4704-aca7-7f259b6e9292","Type":"ContainerStarted","Data":"4b5f2558256c5c355fa5b58c49d05c869b43e82f1f95994e9231323b355b8b34"} Oct 07 08:32:31 crc kubenswrapper[5025]: I1007 08:32:31.283923 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-szrb6" event={"ID":"f79855b6-658f-4526-9201-08d54f47c41d","Type":"ContainerStarted","Data":"e8c2f761821221ef6bd3e8cfc659f87d945c1337c55765e3a85a687cabc03a8a"} Oct 07 08:32:31 crc kubenswrapper[5025]: I1007 08:32:31.297973 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-p2fpt" event={"ID":"988dc4fe-2f1a-481a-9954-3578f833387e","Type":"ContainerStarted","Data":"9f7ad133a0a6ed0486f6ac4e649e5512d97b9d92f5fab3dfd48efeadd762755b"} Oct 07 08:32:31 crc kubenswrapper[5025]: I1007 08:32:31.302707 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-9ch9p" event={"ID":"ded88cfb-86e3-4bcf-875c-285b6b34776b","Type":"ContainerStarted","Data":"8faf851fb4b79c10216cd937dabcbb906cd8c00070c895288ab412f8ce145ec1"} Oct 07 08:32:31 crc kubenswrapper[5025]: I1007 08:32:31.317282 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-z52jx" event={"ID":"df2c5800-14f3-4112-a09c-31b0b75792d6","Type":"ContainerStarted","Data":"0d2f0949ed47469a0726913886c6aae3a285739adb7446933789ee6fdd765937"} Oct 07 08:32:31 crc kubenswrapper[5025]: I1007 08:32:31.320203 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-2tlcp" event={"ID":"0dc6bebc-04e7-4d9f-bf07-007411e61c71","Type":"ContainerStarted","Data":"7428478fa373bf7e8bad1ea7bb189449c78b19e81e40fe1045a28ebd9d18efad"} Oct 07 08:32:31 crc kubenswrapper[5025]: I1007 08:32:31.326166 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-2jc2q" event={"ID":"9e7ad814-5207-4f51-b499-537c23a9d8b2","Type":"ContainerStarted","Data":"b89a9deeef8abe2953797d90383b545abe1be30d42b5bd68ce5ebf9aa59ff296"} Oct 07 08:32:31 crc kubenswrapper[5025]: I1007 08:32:31.341256 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-7tfgz" event={"ID":"c956fa54-5b9c-4e25-ba8d-8e0b26edc9f1","Type":"ContainerStarted","Data":"2bf24c7b787b98e795e4862e031ea2587d2a58a447d3f7fb552dc189211ffcb1"} Oct 07 08:32:31 crc kubenswrapper[5025]: I1007 08:32:31.352190 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-649675d675-t9bsq" event={"ID":"46dec033-4c2a-4fd6-87fb-a877d35e258d","Type":"ContainerStarted","Data":"911553c0fcdeeb1443f90bd22c2cea8ac01141d2e253d8ea221f0de5a0146a35"} Oct 07 08:32:31 crc 
kubenswrapper[5025]: I1007 08:32:31.372031 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-hlvw6" event={"ID":"6dc83248-702c-40eb-92ce-99f686ea1bfc","Type":"ContainerStarted","Data":"ee629b35846fe2514f6e7965747bbd07efd39fb5c76ab154d7eb963847d12b55"} Oct 07 08:32:31 crc kubenswrapper[5025]: I1007 08:32:31.380955 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-mwl2b" event={"ID":"43b5fc1d-13b9-4945-a93d-4c55036c69ca","Type":"ContainerStarted","Data":"85c872b3aa676ed9aabaa02f5c5aa361bebe585f9cf0a12dcd3b1b6d16867223"} Oct 07 08:32:31 crc kubenswrapper[5025]: I1007 08:32:31.397722 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-75k2s" event={"ID":"ffa40450-8658-4d21-b4b1-1174c69e989f","Type":"ContainerStarted","Data":"d8bc2cc6bd6710795b80ec67cde83d966d0f40644812497e8692d3c8aad8c75e"} Oct 07 08:32:31 crc kubenswrapper[5025]: I1007 08:32:31.409566 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-crcmj" event={"ID":"d9b1f3d1-a00f-45df-ae19-728a8716aaa3","Type":"ContainerStarted","Data":"6863060ffd8f320419e05d8f85e3308e75e47f60f12718bceef6eb35ee1b5f79"} Oct 07 08:32:31 crc kubenswrapper[5025]: I1007 08:32:31.428678 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-jdb47" event={"ID":"ecc025b9-0996-46f6-9ea5-024219d094b0","Type":"ContainerStarted","Data":"ae8b53e35978370e6a41c7369214638347e34b05a6da6ead71cd297b011bf9fc"} Oct 07 08:32:32 crc kubenswrapper[5025]: I1007 08:32:32.441763 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-mwl2b" 
event={"ID":"43b5fc1d-13b9-4945-a93d-4c55036c69ca","Type":"ContainerStarted","Data":"a337a915d6c39bf64988bd59207a8986fe3c6b8c36560354ef228965fd9745ab"} Oct 07 08:32:32 crc kubenswrapper[5025]: I1007 08:32:32.442664 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-mwl2b" Oct 07 08:32:32 crc kubenswrapper[5025]: I1007 08:32:32.449365 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-z52jx" event={"ID":"df2c5800-14f3-4112-a09c-31b0b75792d6","Type":"ContainerStarted","Data":"ba4e888e3cbc25e40b1351da53461add8f3dbffebc2570bc689afee510d6e137"} Oct 07 08:32:32 crc kubenswrapper[5025]: I1007 08:32:32.449438 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-z52jx" Oct 07 08:32:32 crc kubenswrapper[5025]: I1007 08:32:32.451072 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-pfd8r" event={"ID":"f22649b0-7cea-4fb0-bc66-d1708cfa5630","Type":"ContainerStarted","Data":"7e5fc080414cabf9dbdb1b7533fc92da6bc706d2d62e5e574eee69f355583e87"} Oct 07 08:32:32 crc kubenswrapper[5025]: I1007 08:32:32.451097 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-pfd8r" event={"ID":"f22649b0-7cea-4fb0-bc66-d1708cfa5630","Type":"ContainerStarted","Data":"f99ac0c83d1ca7e511fb99074e078443db3552d29faa635b8f21adaf48427c41"} Oct 07 08:32:32 crc kubenswrapper[5025]: I1007 08:32:32.451434 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-pfd8r" Oct 07 08:32:32 crc kubenswrapper[5025]: I1007 08:32:32.453573 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-crcmj" event={"ID":"d9b1f3d1-a00f-45df-ae19-728a8716aaa3","Type":"ContainerStarted","Data":"11fb565e790258e2e91d0487d7bd7349ed8b0c309adb9b764ee3b2f0d847da96"} Oct 07 08:32:32 crc kubenswrapper[5025]: I1007 08:32:32.453927 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-crcmj" Oct 07 08:32:32 crc kubenswrapper[5025]: I1007 08:32:32.461624 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-hlvw6" event={"ID":"6dc83248-702c-40eb-92ce-99f686ea1bfc","Type":"ContainerStarted","Data":"b1faa25661e12af43cc692f31968bb6fd9152abe4e967aecc9b4bf8d036d16dd"} Oct 07 08:32:32 crc kubenswrapper[5025]: I1007 08:32:32.461795 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-hlvw6" Oct 07 08:32:32 crc kubenswrapper[5025]: I1007 08:32:32.478993 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-94knm" event={"ID":"a0a63794-3c14-4704-aca7-7f259b6e9292","Type":"ContainerStarted","Data":"28cad54b8c1bca1599851357b52c3d4491f5c1510b591aee5b403802b7ead311"} Oct 07 08:32:32 crc kubenswrapper[5025]: I1007 08:32:32.479119 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-94knm" Oct 07 08:32:32 crc kubenswrapper[5025]: I1007 08:32:32.483889 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-2tlcp" event={"ID":"0dc6bebc-04e7-4d9f-bf07-007411e61c71","Type":"ContainerStarted","Data":"ef60bb09303b9b969279b82b0a6641c06faeb4dc489dd6ef0b63205699b633fa"} Oct 07 08:32:32 crc kubenswrapper[5025]: I1007 08:32:32.484251 5025 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-2tlcp" Oct 07 08:32:32 crc kubenswrapper[5025]: I1007 08:32:32.486691 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-p2fpt" event={"ID":"988dc4fe-2f1a-481a-9954-3578f833387e","Type":"ContainerStarted","Data":"40d91eb3a94c88407693590a862270343f2db77d0259864b28a08072c80f8db9"} Oct 07 08:32:32 crc kubenswrapper[5025]: I1007 08:32:32.487030 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-p2fpt" Oct 07 08:32:32 crc kubenswrapper[5025]: I1007 08:32:32.488255 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-jdb47" event={"ID":"ecc025b9-0996-46f6-9ea5-024219d094b0","Type":"ContainerStarted","Data":"eacafd6c0bf1c1f4eff67860eae2e4e5fbf2fb436aae743d0e5b92cf33089a53"} Oct 07 08:32:32 crc kubenswrapper[5025]: I1007 08:32:32.488610 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-jdb47" Oct 07 08:32:32 crc kubenswrapper[5025]: I1007 08:32:32.490407 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-mwl2b" podStartSLOduration=4.675349283 podStartE2EDuration="17.490398609s" podCreationTimestamp="2025-10-07 08:32:15 +0000 UTC" firstStartedPulling="2025-10-07 08:32:17.690459765 +0000 UTC m=+944.499773909" lastFinishedPulling="2025-10-07 08:32:30.505509091 +0000 UTC m=+957.314823235" observedRunningTime="2025-10-07 08:32:32.460986046 +0000 UTC m=+959.270300190" watchObservedRunningTime="2025-10-07 08:32:32.490398609 +0000 UTC m=+959.299712753" Oct 07 08:32:32 crc kubenswrapper[5025]: I1007 08:32:32.492928 5025 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-pfd8r" podStartSLOduration=3.592466294 podStartE2EDuration="17.492921019s" podCreationTimestamp="2025-10-07 08:32:15 +0000 UTC" firstStartedPulling="2025-10-07 08:32:16.622151983 +0000 UTC m=+943.431466127" lastFinishedPulling="2025-10-07 08:32:30.522606708 +0000 UTC m=+957.331920852" observedRunningTime="2025-10-07 08:32:32.488758738 +0000 UTC m=+959.298072882" watchObservedRunningTime="2025-10-07 08:32:32.492921019 +0000 UTC m=+959.302235163" Oct 07 08:32:32 crc kubenswrapper[5025]: I1007 08:32:32.501488 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cbtkt2" event={"ID":"d0493142-dcb7-4291-89e2-857772df4f54","Type":"ContainerStarted","Data":"8dffe204c0ecacd2f11f63644acda135d37e3177966227e15cf900c64abb503b"} Oct 07 08:32:32 crc kubenswrapper[5025]: I1007 08:32:32.501522 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cbtkt2" event={"ID":"d0493142-dcb7-4291-89e2-857772df4f54","Type":"ContainerStarted","Data":"f4f26730013fd6e3ed3e4c1cccdaa81775fc7aa64c726abf3e86e9e5ea398a2a"} Oct 07 08:32:32 crc kubenswrapper[5025]: I1007 08:32:32.501666 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cbtkt2" Oct 07 08:32:32 crc kubenswrapper[5025]: I1007 08:32:32.506670 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-z52jx" podStartSLOduration=4.766076083 podStartE2EDuration="17.50665432s" podCreationTimestamp="2025-10-07 08:32:15 +0000 UTC" firstStartedPulling="2025-10-07 08:32:17.700016745 +0000 UTC m=+944.509330889" lastFinishedPulling="2025-10-07 08:32:30.440594982 +0000 UTC m=+957.249909126" 
observedRunningTime="2025-10-07 08:32:32.505023888 +0000 UTC m=+959.314338032" watchObservedRunningTime="2025-10-07 08:32:32.50665432 +0000 UTC m=+959.315968454" Oct 07 08:32:32 crc kubenswrapper[5025]: I1007 08:32:32.512736 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-hjkhm" event={"ID":"552d8865-34b1-41e2-b755-becdee67efef","Type":"ContainerStarted","Data":"9e59ec9b83671059429b45cfe66556df88ee31e5a32710218fbd18ba56fd9043"} Oct 07 08:32:32 crc kubenswrapper[5025]: I1007 08:32:32.512774 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-hjkhm" event={"ID":"552d8865-34b1-41e2-b755-becdee67efef","Type":"ContainerStarted","Data":"5fce3dff791fbb91e2fb3da584fae8563981b4e6f90178ac67d79df55eb39417"} Oct 07 08:32:32 crc kubenswrapper[5025]: I1007 08:32:32.513295 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-hjkhm" Oct 07 08:32:32 crc kubenswrapper[5025]: I1007 08:32:32.521456 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-2jc2q" event={"ID":"9e7ad814-5207-4f51-b499-537c23a9d8b2","Type":"ContainerStarted","Data":"43270226616a9d7a8cb1838de9976fd4b16d8337d1bbc4dd38429b0778cdc518"} Oct 07 08:32:32 crc kubenswrapper[5025]: I1007 08:32:32.521985 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-2jc2q" Oct 07 08:32:32 crc kubenswrapper[5025]: I1007 08:32:32.526969 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-crcmj" podStartSLOduration=4.017776431 podStartE2EDuration="17.526954747s" podCreationTimestamp="2025-10-07 08:32:15 +0000 UTC" firstStartedPulling="2025-10-07 
08:32:16.902750396 +0000 UTC m=+943.712064540" lastFinishedPulling="2025-10-07 08:32:30.411928712 +0000 UTC m=+957.221242856" observedRunningTime="2025-10-07 08:32:32.52224977 +0000 UTC m=+959.331563904" watchObservedRunningTime="2025-10-07 08:32:32.526954747 +0000 UTC m=+959.336268881" Oct 07 08:32:32 crc kubenswrapper[5025]: I1007 08:32:32.531460 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-7tfgz" event={"ID":"c956fa54-5b9c-4e25-ba8d-8e0b26edc9f1","Type":"ContainerStarted","Data":"c2c2ddec48b889d4a111dcc986f478d7b13a10ab45517577e79452d28d3720cf"} Oct 07 08:32:32 crc kubenswrapper[5025]: I1007 08:32:32.532083 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-7tfgz" Oct 07 08:32:32 crc kubenswrapper[5025]: I1007 08:32:32.535829 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-649675d675-t9bsq" event={"ID":"46dec033-4c2a-4fd6-87fb-a877d35e258d","Type":"ContainerStarted","Data":"57624debea35ef92abf4204544c941d7ae9605dfa8be32215cbf4bdc3cea35d5"} Oct 07 08:32:32 crc kubenswrapper[5025]: I1007 08:32:32.535913 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-649675d675-t9bsq" Oct 07 08:32:32 crc kubenswrapper[5025]: I1007 08:32:32.537339 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-75k2s" event={"ID":"ffa40450-8658-4d21-b4b1-1174c69e989f","Type":"ContainerStarted","Data":"549f2e97e9495c675bf429cfcd2210d288f181df54155cd276a35ac9033fa412"} Oct 07 08:32:32 crc kubenswrapper[5025]: I1007 08:32:32.537968 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-75k2s" Oct 07 08:32:32 crc 
kubenswrapper[5025]: I1007 08:32:32.539205 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-rvd8k" event={"ID":"6eab8249-8174-4cfd-ab17-de2ed309f0e5","Type":"ContainerStarted","Data":"99a75e79cd16a4c42dca05ec17e0f82e34891071cc31eb1be45c0be3e0ff74f7"} Oct 07 08:32:32 crc kubenswrapper[5025]: I1007 08:32:32.539229 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-rvd8k" event={"ID":"6eab8249-8174-4cfd-ab17-de2ed309f0e5","Type":"ContainerStarted","Data":"058d4c43305f49b62a05935da35728c17cc8265eb0c1547a3e58eee90cac0133"} Oct 07 08:32:32 crc kubenswrapper[5025]: I1007 08:32:32.539565 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-rvd8k" Oct 07 08:32:32 crc kubenswrapper[5025]: I1007 08:32:32.546029 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-szrb6" event={"ID":"f79855b6-658f-4526-9201-08d54f47c41d","Type":"ContainerStarted","Data":"6863dbc576674df7c0f329242bd19a28893f8b5d1c0e7df7884f20433be6a26a"} Oct 07 08:32:32 crc kubenswrapper[5025]: I1007 08:32:32.546881 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-szrb6" Oct 07 08:32:32 crc kubenswrapper[5025]: I1007 08:32:32.547495 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-jdb47" podStartSLOduration=4.773725882 podStartE2EDuration="17.547479342s" podCreationTimestamp="2025-10-07 08:32:15 +0000 UTC" firstStartedPulling="2025-10-07 08:32:17.678109737 +0000 UTC m=+944.487423881" lastFinishedPulling="2025-10-07 08:32:30.451863187 +0000 UTC m=+957.261177341" observedRunningTime="2025-10-07 08:32:32.54043411 +0000 UTC 
m=+959.349748254" watchObservedRunningTime="2025-10-07 08:32:32.547479342 +0000 UTC m=+959.356793486" Oct 07 08:32:32 crc kubenswrapper[5025]: I1007 08:32:32.552661 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-9ch9p" event={"ID":"ded88cfb-86e3-4bcf-875c-285b6b34776b","Type":"ContainerStarted","Data":"7e6be11fb37f3559400c70123b8e1605127df5136cb4fffd7373c2cb3be525d7"} Oct 07 08:32:32 crc kubenswrapper[5025]: I1007 08:32:32.553155 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-9ch9p" Oct 07 08:32:32 crc kubenswrapper[5025]: I1007 08:32:32.558518 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-2tlcp" podStartSLOduration=4.745409824 podStartE2EDuration="17.558503789s" podCreationTimestamp="2025-10-07 08:32:15 +0000 UTC" firstStartedPulling="2025-10-07 08:32:17.699030464 +0000 UTC m=+944.508344608" lastFinishedPulling="2025-10-07 08:32:30.512124429 +0000 UTC m=+957.321438573" observedRunningTime="2025-10-07 08:32:32.557607141 +0000 UTC m=+959.366921285" watchObservedRunningTime="2025-10-07 08:32:32.558503789 +0000 UTC m=+959.367817933" Oct 07 08:32:32 crc kubenswrapper[5025]: I1007 08:32:32.600170 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-2jc2q" podStartSLOduration=4.833125248 podStartE2EDuration="17.600147776s" podCreationTimestamp="2025-10-07 08:32:15 +0000 UTC" firstStartedPulling="2025-10-07 08:32:17.684823518 +0000 UTC m=+944.494137662" lastFinishedPulling="2025-10-07 08:32:30.451846046 +0000 UTC m=+957.261160190" observedRunningTime="2025-10-07 08:32:32.583366019 +0000 UTC m=+959.392680173" watchObservedRunningTime="2025-10-07 08:32:32.600147776 +0000 UTC m=+959.409461920" Oct 07 08:32:32 crc 
kubenswrapper[5025]: I1007 08:32:32.623063 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-p2fpt" podStartSLOduration=4.346837716 podStartE2EDuration="17.623044635s" podCreationTimestamp="2025-10-07 08:32:15 +0000 UTC" firstStartedPulling="2025-10-07 08:32:17.228102254 +0000 UTC m=+944.037416398" lastFinishedPulling="2025-10-07 08:32:30.504309183 +0000 UTC m=+957.313623317" observedRunningTime="2025-10-07 08:32:32.619097911 +0000 UTC m=+959.428412055" watchObservedRunningTime="2025-10-07 08:32:32.623044635 +0000 UTC m=+959.432358779" Oct 07 08:32:32 crc kubenswrapper[5025]: I1007 08:32:32.655116 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-94knm" podStartSLOduration=4.948874394 podStartE2EDuration="17.655098922s" podCreationTimestamp="2025-10-07 08:32:15 +0000 UTC" firstStartedPulling="2025-10-07 08:32:17.816196444 +0000 UTC m=+944.625510588" lastFinishedPulling="2025-10-07 08:32:30.522420972 +0000 UTC m=+957.331735116" observedRunningTime="2025-10-07 08:32:32.651082196 +0000 UTC m=+959.460396340" watchObservedRunningTime="2025-10-07 08:32:32.655098922 +0000 UTC m=+959.464413066" Oct 07 08:32:32 crc kubenswrapper[5025]: I1007 08:32:32.685719 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cbtkt2" podStartSLOduration=5.767107252 podStartE2EDuration="17.685701583s" podCreationTimestamp="2025-10-07 08:32:15 +0000 UTC" firstStartedPulling="2025-10-07 08:32:18.631205741 +0000 UTC m=+945.440519885" lastFinishedPulling="2025-10-07 08:32:30.549800072 +0000 UTC m=+957.359114216" observedRunningTime="2025-10-07 08:32:32.681771629 +0000 UTC m=+959.491085773" watchObservedRunningTime="2025-10-07 08:32:32.685701583 +0000 UTC m=+959.495015727" Oct 07 08:32:32 crc 
kubenswrapper[5025]: I1007 08:32:32.704577 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-hlvw6" podStartSLOduration=4.161761964 podStartE2EDuration="17.704559885s" podCreationTimestamp="2025-10-07 08:32:15 +0000 UTC" firstStartedPulling="2025-10-07 08:32:16.96208599 +0000 UTC m=+943.771400134" lastFinishedPulling="2025-10-07 08:32:30.504883911 +0000 UTC m=+957.314198055" observedRunningTime="2025-10-07 08:32:32.698714632 +0000 UTC m=+959.508028776" watchObservedRunningTime="2025-10-07 08:32:32.704559885 +0000 UTC m=+959.513874029" Oct 07 08:32:32 crc kubenswrapper[5025]: I1007 08:32:32.747995 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-hjkhm" podStartSLOduration=4.909689134 podStartE2EDuration="17.747978819s" podCreationTimestamp="2025-10-07 08:32:15 +0000 UTC" firstStartedPulling="2025-10-07 08:32:17.684213599 +0000 UTC m=+944.493527743" lastFinishedPulling="2025-10-07 08:32:30.522503284 +0000 UTC m=+957.331817428" observedRunningTime="2025-10-07 08:32:32.724937966 +0000 UTC m=+959.534252100" watchObservedRunningTime="2025-10-07 08:32:32.747978819 +0000 UTC m=+959.557292963" Oct 07 08:32:32 crc kubenswrapper[5025]: I1007 08:32:32.749702 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-szrb6" podStartSLOduration=4.009308812 podStartE2EDuration="16.749694033s" podCreationTimestamp="2025-10-07 08:32:16 +0000 UTC" firstStartedPulling="2025-10-07 08:32:17.677786127 +0000 UTC m=+944.487100271" lastFinishedPulling="2025-10-07 08:32:30.418171348 +0000 UTC m=+957.227485492" observedRunningTime="2025-10-07 08:32:32.74514734 +0000 UTC m=+959.554461474" watchObservedRunningTime="2025-10-07 08:32:32.749694033 +0000 UTC m=+959.559008187" Oct 07 08:32:32 crc kubenswrapper[5025]: I1007 
08:32:32.761621 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-9ch9p" podStartSLOduration=4.482742184 podStartE2EDuration="17.761604517s" podCreationTimestamp="2025-10-07 08:32:15 +0000 UTC" firstStartedPulling="2025-10-07 08:32:17.22638682 +0000 UTC m=+944.035700964" lastFinishedPulling="2025-10-07 08:32:30.505249153 +0000 UTC m=+957.314563297" observedRunningTime="2025-10-07 08:32:32.760259555 +0000 UTC m=+959.569573699" watchObservedRunningTime="2025-10-07 08:32:32.761604517 +0000 UTC m=+959.570918661" Oct 07 08:32:32 crc kubenswrapper[5025]: I1007 08:32:32.786591 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-7tfgz" podStartSLOduration=4.700306148 podStartE2EDuration="17.786567321s" podCreationTimestamp="2025-10-07 08:32:15 +0000 UTC" firstStartedPulling="2025-10-07 08:32:16.377665365 +0000 UTC m=+943.186979509" lastFinishedPulling="2025-10-07 08:32:29.463926538 +0000 UTC m=+956.273240682" observedRunningTime="2025-10-07 08:32:32.781990267 +0000 UTC m=+959.591304411" watchObservedRunningTime="2025-10-07 08:32:32.786567321 +0000 UTC m=+959.595881465" Oct 07 08:32:32 crc kubenswrapper[5025]: I1007 08:32:32.802921 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-75k2s" podStartSLOduration=3.926445422 podStartE2EDuration="17.802907714s" podCreationTimestamp="2025-10-07 08:32:15 +0000 UTC" firstStartedPulling="2025-10-07 08:32:16.565199194 +0000 UTC m=+943.374513338" lastFinishedPulling="2025-10-07 08:32:30.441661496 +0000 UTC m=+957.250975630" observedRunningTime="2025-10-07 08:32:32.800479938 +0000 UTC m=+959.609794082" watchObservedRunningTime="2025-10-07 08:32:32.802907714 +0000 UTC m=+959.612221858" Oct 07 08:32:32 crc kubenswrapper[5025]: I1007 08:32:32.840238 5025 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-649675d675-t9bsq" podStartSLOduration=4.624483767 podStartE2EDuration="17.840219906s" podCreationTimestamp="2025-10-07 08:32:15 +0000 UTC" firstStartedPulling="2025-10-07 08:32:17.230663865 +0000 UTC m=+944.039978009" lastFinishedPulling="2025-10-07 08:32:30.446399984 +0000 UTC m=+957.255714148" observedRunningTime="2025-10-07 08:32:32.833241297 +0000 UTC m=+959.642555441" watchObservedRunningTime="2025-10-07 08:32:32.840219906 +0000 UTC m=+959.649534050" Oct 07 08:32:32 crc kubenswrapper[5025]: I1007 08:32:32.840440 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-rvd8k" podStartSLOduration=5.010313054 podStartE2EDuration="17.840432893s" podCreationTimestamp="2025-10-07 08:32:15 +0000 UTC" firstStartedPulling="2025-10-07 08:32:17.69093892 +0000 UTC m=+944.500253064" lastFinishedPulling="2025-10-07 08:32:30.521058759 +0000 UTC m=+957.330372903" observedRunningTime="2025-10-07 08:32:32.819365921 +0000 UTC m=+959.628680065" watchObservedRunningTime="2025-10-07 08:32:32.840432893 +0000 UTC m=+959.649747037" Oct 07 08:32:34 crc kubenswrapper[5025]: I1007 08:32:34.575511 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-b4lx7" event={"ID":"156efee0-2e44-4494-9c13-baef0c5e45b8","Type":"ContainerStarted","Data":"a6162536bdb60b29a7d70bf32cb1c6337919392f6c5cbb63a78414bf2e40b1be"} Oct 07 08:32:34 crc kubenswrapper[5025]: I1007 08:32:34.579212 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-66frw" event={"ID":"d708eb23-1cca-4c1c-a1e5-7a68efa23a59","Type":"ContainerStarted","Data":"27520ddf2ebc5c81b838393c90c0d64754bb1f553433acd52e507d9a563f674e"} Oct 07 08:32:34 crc kubenswrapper[5025]: I1007 08:32:34.605153 
5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-b4lx7" podStartSLOduration=2.336407483 podStartE2EDuration="18.605133466s" podCreationTimestamp="2025-10-07 08:32:16 +0000 UTC" firstStartedPulling="2025-10-07 08:32:17.704330951 +0000 UTC m=+944.513645095" lastFinishedPulling="2025-10-07 08:32:33.973056944 +0000 UTC m=+960.782371078" observedRunningTime="2025-10-07 08:32:34.602331678 +0000 UTC m=+961.411645822" watchObservedRunningTime="2025-10-07 08:32:34.605133466 +0000 UTC m=+961.414447610" Oct 07 08:32:34 crc kubenswrapper[5025]: I1007 08:32:34.623279 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-66frw" podStartSLOduration=3.369187593 podStartE2EDuration="19.623260866s" podCreationTimestamp="2025-10-07 08:32:15 +0000 UTC" firstStartedPulling="2025-10-07 08:32:17.721017995 +0000 UTC m=+944.530332139" lastFinishedPulling="2025-10-07 08:32:33.975091268 +0000 UTC m=+960.784405412" observedRunningTime="2025-10-07 08:32:34.618901579 +0000 UTC m=+961.428215723" watchObservedRunningTime="2025-10-07 08:32:34.623260866 +0000 UTC m=+961.432575010" Oct 07 08:32:35 crc kubenswrapper[5025]: I1007 08:32:35.787889 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-7tfgz" Oct 07 08:32:35 crc kubenswrapper[5025]: I1007 08:32:35.904632 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-hlvw6" Oct 07 08:32:35 crc kubenswrapper[5025]: I1007 08:32:35.965325 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-crcmj" Oct 07 08:32:36 crc kubenswrapper[5025]: I1007 08:32:36.007948 5025 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-649675d675-t9bsq" Oct 07 08:32:36 crc kubenswrapper[5025]: I1007 08:32:36.072863 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-9ch9p" Oct 07 08:32:36 crc kubenswrapper[5025]: I1007 08:32:36.094940 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-2tlcp" Oct 07 08:32:36 crc kubenswrapper[5025]: I1007 08:32:36.097849 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-p2fpt" Oct 07 08:32:36 crc kubenswrapper[5025]: I1007 08:32:36.220699 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-jdb47" Oct 07 08:32:36 crc kubenswrapper[5025]: I1007 08:32:36.221764 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-94knm" Oct 07 08:32:36 crc kubenswrapper[5025]: I1007 08:32:36.226372 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-mwl2b" Oct 07 08:32:36 crc kubenswrapper[5025]: I1007 08:32:36.264377 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-rvd8k" Oct 07 08:32:36 crc kubenswrapper[5025]: I1007 08:32:36.267318 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-hjkhm" Oct 07 08:32:36 crc kubenswrapper[5025]: I1007 08:32:36.302807 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-2jc2q" Oct 07 08:32:36 crc kubenswrapper[5025]: I1007 08:32:36.321084 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-z52jx" Oct 07 08:32:36 crc kubenswrapper[5025]: I1007 08:32:36.407057 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-szrb6" Oct 07 08:32:36 crc kubenswrapper[5025]: I1007 08:32:36.584461 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-66frw" Oct 07 08:32:36 crc kubenswrapper[5025]: I1007 08:32:36.616348 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-92n2s" event={"ID":"254bc245-b889-4cb4-a787-a49298e93315","Type":"ContainerStarted","Data":"379ec1f18648fe4f9b15007ceb6751dbad59d9703445e324a8745020ee6ebe2b"} Oct 07 08:32:36 crc kubenswrapper[5025]: I1007 08:32:36.617107 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-92n2s" Oct 07 08:32:36 crc kubenswrapper[5025]: I1007 08:32:36.620822 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-7dj22" event={"ID":"c423925e-5372-4cf2-a8ce-5d864fde501e","Type":"ContainerStarted","Data":"32a05b7c0c151bcbfbd53cec455f028033df230b10dac9fbfd186d08a2cc1667"} Oct 07 08:32:36 crc kubenswrapper[5025]: I1007 08:32:36.621027 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-7dj22" Oct 07 08:32:36 crc kubenswrapper[5025]: I1007 08:32:36.631811 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-92n2s" podStartSLOduration=3.389391087 podStartE2EDuration="21.631797987s" podCreationTimestamp="2025-10-07 08:32:15 +0000 UTC" firstStartedPulling="2025-10-07 08:32:17.702816603 +0000 UTC m=+944.512130747" lastFinishedPulling="2025-10-07 08:32:35.945223493 +0000 UTC m=+962.754537647" observedRunningTime="2025-10-07 08:32:36.630059951 +0000 UTC m=+963.439374095" watchObservedRunningTime="2025-10-07 08:32:36.631797987 +0000 UTC m=+963.441112131" Oct 07 08:32:36 crc kubenswrapper[5025]: I1007 08:32:36.643461 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-7dj22" podStartSLOduration=3.436253518 podStartE2EDuration="21.643448732s" podCreationTimestamp="2025-10-07 08:32:15 +0000 UTC" firstStartedPulling="2025-10-07 08:32:17.731317018 +0000 UTC m=+944.540631162" lastFinishedPulling="2025-10-07 08:32:35.938512222 +0000 UTC m=+962.747826376" observedRunningTime="2025-10-07 08:32:36.642561834 +0000 UTC m=+963.451875998" watchObservedRunningTime="2025-10-07 08:32:36.643448732 +0000 UTC m=+963.452762876" Oct 07 08:32:37 crc kubenswrapper[5025]: I1007 08:32:37.776511 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cbtkt2" Oct 07 08:32:45 crc kubenswrapper[5025]: I1007 08:32:45.812008 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-75k2s" Oct 07 08:32:45 crc kubenswrapper[5025]: I1007 08:32:45.855747 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-pfd8r" Oct 07 08:32:46 crc kubenswrapper[5025]: I1007 08:32:46.212649 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-92n2s" Oct 07 08:32:46 crc kubenswrapper[5025]: I1007 08:32:46.233692 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-7dj22" Oct 07 08:32:46 crc kubenswrapper[5025]: I1007 08:32:46.591369 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-66frw" Oct 07 08:32:55 crc kubenswrapper[5025]: I1007 08:32:55.934137 5025 patch_prober.go:28] interesting pod/machine-config-daemon-2dj2t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 08:32:55 crc kubenswrapper[5025]: I1007 08:32:55.934831 5025 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 08:33:01 crc kubenswrapper[5025]: I1007 08:33:01.175328 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-bdh62"] Oct 07 08:33:01 crc kubenswrapper[5025]: I1007 08:33:01.177391 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-bdh62" Oct 07 08:33:01 crc kubenswrapper[5025]: I1007 08:33:01.179790 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 07 08:33:01 crc kubenswrapper[5025]: I1007 08:33:01.180150 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 07 08:33:01 crc kubenswrapper[5025]: I1007 08:33:01.180928 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-bml24" Oct 07 08:33:01 crc kubenswrapper[5025]: I1007 08:33:01.182880 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 07 08:33:01 crc kubenswrapper[5025]: I1007 08:33:01.190449 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-bdh62"] Oct 07 08:33:01 crc kubenswrapper[5025]: I1007 08:33:01.232155 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-dzm4t"] Oct 07 08:33:01 crc kubenswrapper[5025]: I1007 08:33:01.233374 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-dzm4t" Oct 07 08:33:01 crc kubenswrapper[5025]: I1007 08:33:01.235813 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 07 08:33:01 crc kubenswrapper[5025]: I1007 08:33:01.249990 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-dzm4t"] Oct 07 08:33:01 crc kubenswrapper[5025]: I1007 08:33:01.292955 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3afe0ea-9725-42c0-afff-9f709579538b-config\") pod \"dnsmasq-dns-675f4bcbfc-bdh62\" (UID: \"e3afe0ea-9725-42c0-afff-9f709579538b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-bdh62" Oct 07 08:33:01 crc kubenswrapper[5025]: I1007 08:33:01.293053 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfqqm\" (UniqueName: \"kubernetes.io/projected/e3afe0ea-9725-42c0-afff-9f709579538b-kube-api-access-mfqqm\") pod \"dnsmasq-dns-675f4bcbfc-bdh62\" (UID: \"e3afe0ea-9725-42c0-afff-9f709579538b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-bdh62" Oct 07 08:33:01 crc kubenswrapper[5025]: I1007 08:33:01.394867 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfqqm\" (UniqueName: \"kubernetes.io/projected/e3afe0ea-9725-42c0-afff-9f709579538b-kube-api-access-mfqqm\") pod \"dnsmasq-dns-675f4bcbfc-bdh62\" (UID: \"e3afe0ea-9725-42c0-afff-9f709579538b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-bdh62" Oct 07 08:33:01 crc kubenswrapper[5025]: I1007 08:33:01.394915 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26ea9406-be90-4524-88fa-b158bf964acc-config\") pod \"dnsmasq-dns-78dd6ddcc-dzm4t\" (UID: \"26ea9406-be90-4524-88fa-b158bf964acc\") " pod="openstack/dnsmasq-dns-78dd6ddcc-dzm4t" Oct 
07 08:33:01 crc kubenswrapper[5025]: I1007 08:33:01.394938 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/26ea9406-be90-4524-88fa-b158bf964acc-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-dzm4t\" (UID: \"26ea9406-be90-4524-88fa-b158bf964acc\") " pod="openstack/dnsmasq-dns-78dd6ddcc-dzm4t" Oct 07 08:33:01 crc kubenswrapper[5025]: I1007 08:33:01.394965 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgnzk\" (UniqueName: \"kubernetes.io/projected/26ea9406-be90-4524-88fa-b158bf964acc-kube-api-access-cgnzk\") pod \"dnsmasq-dns-78dd6ddcc-dzm4t\" (UID: \"26ea9406-be90-4524-88fa-b158bf964acc\") " pod="openstack/dnsmasq-dns-78dd6ddcc-dzm4t" Oct 07 08:33:01 crc kubenswrapper[5025]: I1007 08:33:01.395009 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3afe0ea-9725-42c0-afff-9f709579538b-config\") pod \"dnsmasq-dns-675f4bcbfc-bdh62\" (UID: \"e3afe0ea-9725-42c0-afff-9f709579538b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-bdh62" Oct 07 08:33:01 crc kubenswrapper[5025]: I1007 08:33:01.395862 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3afe0ea-9725-42c0-afff-9f709579538b-config\") pod \"dnsmasq-dns-675f4bcbfc-bdh62\" (UID: \"e3afe0ea-9725-42c0-afff-9f709579538b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-bdh62" Oct 07 08:33:01 crc kubenswrapper[5025]: I1007 08:33:01.420238 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfqqm\" (UniqueName: \"kubernetes.io/projected/e3afe0ea-9725-42c0-afff-9f709579538b-kube-api-access-mfqqm\") pod \"dnsmasq-dns-675f4bcbfc-bdh62\" (UID: \"e3afe0ea-9725-42c0-afff-9f709579538b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-bdh62" Oct 07 08:33:01 crc kubenswrapper[5025]: I1007 
08:33:01.496091 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26ea9406-be90-4524-88fa-b158bf964acc-config\") pod \"dnsmasq-dns-78dd6ddcc-dzm4t\" (UID: \"26ea9406-be90-4524-88fa-b158bf964acc\") " pod="openstack/dnsmasq-dns-78dd6ddcc-dzm4t" Oct 07 08:33:01 crc kubenswrapper[5025]: I1007 08:33:01.496137 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/26ea9406-be90-4524-88fa-b158bf964acc-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-dzm4t\" (UID: \"26ea9406-be90-4524-88fa-b158bf964acc\") " pod="openstack/dnsmasq-dns-78dd6ddcc-dzm4t" Oct 07 08:33:01 crc kubenswrapper[5025]: I1007 08:33:01.496167 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgnzk\" (UniqueName: \"kubernetes.io/projected/26ea9406-be90-4524-88fa-b158bf964acc-kube-api-access-cgnzk\") pod \"dnsmasq-dns-78dd6ddcc-dzm4t\" (UID: \"26ea9406-be90-4524-88fa-b158bf964acc\") " pod="openstack/dnsmasq-dns-78dd6ddcc-dzm4t" Oct 07 08:33:01 crc kubenswrapper[5025]: I1007 08:33:01.496911 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/26ea9406-be90-4524-88fa-b158bf964acc-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-dzm4t\" (UID: \"26ea9406-be90-4524-88fa-b158bf964acc\") " pod="openstack/dnsmasq-dns-78dd6ddcc-dzm4t" Oct 07 08:33:01 crc kubenswrapper[5025]: I1007 08:33:01.497163 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26ea9406-be90-4524-88fa-b158bf964acc-config\") pod \"dnsmasq-dns-78dd6ddcc-dzm4t\" (UID: \"26ea9406-be90-4524-88fa-b158bf964acc\") " pod="openstack/dnsmasq-dns-78dd6ddcc-dzm4t" Oct 07 08:33:01 crc kubenswrapper[5025]: I1007 08:33:01.497624 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-bdh62" Oct 07 08:33:01 crc kubenswrapper[5025]: I1007 08:33:01.512038 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgnzk\" (UniqueName: \"kubernetes.io/projected/26ea9406-be90-4524-88fa-b158bf964acc-kube-api-access-cgnzk\") pod \"dnsmasq-dns-78dd6ddcc-dzm4t\" (UID: \"26ea9406-be90-4524-88fa-b158bf964acc\") " pod="openstack/dnsmasq-dns-78dd6ddcc-dzm4t" Oct 07 08:33:01 crc kubenswrapper[5025]: I1007 08:33:01.545970 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-dzm4t" Oct 07 08:33:01 crc kubenswrapper[5025]: I1007 08:33:01.972727 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-bdh62"] Oct 07 08:33:01 crc kubenswrapper[5025]: I1007 08:33:01.978922 5025 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 08:33:02 crc kubenswrapper[5025]: I1007 08:33:02.037209 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-dzm4t"] Oct 07 08:33:02 crc kubenswrapper[5025]: W1007 08:33:02.040968 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26ea9406_be90_4524_88fa_b158bf964acc.slice/crio-30136bc35d22a7c378152e02cec7c7c16b3ecc615c9b7837c73510ec9a6034ff WatchSource:0}: Error finding container 30136bc35d22a7c378152e02cec7c7c16b3ecc615c9b7837c73510ec9a6034ff: Status 404 returned error can't find the container with id 30136bc35d22a7c378152e02cec7c7c16b3ecc615c9b7837c73510ec9a6034ff Oct 07 08:33:02 crc kubenswrapper[5025]: I1007 08:33:02.853264 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-dzm4t" event={"ID":"26ea9406-be90-4524-88fa-b158bf964acc","Type":"ContainerStarted","Data":"30136bc35d22a7c378152e02cec7c7c16b3ecc615c9b7837c73510ec9a6034ff"} Oct 07 
08:33:02 crc kubenswrapper[5025]: I1007 08:33:02.854762 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-bdh62" event={"ID":"e3afe0ea-9725-42c0-afff-9f709579538b","Type":"ContainerStarted","Data":"9b19a93af3757d75ef80640476d1e311bfa6586dc7d4df8392df8df156b49f95"} Oct 07 08:33:03 crc kubenswrapper[5025]: I1007 08:33:03.284420 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-bdh62"] Oct 07 08:33:03 crc kubenswrapper[5025]: I1007 08:33:03.305964 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-5jzqc"] Oct 07 08:33:03 crc kubenswrapper[5025]: I1007 08:33:03.307458 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-5jzqc" Oct 07 08:33:03 crc kubenswrapper[5025]: I1007 08:33:03.316901 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-5jzqc"] Oct 07 08:33:03 crc kubenswrapper[5025]: I1007 08:33:03.432635 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2kgh\" (UniqueName: \"kubernetes.io/projected/e570be52-2eb1-405e-af00-50e094030f46-kube-api-access-c2kgh\") pod \"dnsmasq-dns-666b6646f7-5jzqc\" (UID: \"e570be52-2eb1-405e-af00-50e094030f46\") " pod="openstack/dnsmasq-dns-666b6646f7-5jzqc" Oct 07 08:33:03 crc kubenswrapper[5025]: I1007 08:33:03.432706 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e570be52-2eb1-405e-af00-50e094030f46-config\") pod \"dnsmasq-dns-666b6646f7-5jzqc\" (UID: \"e570be52-2eb1-405e-af00-50e094030f46\") " pod="openstack/dnsmasq-dns-666b6646f7-5jzqc" Oct 07 08:33:03 crc kubenswrapper[5025]: I1007 08:33:03.432795 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/e570be52-2eb1-405e-af00-50e094030f46-dns-svc\") pod \"dnsmasq-dns-666b6646f7-5jzqc\" (UID: \"e570be52-2eb1-405e-af00-50e094030f46\") " pod="openstack/dnsmasq-dns-666b6646f7-5jzqc" Oct 07 08:33:03 crc kubenswrapper[5025]: I1007 08:33:03.534254 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e570be52-2eb1-405e-af00-50e094030f46-dns-svc\") pod \"dnsmasq-dns-666b6646f7-5jzqc\" (UID: \"e570be52-2eb1-405e-af00-50e094030f46\") " pod="openstack/dnsmasq-dns-666b6646f7-5jzqc" Oct 07 08:33:03 crc kubenswrapper[5025]: I1007 08:33:03.534378 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2kgh\" (UniqueName: \"kubernetes.io/projected/e570be52-2eb1-405e-af00-50e094030f46-kube-api-access-c2kgh\") pod \"dnsmasq-dns-666b6646f7-5jzqc\" (UID: \"e570be52-2eb1-405e-af00-50e094030f46\") " pod="openstack/dnsmasq-dns-666b6646f7-5jzqc" Oct 07 08:33:03 crc kubenswrapper[5025]: I1007 08:33:03.534445 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e570be52-2eb1-405e-af00-50e094030f46-config\") pod \"dnsmasq-dns-666b6646f7-5jzqc\" (UID: \"e570be52-2eb1-405e-af00-50e094030f46\") " pod="openstack/dnsmasq-dns-666b6646f7-5jzqc" Oct 07 08:33:03 crc kubenswrapper[5025]: I1007 08:33:03.535585 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e570be52-2eb1-405e-af00-50e094030f46-dns-svc\") pod \"dnsmasq-dns-666b6646f7-5jzqc\" (UID: \"e570be52-2eb1-405e-af00-50e094030f46\") " pod="openstack/dnsmasq-dns-666b6646f7-5jzqc" Oct 07 08:33:03 crc kubenswrapper[5025]: I1007 08:33:03.535644 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e570be52-2eb1-405e-af00-50e094030f46-config\") pod \"dnsmasq-dns-666b6646f7-5jzqc\" (UID: 
\"e570be52-2eb1-405e-af00-50e094030f46\") " pod="openstack/dnsmasq-dns-666b6646f7-5jzqc" Oct 07 08:33:03 crc kubenswrapper[5025]: I1007 08:33:03.554516 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2kgh\" (UniqueName: \"kubernetes.io/projected/e570be52-2eb1-405e-af00-50e094030f46-kube-api-access-c2kgh\") pod \"dnsmasq-dns-666b6646f7-5jzqc\" (UID: \"e570be52-2eb1-405e-af00-50e094030f46\") " pod="openstack/dnsmasq-dns-666b6646f7-5jzqc" Oct 07 08:33:03 crc kubenswrapper[5025]: I1007 08:33:03.643676 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-5jzqc" Oct 07 08:33:03 crc kubenswrapper[5025]: I1007 08:33:03.966248 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-dzm4t"] Oct 07 08:33:03 crc kubenswrapper[5025]: I1007 08:33:03.988005 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-wftj8"] Oct 07 08:33:03 crc kubenswrapper[5025]: I1007 08:33:03.989338 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-wftj8" Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.008033 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-wftj8"] Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.124957 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-5jzqc"] Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.157899 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kz27n\" (UniqueName: \"kubernetes.io/projected/f6e7ac34-8558-497d-be28-0d3ad7ff31ad-kube-api-access-kz27n\") pod \"dnsmasq-dns-57d769cc4f-wftj8\" (UID: \"f6e7ac34-8558-497d-be28-0d3ad7ff31ad\") " pod="openstack/dnsmasq-dns-57d769cc4f-wftj8" Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.157962 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f6e7ac34-8558-497d-be28-0d3ad7ff31ad-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-wftj8\" (UID: \"f6e7ac34-8558-497d-be28-0d3ad7ff31ad\") " pod="openstack/dnsmasq-dns-57d769cc4f-wftj8" Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.158204 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6e7ac34-8558-497d-be28-0d3ad7ff31ad-config\") pod \"dnsmasq-dns-57d769cc4f-wftj8\" (UID: \"f6e7ac34-8558-497d-be28-0d3ad7ff31ad\") " pod="openstack/dnsmasq-dns-57d769cc4f-wftj8" Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.260246 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6e7ac34-8558-497d-be28-0d3ad7ff31ad-config\") pod \"dnsmasq-dns-57d769cc4f-wftj8\" (UID: \"f6e7ac34-8558-497d-be28-0d3ad7ff31ad\") " pod="openstack/dnsmasq-dns-57d769cc4f-wftj8" Oct 07 08:33:04 crc 
kubenswrapper[5025]: I1007 08:33:04.260374 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kz27n\" (UniqueName: \"kubernetes.io/projected/f6e7ac34-8558-497d-be28-0d3ad7ff31ad-kube-api-access-kz27n\") pod \"dnsmasq-dns-57d769cc4f-wftj8\" (UID: \"f6e7ac34-8558-497d-be28-0d3ad7ff31ad\") " pod="openstack/dnsmasq-dns-57d769cc4f-wftj8" Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.260420 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f6e7ac34-8558-497d-be28-0d3ad7ff31ad-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-wftj8\" (UID: \"f6e7ac34-8558-497d-be28-0d3ad7ff31ad\") " pod="openstack/dnsmasq-dns-57d769cc4f-wftj8" Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.261330 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6e7ac34-8558-497d-be28-0d3ad7ff31ad-config\") pod \"dnsmasq-dns-57d769cc4f-wftj8\" (UID: \"f6e7ac34-8558-497d-be28-0d3ad7ff31ad\") " pod="openstack/dnsmasq-dns-57d769cc4f-wftj8" Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.261685 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f6e7ac34-8558-497d-be28-0d3ad7ff31ad-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-wftj8\" (UID: \"f6e7ac34-8558-497d-be28-0d3ad7ff31ad\") " pod="openstack/dnsmasq-dns-57d769cc4f-wftj8" Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.277323 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kz27n\" (UniqueName: \"kubernetes.io/projected/f6e7ac34-8558-497d-be28-0d3ad7ff31ad-kube-api-access-kz27n\") pod \"dnsmasq-dns-57d769cc4f-wftj8\" (UID: \"f6e7ac34-8558-497d-be28-0d3ad7ff31ad\") " pod="openstack/dnsmasq-dns-57d769cc4f-wftj8" Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.308996 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-wftj8" Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.466200 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.467937 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.471313 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.474599 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.474899 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.475045 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.475194 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.475445 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.475577 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-9tt4z" Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.483928 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.680145 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d46577dd-b38b-4b80-ad57-577629e648b8-plugins-conf\") pod 
\"rabbitmq-server-0\" (UID: \"d46577dd-b38b-4b80-ad57-577629e648b8\") " pod="openstack/rabbitmq-server-0" Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.680232 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d46577dd-b38b-4b80-ad57-577629e648b8-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d46577dd-b38b-4b80-ad57-577629e648b8\") " pod="openstack/rabbitmq-server-0" Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.680343 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d46577dd-b38b-4b80-ad57-577629e648b8-config-data\") pod \"rabbitmq-server-0\" (UID: \"d46577dd-b38b-4b80-ad57-577629e648b8\") " pod="openstack/rabbitmq-server-0" Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.680391 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d46577dd-b38b-4b80-ad57-577629e648b8-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d46577dd-b38b-4b80-ad57-577629e648b8\") " pod="openstack/rabbitmq-server-0" Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.680478 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d46577dd-b38b-4b80-ad57-577629e648b8-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d46577dd-b38b-4b80-ad57-577629e648b8\") " pod="openstack/rabbitmq-server-0" Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.680509 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d46577dd-b38b-4b80-ad57-577629e648b8-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: 
\"d46577dd-b38b-4b80-ad57-577629e648b8\") " pod="openstack/rabbitmq-server-0" Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.680672 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54kct\" (UniqueName: \"kubernetes.io/projected/d46577dd-b38b-4b80-ad57-577629e648b8-kube-api-access-54kct\") pod \"rabbitmq-server-0\" (UID: \"d46577dd-b38b-4b80-ad57-577629e648b8\") " pod="openstack/rabbitmq-server-0" Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.680709 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"d46577dd-b38b-4b80-ad57-577629e648b8\") " pod="openstack/rabbitmq-server-0" Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.680724 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d46577dd-b38b-4b80-ad57-577629e648b8-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d46577dd-b38b-4b80-ad57-577629e648b8\") " pod="openstack/rabbitmq-server-0" Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.680750 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d46577dd-b38b-4b80-ad57-577629e648b8-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d46577dd-b38b-4b80-ad57-577629e648b8\") " pod="openstack/rabbitmq-server-0" Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.680801 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d46577dd-b38b-4b80-ad57-577629e648b8-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d46577dd-b38b-4b80-ad57-577629e648b8\") " pod="openstack/rabbitmq-server-0" Oct 07 08:33:04 
crc kubenswrapper[5025]: I1007 08:33:04.781532 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d46577dd-b38b-4b80-ad57-577629e648b8-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d46577dd-b38b-4b80-ad57-577629e648b8\") " pod="openstack/rabbitmq-server-0" Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.781583 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d46577dd-b38b-4b80-ad57-577629e648b8-config-data\") pod \"rabbitmq-server-0\" (UID: \"d46577dd-b38b-4b80-ad57-577629e648b8\") " pod="openstack/rabbitmq-server-0" Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.781605 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d46577dd-b38b-4b80-ad57-577629e648b8-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d46577dd-b38b-4b80-ad57-577629e648b8\") " pod="openstack/rabbitmq-server-0" Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.781635 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d46577dd-b38b-4b80-ad57-577629e648b8-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d46577dd-b38b-4b80-ad57-577629e648b8\") " pod="openstack/rabbitmq-server-0" Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.781656 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d46577dd-b38b-4b80-ad57-577629e648b8-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d46577dd-b38b-4b80-ad57-577629e648b8\") " pod="openstack/rabbitmq-server-0" Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.781702 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-54kct\" (UniqueName: \"kubernetes.io/projected/d46577dd-b38b-4b80-ad57-577629e648b8-kube-api-access-54kct\") pod \"rabbitmq-server-0\" (UID: \"d46577dd-b38b-4b80-ad57-577629e648b8\") " pod="openstack/rabbitmq-server-0" Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.781718 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"d46577dd-b38b-4b80-ad57-577629e648b8\") " pod="openstack/rabbitmq-server-0" Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.781733 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d46577dd-b38b-4b80-ad57-577629e648b8-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d46577dd-b38b-4b80-ad57-577629e648b8\") " pod="openstack/rabbitmq-server-0" Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.782112 5025 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"d46577dd-b38b-4b80-ad57-577629e648b8\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-server-0" Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.782369 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d46577dd-b38b-4b80-ad57-577629e648b8-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d46577dd-b38b-4b80-ad57-577629e648b8\") " pod="openstack/rabbitmq-server-0" Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.782587 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d46577dd-b38b-4b80-ad57-577629e648b8-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: 
\"d46577dd-b38b-4b80-ad57-577629e648b8\") " pod="openstack/rabbitmq-server-0" Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.783042 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d46577dd-b38b-4b80-ad57-577629e648b8-config-data\") pod \"rabbitmq-server-0\" (UID: \"d46577dd-b38b-4b80-ad57-577629e648b8\") " pod="openstack/rabbitmq-server-0" Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.784908 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d46577dd-b38b-4b80-ad57-577629e648b8-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d46577dd-b38b-4b80-ad57-577629e648b8\") " pod="openstack/rabbitmq-server-0" Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.786703 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d46577dd-b38b-4b80-ad57-577629e648b8-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d46577dd-b38b-4b80-ad57-577629e648b8\") " pod="openstack/rabbitmq-server-0" Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.788626 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d46577dd-b38b-4b80-ad57-577629e648b8-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d46577dd-b38b-4b80-ad57-577629e648b8\") " pod="openstack/rabbitmq-server-0" Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.788745 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d46577dd-b38b-4b80-ad57-577629e648b8-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d46577dd-b38b-4b80-ad57-577629e648b8\") " pod="openstack/rabbitmq-server-0" Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.789022 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d46577dd-b38b-4b80-ad57-577629e648b8-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d46577dd-b38b-4b80-ad57-577629e648b8\") " pod="openstack/rabbitmq-server-0" Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.789295 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d46577dd-b38b-4b80-ad57-577629e648b8-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d46577dd-b38b-4b80-ad57-577629e648b8\") " pod="openstack/rabbitmq-server-0" Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.790317 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d46577dd-b38b-4b80-ad57-577629e648b8-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d46577dd-b38b-4b80-ad57-577629e648b8\") " pod="openstack/rabbitmq-server-0" Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.793401 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d46577dd-b38b-4b80-ad57-577629e648b8-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d46577dd-b38b-4b80-ad57-577629e648b8\") " pod="openstack/rabbitmq-server-0" Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.801144 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54kct\" (UniqueName: \"kubernetes.io/projected/d46577dd-b38b-4b80-ad57-577629e648b8-kube-api-access-54kct\") pod \"rabbitmq-server-0\" (UID: \"d46577dd-b38b-4b80-ad57-577629e648b8\") " pod="openstack/rabbitmq-server-0" Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.808740 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d46577dd-b38b-4b80-ad57-577629e648b8-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: 
\"d46577dd-b38b-4b80-ad57-577629e648b8\") " pod="openstack/rabbitmq-server-0" Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.814719 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"d46577dd-b38b-4b80-ad57-577629e648b8\") " pod="openstack/rabbitmq-server-0" Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.878498 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-wftj8"] Oct 07 08:33:04 crc kubenswrapper[5025]: W1007 08:33:04.887425 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6e7ac34_8558_497d_be28_0d3ad7ff31ad.slice/crio-882c4573a7c8c74b74d65d9c4d404435118c576c414c37f851ad67ca85766fc6 WatchSource:0}: Error finding container 882c4573a7c8c74b74d65d9c4d404435118c576c414c37f851ad67ca85766fc6: Status 404 returned error can't find the container with id 882c4573a7c8c74b74d65d9c4d404435118c576c414c37f851ad67ca85766fc6 Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.902565 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-5jzqc" event={"ID":"e570be52-2eb1-405e-af00-50e094030f46","Type":"ContainerStarted","Data":"e16dac15495996a57eacf62936a863d3ffde7ad84e60a657eb24f03f39846313"} Oct 07 08:33:04 crc kubenswrapper[5025]: I1007 08:33:04.903478 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-wftj8" event={"ID":"f6e7ac34-8558-497d-be28-0d3ad7ff31ad","Type":"ContainerStarted","Data":"882c4573a7c8c74b74d65d9c4d404435118c576c414c37f851ad67ca85766fc6"} Oct 07 08:33:05 crc kubenswrapper[5025]: I1007 08:33:05.101114 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 08:33:05 crc kubenswrapper[5025]: I1007 08:33:05.103598 5025 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 07 08:33:05 crc kubenswrapper[5025]: I1007 08:33:05.108915 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 07 08:33:05 crc kubenswrapper[5025]: I1007 08:33:05.112243 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 07 08:33:05 crc kubenswrapper[5025]: I1007 08:33:05.112382 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 07 08:33:05 crc kubenswrapper[5025]: I1007 08:33:05.112822 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 08:33:05 crc kubenswrapper[5025]: I1007 08:33:05.113259 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 07 08:33:05 crc kubenswrapper[5025]: I1007 08:33:05.113584 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 07 08:33:05 crc kubenswrapper[5025]: I1007 08:33:05.113764 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 07 08:33:05 crc kubenswrapper[5025]: I1007 08:33:05.113916 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-tqx22" Oct 07 08:33:05 crc kubenswrapper[5025]: I1007 08:33:05.114472 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 07 08:33:05 crc kubenswrapper[5025]: I1007 08:33:05.294457 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\") " 
pod="openstack/rabbitmq-cell1-server-0" Oct 07 08:33:05 crc kubenswrapper[5025]: I1007 08:33:05.294495 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 08:33:05 crc kubenswrapper[5025]: I1007 08:33:05.294522 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 08:33:05 crc kubenswrapper[5025]: I1007 08:33:05.294559 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 08:33:05 crc kubenswrapper[5025]: I1007 08:33:05.294593 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 08:33:05 crc kubenswrapper[5025]: I1007 08:33:05.294635 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\") " 
pod="openstack/rabbitmq-cell1-server-0" Oct 07 08:33:05 crc kubenswrapper[5025]: I1007 08:33:05.294651 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 08:33:05 crc kubenswrapper[5025]: I1007 08:33:05.294671 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xprqv\" (UniqueName: \"kubernetes.io/projected/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-kube-api-access-xprqv\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 08:33:05 crc kubenswrapper[5025]: I1007 08:33:05.294685 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 08:33:05 crc kubenswrapper[5025]: I1007 08:33:05.294720 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 08:33:05 crc kubenswrapper[5025]: I1007 08:33:05.294736 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\") " 
pod="openstack/rabbitmq-cell1-server-0" Oct 07 08:33:05 crc kubenswrapper[5025]: I1007 08:33:05.398563 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 08:33:05 crc kubenswrapper[5025]: I1007 08:33:05.398615 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 08:33:05 crc kubenswrapper[5025]: I1007 08:33:05.398649 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xprqv\" (UniqueName: \"kubernetes.io/projected/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-kube-api-access-xprqv\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 08:33:05 crc kubenswrapper[5025]: I1007 08:33:05.398675 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 08:33:05 crc kubenswrapper[5025]: I1007 08:33:05.398715 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 08:33:05 crc kubenswrapper[5025]: I1007 08:33:05.398749 5025 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 08:33:05 crc kubenswrapper[5025]: I1007 08:33:05.398815 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 08:33:05 crc kubenswrapper[5025]: I1007 08:33:05.398842 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 08:33:05 crc kubenswrapper[5025]: I1007 08:33:05.398893 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 08:33:05 crc kubenswrapper[5025]: I1007 08:33:05.398944 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 08:33:05 crc kubenswrapper[5025]: I1007 08:33:05.399008 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 08:33:05 crc kubenswrapper[5025]: I1007 08:33:05.399631 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 08:33:05 crc kubenswrapper[5025]: I1007 08:33:05.400013 5025 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/rabbitmq-cell1-server-0" Oct 07 08:33:05 crc kubenswrapper[5025]: I1007 08:33:05.403163 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 08:33:05 crc kubenswrapper[5025]: I1007 08:33:05.404508 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 08:33:05 crc kubenswrapper[5025]: I1007 08:33:05.405067 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-server-conf\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 08:33:05 crc kubenswrapper[5025]: I1007 08:33:05.406430 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 08:33:05 crc kubenswrapper[5025]: I1007 08:33:05.408845 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 08:33:05 crc kubenswrapper[5025]: I1007 08:33:05.419594 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 08:33:05 crc kubenswrapper[5025]: I1007 08:33:05.421806 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xprqv\" (UniqueName: \"kubernetes.io/projected/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-kube-api-access-xprqv\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 08:33:05 crc kubenswrapper[5025]: I1007 08:33:05.425266 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 08:33:05 crc 
kubenswrapper[5025]: I1007 08:33:05.432896 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 08:33:05 crc kubenswrapper[5025]: I1007 08:33:05.437065 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 08:33:05 crc kubenswrapper[5025]: I1007 08:33:05.482815 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 07 08:33:05 crc kubenswrapper[5025]: I1007 08:33:05.597884 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 08:33:05 crc kubenswrapper[5025]: I1007 08:33:05.926842 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d46577dd-b38b-4b80-ad57-577629e648b8","Type":"ContainerStarted","Data":"b2b0b0fc44802fdc90d87c910fb92166510eeeddfd8898e51afe656526841124"} Oct 07 08:33:06 crc kubenswrapper[5025]: I1007 08:33:06.063490 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 08:33:07 crc kubenswrapper[5025]: I1007 08:33:07.997263 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.001096 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.010081 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-5kj4p" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.010465 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.011875 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.013878 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.015639 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.016405 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.030322 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.125921 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.127228 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.135698 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.141841 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.142660 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.142832 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.143181 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-twnzl" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.160765 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/007c50ab-22f3-4b11-a7e9-3890acbe7e03-operator-scripts\") pod \"openstack-galera-0\" (UID: \"007c50ab-22f3-4b11-a7e9-3890acbe7e03\") " pod="openstack/openstack-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.160812 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/007c50ab-22f3-4b11-a7e9-3890acbe7e03-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"007c50ab-22f3-4b11-a7e9-3890acbe7e03\") " pod="openstack/openstack-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.160873 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/007c50ab-22f3-4b11-a7e9-3890acbe7e03-kolla-config\") pod 
\"openstack-galera-0\" (UID: \"007c50ab-22f3-4b11-a7e9-3890acbe7e03\") " pod="openstack/openstack-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.160893 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/007c50ab-22f3-4b11-a7e9-3890acbe7e03-secrets\") pod \"openstack-galera-0\" (UID: \"007c50ab-22f3-4b11-a7e9-3890acbe7e03\") " pod="openstack/openstack-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.160926 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"007c50ab-22f3-4b11-a7e9-3890acbe7e03\") " pod="openstack/openstack-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.160974 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/007c50ab-22f3-4b11-a7e9-3890acbe7e03-config-data-generated\") pod \"openstack-galera-0\" (UID: \"007c50ab-22f3-4b11-a7e9-3890acbe7e03\") " pod="openstack/openstack-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.161004 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mz5b\" (UniqueName: \"kubernetes.io/projected/007c50ab-22f3-4b11-a7e9-3890acbe7e03-kube-api-access-5mz5b\") pod \"openstack-galera-0\" (UID: \"007c50ab-22f3-4b11-a7e9-3890acbe7e03\") " pod="openstack/openstack-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.161047 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/007c50ab-22f3-4b11-a7e9-3890acbe7e03-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"007c50ab-22f3-4b11-a7e9-3890acbe7e03\") 
" pod="openstack/openstack-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.161071 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/007c50ab-22f3-4b11-a7e9-3890acbe7e03-config-data-default\") pod \"openstack-galera-0\" (UID: \"007c50ab-22f3-4b11-a7e9-3890acbe7e03\") " pod="openstack/openstack-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.261998 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/007c50ab-22f3-4b11-a7e9-3890acbe7e03-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"007c50ab-22f3-4b11-a7e9-3890acbe7e03\") " pod="openstack/openstack-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.262040 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/007c50ab-22f3-4b11-a7e9-3890acbe7e03-config-data-default\") pod \"openstack-galera-0\" (UID: \"007c50ab-22f3-4b11-a7e9-3890acbe7e03\") " pod="openstack/openstack-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.262068 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/582efaa6-0e81-462a-813f-7ba1cf9c6fc2-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"582efaa6-0e81-462a-813f-7ba1cf9c6fc2\") " pod="openstack/openstack-cell1-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.262089 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/007c50ab-22f3-4b11-a7e9-3890acbe7e03-operator-scripts\") pod \"openstack-galera-0\" (UID: \"007c50ab-22f3-4b11-a7e9-3890acbe7e03\") " pod="openstack/openstack-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 
08:33:08.262104 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/007c50ab-22f3-4b11-a7e9-3890acbe7e03-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"007c50ab-22f3-4b11-a7e9-3890acbe7e03\") " pod="openstack/openstack-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.262128 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scnsr\" (UniqueName: \"kubernetes.io/projected/582efaa6-0e81-462a-813f-7ba1cf9c6fc2-kube-api-access-scnsr\") pod \"openstack-cell1-galera-0\" (UID: \"582efaa6-0e81-462a-813f-7ba1cf9c6fc2\") " pod="openstack/openstack-cell1-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.262145 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/582efaa6-0e81-462a-813f-7ba1cf9c6fc2-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"582efaa6-0e81-462a-813f-7ba1cf9c6fc2\") " pod="openstack/openstack-cell1-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.262169 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/582efaa6-0e81-462a-813f-7ba1cf9c6fc2-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"582efaa6-0e81-462a-813f-7ba1cf9c6fc2\") " pod="openstack/openstack-cell1-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.262196 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/007c50ab-22f3-4b11-a7e9-3890acbe7e03-secrets\") pod \"openstack-galera-0\" (UID: \"007c50ab-22f3-4b11-a7e9-3890acbe7e03\") " pod="openstack/openstack-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.262211 5025 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/007c50ab-22f3-4b11-a7e9-3890acbe7e03-kolla-config\") pod \"openstack-galera-0\" (UID: \"007c50ab-22f3-4b11-a7e9-3890acbe7e03\") " pod="openstack/openstack-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.262235 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"007c50ab-22f3-4b11-a7e9-3890acbe7e03\") " pod="openstack/openstack-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.262264 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/582efaa6-0e81-462a-813f-7ba1cf9c6fc2-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"582efaa6-0e81-462a-813f-7ba1cf9c6fc2\") " pod="openstack/openstack-cell1-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.262282 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/582efaa6-0e81-462a-813f-7ba1cf9c6fc2-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"582efaa6-0e81-462a-813f-7ba1cf9c6fc2\") " pod="openstack/openstack-cell1-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.262303 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/582efaa6-0e81-462a-813f-7ba1cf9c6fc2-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"582efaa6-0e81-462a-813f-7ba1cf9c6fc2\") " pod="openstack/openstack-cell1-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.262899 5025 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"007c50ab-22f3-4b11-a7e9-3890acbe7e03\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/openstack-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.263460 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/007c50ab-22f3-4b11-a7e9-3890acbe7e03-config-data-default\") pod \"openstack-galera-0\" (UID: \"007c50ab-22f3-4b11-a7e9-3890acbe7e03\") " pod="openstack/openstack-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.263893 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/007c50ab-22f3-4b11-a7e9-3890acbe7e03-kolla-config\") pod \"openstack-galera-0\" (UID: \"007c50ab-22f3-4b11-a7e9-3890acbe7e03\") " pod="openstack/openstack-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.263968 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/007c50ab-22f3-4b11-a7e9-3890acbe7e03-config-data-generated\") pod \"openstack-galera-0\" (UID: \"007c50ab-22f3-4b11-a7e9-3890acbe7e03\") " pod="openstack/openstack-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.264022 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"582efaa6-0e81-462a-813f-7ba1cf9c6fc2\") " pod="openstack/openstack-cell1-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.264482 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/007c50ab-22f3-4b11-a7e9-3890acbe7e03-config-data-generated\") pod \"openstack-galera-0\" 
(UID: \"007c50ab-22f3-4b11-a7e9-3890acbe7e03\") " pod="openstack/openstack-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.264885 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mz5b\" (UniqueName: \"kubernetes.io/projected/007c50ab-22f3-4b11-a7e9-3890acbe7e03-kube-api-access-5mz5b\") pod \"openstack-galera-0\" (UID: \"007c50ab-22f3-4b11-a7e9-3890acbe7e03\") " pod="openstack/openstack-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.265732 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/582efaa6-0e81-462a-813f-7ba1cf9c6fc2-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"582efaa6-0e81-462a-813f-7ba1cf9c6fc2\") " pod="openstack/openstack-cell1-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.278885 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/007c50ab-22f3-4b11-a7e9-3890acbe7e03-operator-scripts\") pod \"openstack-galera-0\" (UID: \"007c50ab-22f3-4b11-a7e9-3890acbe7e03\") " pod="openstack/openstack-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.292303 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/007c50ab-22f3-4b11-a7e9-3890acbe7e03-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"007c50ab-22f3-4b11-a7e9-3890acbe7e03\") " pod="openstack/openstack-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.292909 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/007c50ab-22f3-4b11-a7e9-3890acbe7e03-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"007c50ab-22f3-4b11-a7e9-3890acbe7e03\") " pod="openstack/openstack-galera-0" Oct 07 08:33:08 crc 
kubenswrapper[5025]: I1007 08:33:08.293073 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/007c50ab-22f3-4b11-a7e9-3890acbe7e03-secrets\") pod \"openstack-galera-0\" (UID: \"007c50ab-22f3-4b11-a7e9-3890acbe7e03\") " pod="openstack/openstack-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.301455 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"007c50ab-22f3-4b11-a7e9-3890acbe7e03\") " pod="openstack/openstack-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.301603 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mz5b\" (UniqueName: \"kubernetes.io/projected/007c50ab-22f3-4b11-a7e9-3890acbe7e03-kube-api-access-5mz5b\") pod \"openstack-galera-0\" (UID: \"007c50ab-22f3-4b11-a7e9-3890acbe7e03\") " pod="openstack/openstack-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.328777 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.350009 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.351491 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.354383 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.357106 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.359029 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-psnr5" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.367689 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/582efaa6-0e81-462a-813f-7ba1cf9c6fc2-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"582efaa6-0e81-462a-813f-7ba1cf9c6fc2\") " pod="openstack/openstack-cell1-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.367766 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scnsr\" (UniqueName: \"kubernetes.io/projected/582efaa6-0e81-462a-813f-7ba1cf9c6fc2-kube-api-access-scnsr\") pod \"openstack-cell1-galera-0\" (UID: \"582efaa6-0e81-462a-813f-7ba1cf9c6fc2\") " pod="openstack/openstack-cell1-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.367794 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/582efaa6-0e81-462a-813f-7ba1cf9c6fc2-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"582efaa6-0e81-462a-813f-7ba1cf9c6fc2\") " pod="openstack/openstack-cell1-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.367826 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/582efaa6-0e81-462a-813f-7ba1cf9c6fc2-config-data-generated\") pod 
\"openstack-cell1-galera-0\" (UID: \"582efaa6-0e81-462a-813f-7ba1cf9c6fc2\") " pod="openstack/openstack-cell1-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.367890 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/582efaa6-0e81-462a-813f-7ba1cf9c6fc2-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"582efaa6-0e81-462a-813f-7ba1cf9c6fc2\") " pod="openstack/openstack-cell1-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.367913 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/582efaa6-0e81-462a-813f-7ba1cf9c6fc2-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"582efaa6-0e81-462a-813f-7ba1cf9c6fc2\") " pod="openstack/openstack-cell1-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.367945 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"582efaa6-0e81-462a-813f-7ba1cf9c6fc2\") " pod="openstack/openstack-cell1-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.367965 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/582efaa6-0e81-462a-813f-7ba1cf9c6fc2-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"582efaa6-0e81-462a-813f-7ba1cf9c6fc2\") " pod="openstack/openstack-cell1-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.367993 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/582efaa6-0e81-462a-813f-7ba1cf9c6fc2-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"582efaa6-0e81-462a-813f-7ba1cf9c6fc2\") " 
pod="openstack/openstack-cell1-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.369763 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/582efaa6-0e81-462a-813f-7ba1cf9c6fc2-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"582efaa6-0e81-462a-813f-7ba1cf9c6fc2\") " pod="openstack/openstack-cell1-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.369812 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.371303 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/582efaa6-0e81-462a-813f-7ba1cf9c6fc2-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"582efaa6-0e81-462a-813f-7ba1cf9c6fc2\") " pod="openstack/openstack-cell1-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.372421 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/582efaa6-0e81-462a-813f-7ba1cf9c6fc2-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"582efaa6-0e81-462a-813f-7ba1cf9c6fc2\") " pod="openstack/openstack-cell1-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.372903 5025 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"582efaa6-0e81-462a-813f-7ba1cf9c6fc2\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/openstack-cell1-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.374499 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/582efaa6-0e81-462a-813f-7ba1cf9c6fc2-config-data-default\") pod 
\"openstack-cell1-galera-0\" (UID: \"582efaa6-0e81-462a-813f-7ba1cf9c6fc2\") " pod="openstack/openstack-cell1-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.374961 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/582efaa6-0e81-462a-813f-7ba1cf9c6fc2-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"582efaa6-0e81-462a-813f-7ba1cf9c6fc2\") " pod="openstack/openstack-cell1-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.376424 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/582efaa6-0e81-462a-813f-7ba1cf9c6fc2-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"582efaa6-0e81-462a-813f-7ba1cf9c6fc2\") " pod="openstack/openstack-cell1-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.414440 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scnsr\" (UniqueName: \"kubernetes.io/projected/582efaa6-0e81-462a-813f-7ba1cf9c6fc2-kube-api-access-scnsr\") pod \"openstack-cell1-galera-0\" (UID: \"582efaa6-0e81-462a-813f-7ba1cf9c6fc2\") " pod="openstack/openstack-cell1-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.423025 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/582efaa6-0e81-462a-813f-7ba1cf9c6fc2-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"582efaa6-0e81-462a-813f-7ba1cf9c6fc2\") " pod="openstack/openstack-cell1-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.430467 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"582efaa6-0e81-462a-813f-7ba1cf9c6fc2\") " pod="openstack/openstack-cell1-galera-0" Oct 07 08:33:08 crc 
kubenswrapper[5025]: I1007 08:33:08.457756 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.469271 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19d1598f-6725-40cd-99bd-5ee1bf699225-config-data\") pod \"memcached-0\" (UID: \"19d1598f-6725-40cd-99bd-5ee1bf699225\") " pod="openstack/memcached-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.469356 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/19d1598f-6725-40cd-99bd-5ee1bf699225-kolla-config\") pod \"memcached-0\" (UID: \"19d1598f-6725-40cd-99bd-5ee1bf699225\") " pod="openstack/memcached-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.469399 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19d1598f-6725-40cd-99bd-5ee1bf699225-combined-ca-bundle\") pod \"memcached-0\" (UID: \"19d1598f-6725-40cd-99bd-5ee1bf699225\") " pod="openstack/memcached-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.469430 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28nn8\" (UniqueName: \"kubernetes.io/projected/19d1598f-6725-40cd-99bd-5ee1bf699225-kube-api-access-28nn8\") pod \"memcached-0\" (UID: \"19d1598f-6725-40cd-99bd-5ee1bf699225\") " pod="openstack/memcached-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.469481 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/19d1598f-6725-40cd-99bd-5ee1bf699225-memcached-tls-certs\") pod \"memcached-0\" (UID: 
\"19d1598f-6725-40cd-99bd-5ee1bf699225\") " pod="openstack/memcached-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.571236 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19d1598f-6725-40cd-99bd-5ee1bf699225-config-data\") pod \"memcached-0\" (UID: \"19d1598f-6725-40cd-99bd-5ee1bf699225\") " pod="openstack/memcached-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.571410 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/19d1598f-6725-40cd-99bd-5ee1bf699225-kolla-config\") pod \"memcached-0\" (UID: \"19d1598f-6725-40cd-99bd-5ee1bf699225\") " pod="openstack/memcached-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.571452 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19d1598f-6725-40cd-99bd-5ee1bf699225-combined-ca-bundle\") pod \"memcached-0\" (UID: \"19d1598f-6725-40cd-99bd-5ee1bf699225\") " pod="openstack/memcached-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.571487 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28nn8\" (UniqueName: \"kubernetes.io/projected/19d1598f-6725-40cd-99bd-5ee1bf699225-kube-api-access-28nn8\") pod \"memcached-0\" (UID: \"19d1598f-6725-40cd-99bd-5ee1bf699225\") " pod="openstack/memcached-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.571509 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/19d1598f-6725-40cd-99bd-5ee1bf699225-memcached-tls-certs\") pod \"memcached-0\" (UID: \"19d1598f-6725-40cd-99bd-5ee1bf699225\") " pod="openstack/memcached-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.572180 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kolla-config\" (UniqueName: \"kubernetes.io/configmap/19d1598f-6725-40cd-99bd-5ee1bf699225-kolla-config\") pod \"memcached-0\" (UID: \"19d1598f-6725-40cd-99bd-5ee1bf699225\") " pod="openstack/memcached-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.572799 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19d1598f-6725-40cd-99bd-5ee1bf699225-config-data\") pod \"memcached-0\" (UID: \"19d1598f-6725-40cd-99bd-5ee1bf699225\") " pod="openstack/memcached-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.574592 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19d1598f-6725-40cd-99bd-5ee1bf699225-combined-ca-bundle\") pod \"memcached-0\" (UID: \"19d1598f-6725-40cd-99bd-5ee1bf699225\") " pod="openstack/memcached-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.575865 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/19d1598f-6725-40cd-99bd-5ee1bf699225-memcached-tls-certs\") pod \"memcached-0\" (UID: \"19d1598f-6725-40cd-99bd-5ee1bf699225\") " pod="openstack/memcached-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.587012 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28nn8\" (UniqueName: \"kubernetes.io/projected/19d1598f-6725-40cd-99bd-5ee1bf699225-kube-api-access-28nn8\") pod \"memcached-0\" (UID: \"19d1598f-6725-40cd-99bd-5ee1bf699225\") " pod="openstack/memcached-0" Oct 07 08:33:08 crc kubenswrapper[5025]: I1007 08:33:08.761842 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Oct 07 08:33:09 crc kubenswrapper[5025]: I1007 08:33:09.880735 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 08:33:09 crc kubenswrapper[5025]: I1007 08:33:09.882331 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 07 08:33:09 crc kubenswrapper[5025]: I1007 08:33:09.885018 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-qzzvr" Oct 07 08:33:09 crc kubenswrapper[5025]: I1007 08:33:09.899626 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 08:33:09 crc kubenswrapper[5025]: I1007 08:33:09.994561 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nknls\" (UniqueName: \"kubernetes.io/projected/77ea4bce-e477-4798-923e-ce17548441d6-kube-api-access-nknls\") pod \"kube-state-metrics-0\" (UID: \"77ea4bce-e477-4798-923e-ce17548441d6\") " pod="openstack/kube-state-metrics-0" Oct 07 08:33:10 crc kubenswrapper[5025]: I1007 08:33:10.096276 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nknls\" (UniqueName: \"kubernetes.io/projected/77ea4bce-e477-4798-923e-ce17548441d6-kube-api-access-nknls\") pod \"kube-state-metrics-0\" (UID: \"77ea4bce-e477-4798-923e-ce17548441d6\") " pod="openstack/kube-state-metrics-0" Oct 07 08:33:10 crc kubenswrapper[5025]: I1007 08:33:10.121440 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nknls\" (UniqueName: \"kubernetes.io/projected/77ea4bce-e477-4798-923e-ce17548441d6-kube-api-access-nknls\") pod \"kube-state-metrics-0\" (UID: \"77ea4bce-e477-4798-923e-ce17548441d6\") " pod="openstack/kube-state-metrics-0" Oct 07 08:33:10 crc kubenswrapper[5025]: I1007 08:33:10.203298 5025 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 07 08:33:10 crc kubenswrapper[5025]: I1007 08:33:10.968842 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e5a4ae5a-0b64-481f-a54d-2263de0eae8e","Type":"ContainerStarted","Data":"51d681999300e850220197bc7751e646ddb9d25bcc355393219c1ada22f907ed"} Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.013444 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-bxmbm"] Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.014978 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-bxmbm" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.018979 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-5fq7r" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.019213 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.019361 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.074737 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-bxmbm"] Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.095787 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-lms8w"] Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.097895 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-lms8w"] Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.098140 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-lms8w" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.185475 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b-var-lib\") pod \"ovn-controller-ovs-lms8w\" (UID: \"76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b\") " pod="openstack/ovn-controller-ovs-lms8w" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.185592 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b-var-run\") pod \"ovn-controller-ovs-lms8w\" (UID: \"76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b\") " pod="openstack/ovn-controller-ovs-lms8w" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.185631 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6889a88-68ef-4bf3-9e7e-78c6d84785ae-combined-ca-bundle\") pod \"ovn-controller-bxmbm\" (UID: \"d6889a88-68ef-4bf3-9e7e-78c6d84785ae\") " pod="openstack/ovn-controller-bxmbm" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.185663 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d6889a88-68ef-4bf3-9e7e-78c6d84785ae-var-log-ovn\") pod \"ovn-controller-bxmbm\" (UID: \"d6889a88-68ef-4bf3-9e7e-78c6d84785ae\") " pod="openstack/ovn-controller-bxmbm" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.185711 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d6889a88-68ef-4bf3-9e7e-78c6d84785ae-var-run-ovn\") pod \"ovn-controller-bxmbm\" (UID: \"d6889a88-68ef-4bf3-9e7e-78c6d84785ae\") " 
pod="openstack/ovn-controller-bxmbm" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.185740 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b-scripts\") pod \"ovn-controller-ovs-lms8w\" (UID: \"76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b\") " pod="openstack/ovn-controller-ovs-lms8w" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.186142 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b-var-log\") pod \"ovn-controller-ovs-lms8w\" (UID: \"76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b\") " pod="openstack/ovn-controller-ovs-lms8w" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.186236 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6889a88-68ef-4bf3-9e7e-78c6d84785ae-scripts\") pod \"ovn-controller-bxmbm\" (UID: \"d6889a88-68ef-4bf3-9e7e-78c6d84785ae\") " pod="openstack/ovn-controller-bxmbm" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.186265 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d6889a88-68ef-4bf3-9e7e-78c6d84785ae-var-run\") pod \"ovn-controller-bxmbm\" (UID: \"d6889a88-68ef-4bf3-9e7e-78c6d84785ae\") " pod="openstack/ovn-controller-bxmbm" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.186329 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b-etc-ovs\") pod \"ovn-controller-ovs-lms8w\" (UID: \"76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b\") " pod="openstack/ovn-controller-ovs-lms8w" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 
08:33:14.186399 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g85bb\" (UniqueName: \"kubernetes.io/projected/76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b-kube-api-access-g85bb\") pod \"ovn-controller-ovs-lms8w\" (UID: \"76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b\") " pod="openstack/ovn-controller-ovs-lms8w" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.186455 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6889a88-68ef-4bf3-9e7e-78c6d84785ae-ovn-controller-tls-certs\") pod \"ovn-controller-bxmbm\" (UID: \"d6889a88-68ef-4bf3-9e7e-78c6d84785ae\") " pod="openstack/ovn-controller-bxmbm" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.186481 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knrhf\" (UniqueName: \"kubernetes.io/projected/d6889a88-68ef-4bf3-9e7e-78c6d84785ae-kube-api-access-knrhf\") pod \"ovn-controller-bxmbm\" (UID: \"d6889a88-68ef-4bf3-9e7e-78c6d84785ae\") " pod="openstack/ovn-controller-bxmbm" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.288262 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b-var-lib\") pod \"ovn-controller-ovs-lms8w\" (UID: \"76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b\") " pod="openstack/ovn-controller-ovs-lms8w" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.288322 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b-var-run\") pod \"ovn-controller-ovs-lms8w\" (UID: \"76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b\") " pod="openstack/ovn-controller-ovs-lms8w" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.288349 5025 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6889a88-68ef-4bf3-9e7e-78c6d84785ae-combined-ca-bundle\") pod \"ovn-controller-bxmbm\" (UID: \"d6889a88-68ef-4bf3-9e7e-78c6d84785ae\") " pod="openstack/ovn-controller-bxmbm" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.288370 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d6889a88-68ef-4bf3-9e7e-78c6d84785ae-var-log-ovn\") pod \"ovn-controller-bxmbm\" (UID: \"d6889a88-68ef-4bf3-9e7e-78c6d84785ae\") " pod="openstack/ovn-controller-bxmbm" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.288390 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d6889a88-68ef-4bf3-9e7e-78c6d84785ae-var-run-ovn\") pod \"ovn-controller-bxmbm\" (UID: \"d6889a88-68ef-4bf3-9e7e-78c6d84785ae\") " pod="openstack/ovn-controller-bxmbm" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.288989 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d6889a88-68ef-4bf3-9e7e-78c6d84785ae-var-log-ovn\") pod \"ovn-controller-bxmbm\" (UID: \"d6889a88-68ef-4bf3-9e7e-78c6d84785ae\") " pod="openstack/ovn-controller-bxmbm" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.289011 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d6889a88-68ef-4bf3-9e7e-78c6d84785ae-var-run-ovn\") pod \"ovn-controller-bxmbm\" (UID: \"d6889a88-68ef-4bf3-9e7e-78c6d84785ae\") " pod="openstack/ovn-controller-bxmbm" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.288408 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b-scripts\") pod \"ovn-controller-ovs-lms8w\" (UID: \"76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b\") " pod="openstack/ovn-controller-ovs-lms8w" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.289075 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b-var-log\") pod \"ovn-controller-ovs-lms8w\" (UID: \"76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b\") " pod="openstack/ovn-controller-ovs-lms8w" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.289127 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b-var-run\") pod \"ovn-controller-ovs-lms8w\" (UID: \"76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b\") " pod="openstack/ovn-controller-ovs-lms8w" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.289176 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b-var-log\") pod \"ovn-controller-ovs-lms8w\" (UID: \"76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b\") " pod="openstack/ovn-controller-ovs-lms8w" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.289094 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6889a88-68ef-4bf3-9e7e-78c6d84785ae-scripts\") pod \"ovn-controller-bxmbm\" (UID: \"d6889a88-68ef-4bf3-9e7e-78c6d84785ae\") " pod="openstack/ovn-controller-bxmbm" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.289217 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d6889a88-68ef-4bf3-9e7e-78c6d84785ae-var-run\") pod \"ovn-controller-bxmbm\" (UID: \"d6889a88-68ef-4bf3-9e7e-78c6d84785ae\") " 
pod="openstack/ovn-controller-bxmbm" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.289241 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b-etc-ovs\") pod \"ovn-controller-ovs-lms8w\" (UID: \"76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b\") " pod="openstack/ovn-controller-ovs-lms8w" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.289285 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g85bb\" (UniqueName: \"kubernetes.io/projected/76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b-kube-api-access-g85bb\") pod \"ovn-controller-ovs-lms8w\" (UID: \"76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b\") " pod="openstack/ovn-controller-ovs-lms8w" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.289298 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d6889a88-68ef-4bf3-9e7e-78c6d84785ae-var-run\") pod \"ovn-controller-bxmbm\" (UID: \"d6889a88-68ef-4bf3-9e7e-78c6d84785ae\") " pod="openstack/ovn-controller-bxmbm" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.289488 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b-etc-ovs\") pod \"ovn-controller-ovs-lms8w\" (UID: \"76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b\") " pod="openstack/ovn-controller-ovs-lms8w" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.289300 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6889a88-68ef-4bf3-9e7e-78c6d84785ae-ovn-controller-tls-certs\") pod \"ovn-controller-bxmbm\" (UID: \"d6889a88-68ef-4bf3-9e7e-78c6d84785ae\") " pod="openstack/ovn-controller-bxmbm" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.289528 5025 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-knrhf\" (UniqueName: \"kubernetes.io/projected/d6889a88-68ef-4bf3-9e7e-78c6d84785ae-kube-api-access-knrhf\") pod \"ovn-controller-bxmbm\" (UID: \"d6889a88-68ef-4bf3-9e7e-78c6d84785ae\") " pod="openstack/ovn-controller-bxmbm" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.290018 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b-var-lib\") pod \"ovn-controller-ovs-lms8w\" (UID: \"76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b\") " pod="openstack/ovn-controller-ovs-lms8w" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.291139 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b-scripts\") pod \"ovn-controller-ovs-lms8w\" (UID: \"76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b\") " pod="openstack/ovn-controller-ovs-lms8w" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.292534 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6889a88-68ef-4bf3-9e7e-78c6d84785ae-scripts\") pod \"ovn-controller-bxmbm\" (UID: \"d6889a88-68ef-4bf3-9e7e-78c6d84785ae\") " pod="openstack/ovn-controller-bxmbm" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.297863 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6889a88-68ef-4bf3-9e7e-78c6d84785ae-combined-ca-bundle\") pod \"ovn-controller-bxmbm\" (UID: \"d6889a88-68ef-4bf3-9e7e-78c6d84785ae\") " pod="openstack/ovn-controller-bxmbm" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.299202 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6889a88-68ef-4bf3-9e7e-78c6d84785ae-ovn-controller-tls-certs\") 
pod \"ovn-controller-bxmbm\" (UID: \"d6889a88-68ef-4bf3-9e7e-78c6d84785ae\") " pod="openstack/ovn-controller-bxmbm" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.307773 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g85bb\" (UniqueName: \"kubernetes.io/projected/76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b-kube-api-access-g85bb\") pod \"ovn-controller-ovs-lms8w\" (UID: \"76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b\") " pod="openstack/ovn-controller-ovs-lms8w" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.316115 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knrhf\" (UniqueName: \"kubernetes.io/projected/d6889a88-68ef-4bf3-9e7e-78c6d84785ae-kube-api-access-knrhf\") pod \"ovn-controller-bxmbm\" (UID: \"d6889a88-68ef-4bf3-9e7e-78c6d84785ae\") " pod="openstack/ovn-controller-bxmbm" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.384202 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-bxmbm" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.431601 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-lms8w" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.475159 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.476745 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.479441 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.480485 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.481181 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.482653 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-kgsg4" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.482671 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.536364 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.595623 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8ac1818-6553-400f-91be-19b032aae626-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e8ac1818-6553-400f-91be-19b032aae626\") " pod="openstack/ovsdbserver-sb-0" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.595710 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n89z9\" (UniqueName: \"kubernetes.io/projected/e8ac1818-6553-400f-91be-19b032aae626-kube-api-access-n89z9\") pod \"ovsdbserver-sb-0\" (UID: \"e8ac1818-6553-400f-91be-19b032aae626\") " pod="openstack/ovsdbserver-sb-0" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.595768 5025 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e8ac1818-6553-400f-91be-19b032aae626\") " pod="openstack/ovsdbserver-sb-0" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.596050 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8ac1818-6553-400f-91be-19b032aae626-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e8ac1818-6553-400f-91be-19b032aae626\") " pod="openstack/ovsdbserver-sb-0" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.596478 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e8ac1818-6553-400f-91be-19b032aae626-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"e8ac1818-6553-400f-91be-19b032aae626\") " pod="openstack/ovsdbserver-sb-0" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.596582 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8ac1818-6553-400f-91be-19b032aae626-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"e8ac1818-6553-400f-91be-19b032aae626\") " pod="openstack/ovsdbserver-sb-0" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.596616 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8ac1818-6553-400f-91be-19b032aae626-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"e8ac1818-6553-400f-91be-19b032aae626\") " pod="openstack/ovsdbserver-sb-0" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.596901 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e8ac1818-6553-400f-91be-19b032aae626-config\") pod \"ovsdbserver-sb-0\" (UID: \"e8ac1818-6553-400f-91be-19b032aae626\") " pod="openstack/ovsdbserver-sb-0" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.698190 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8ac1818-6553-400f-91be-19b032aae626-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e8ac1818-6553-400f-91be-19b032aae626\") " pod="openstack/ovsdbserver-sb-0" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.698253 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n89z9\" (UniqueName: \"kubernetes.io/projected/e8ac1818-6553-400f-91be-19b032aae626-kube-api-access-n89z9\") pod \"ovsdbserver-sb-0\" (UID: \"e8ac1818-6553-400f-91be-19b032aae626\") " pod="openstack/ovsdbserver-sb-0" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.698293 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e8ac1818-6553-400f-91be-19b032aae626\") " pod="openstack/ovsdbserver-sb-0" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.698318 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8ac1818-6553-400f-91be-19b032aae626-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e8ac1818-6553-400f-91be-19b032aae626\") " pod="openstack/ovsdbserver-sb-0" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.698356 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e8ac1818-6553-400f-91be-19b032aae626-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"e8ac1818-6553-400f-91be-19b032aae626\") " 
pod="openstack/ovsdbserver-sb-0" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.698379 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8ac1818-6553-400f-91be-19b032aae626-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"e8ac1818-6553-400f-91be-19b032aae626\") " pod="openstack/ovsdbserver-sb-0" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.698394 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8ac1818-6553-400f-91be-19b032aae626-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"e8ac1818-6553-400f-91be-19b032aae626\") " pod="openstack/ovsdbserver-sb-0" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.698427 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8ac1818-6553-400f-91be-19b032aae626-config\") pod \"ovsdbserver-sb-0\" (UID: \"e8ac1818-6553-400f-91be-19b032aae626\") " pod="openstack/ovsdbserver-sb-0" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.699367 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8ac1818-6553-400f-91be-19b032aae626-config\") pod \"ovsdbserver-sb-0\" (UID: \"e8ac1818-6553-400f-91be-19b032aae626\") " pod="openstack/ovsdbserver-sb-0" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.700931 5025 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e8ac1818-6553-400f-91be-19b032aae626\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/ovsdbserver-sb-0" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.701441 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/e8ac1818-6553-400f-91be-19b032aae626-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"e8ac1818-6553-400f-91be-19b032aae626\") " pod="openstack/ovsdbserver-sb-0" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.701684 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e8ac1818-6553-400f-91be-19b032aae626-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"e8ac1818-6553-400f-91be-19b032aae626\") " pod="openstack/ovsdbserver-sb-0" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.703813 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8ac1818-6553-400f-91be-19b032aae626-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e8ac1818-6553-400f-91be-19b032aae626\") " pod="openstack/ovsdbserver-sb-0" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.704758 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8ac1818-6553-400f-91be-19b032aae626-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"e8ac1818-6553-400f-91be-19b032aae626\") " pod="openstack/ovsdbserver-sb-0" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.711040 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8ac1818-6553-400f-91be-19b032aae626-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e8ac1818-6553-400f-91be-19b032aae626\") " pod="openstack/ovsdbserver-sb-0" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.717034 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n89z9\" (UniqueName: \"kubernetes.io/projected/e8ac1818-6553-400f-91be-19b032aae626-kube-api-access-n89z9\") pod \"ovsdbserver-sb-0\" (UID: \"e8ac1818-6553-400f-91be-19b032aae626\") " 
pod="openstack/ovsdbserver-sb-0" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.730583 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e8ac1818-6553-400f-91be-19b032aae626\") " pod="openstack/ovsdbserver-sb-0" Oct 07 08:33:14 crc kubenswrapper[5025]: I1007 08:33:14.803813 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 07 08:33:16 crc kubenswrapper[5025]: I1007 08:33:16.872871 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 07 08:33:16 crc kubenswrapper[5025]: I1007 08:33:16.875461 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 07 08:33:16 crc kubenswrapper[5025]: I1007 08:33:16.877405 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-wj658" Oct 07 08:33:16 crc kubenswrapper[5025]: I1007 08:33:16.881779 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 07 08:33:16 crc kubenswrapper[5025]: I1007 08:33:16.884220 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 07 08:33:16 crc kubenswrapper[5025]: I1007 08:33:16.885520 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 07 08:33:16 crc kubenswrapper[5025]: I1007 08:33:16.885598 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 07 08:33:16 crc kubenswrapper[5025]: I1007 08:33:16.938497 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8e01de5b-7e9d-4f70-91d9-d55165a8eb32-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"8e01de5b-7e9d-4f70-91d9-d55165a8eb32\") " pod="openstack/ovsdbserver-nb-0" Oct 07 08:33:16 crc kubenswrapper[5025]: I1007 08:33:16.938570 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e01de5b-7e9d-4f70-91d9-d55165a8eb32-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"8e01de5b-7e9d-4f70-91d9-d55165a8eb32\") " pod="openstack/ovsdbserver-nb-0" Oct 07 08:33:16 crc kubenswrapper[5025]: I1007 08:33:16.938646 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d64s2\" (UniqueName: \"kubernetes.io/projected/8e01de5b-7e9d-4f70-91d9-d55165a8eb32-kube-api-access-d64s2\") pod \"ovsdbserver-nb-0\" (UID: \"8e01de5b-7e9d-4f70-91d9-d55165a8eb32\") " pod="openstack/ovsdbserver-nb-0" Oct 07 08:33:16 crc kubenswrapper[5025]: I1007 08:33:16.938671 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"8e01de5b-7e9d-4f70-91d9-d55165a8eb32\") " pod="openstack/ovsdbserver-nb-0" Oct 07 08:33:16 crc kubenswrapper[5025]: I1007 08:33:16.938709 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e01de5b-7e9d-4f70-91d9-d55165a8eb32-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"8e01de5b-7e9d-4f70-91d9-d55165a8eb32\") " pod="openstack/ovsdbserver-nb-0" Oct 07 08:33:16 crc kubenswrapper[5025]: I1007 08:33:16.938865 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8e01de5b-7e9d-4f70-91d9-d55165a8eb32-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"8e01de5b-7e9d-4f70-91d9-d55165a8eb32\") " pod="openstack/ovsdbserver-nb-0" Oct 07 08:33:16 crc kubenswrapper[5025]: I1007 08:33:16.938912 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e01de5b-7e9d-4f70-91d9-d55165a8eb32-config\") pod \"ovsdbserver-nb-0\" (UID: \"8e01de5b-7e9d-4f70-91d9-d55165a8eb32\") " pod="openstack/ovsdbserver-nb-0" Oct 07 08:33:16 crc kubenswrapper[5025]: I1007 08:33:16.938933 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8e01de5b-7e9d-4f70-91d9-d55165a8eb32-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"8e01de5b-7e9d-4f70-91d9-d55165a8eb32\") " pod="openstack/ovsdbserver-nb-0" Oct 07 08:33:17 crc kubenswrapper[5025]: I1007 08:33:17.041459 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e01de5b-7e9d-4f70-91d9-d55165a8eb32-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"8e01de5b-7e9d-4f70-91d9-d55165a8eb32\") " pod="openstack/ovsdbserver-nb-0" Oct 07 08:33:17 crc kubenswrapper[5025]: I1007 08:33:17.041579 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e01de5b-7e9d-4f70-91d9-d55165a8eb32-config\") pod \"ovsdbserver-nb-0\" (UID: \"8e01de5b-7e9d-4f70-91d9-d55165a8eb32\") " pod="openstack/ovsdbserver-nb-0" Oct 07 08:33:17 crc kubenswrapper[5025]: I1007 08:33:17.041618 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8e01de5b-7e9d-4f70-91d9-d55165a8eb32-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"8e01de5b-7e9d-4f70-91d9-d55165a8eb32\") " 
pod="openstack/ovsdbserver-nb-0" Oct 07 08:33:17 crc kubenswrapper[5025]: I1007 08:33:17.041716 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e01de5b-7e9d-4f70-91d9-d55165a8eb32-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"8e01de5b-7e9d-4f70-91d9-d55165a8eb32\") " pod="openstack/ovsdbserver-nb-0" Oct 07 08:33:17 crc kubenswrapper[5025]: I1007 08:33:17.042480 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8e01de5b-7e9d-4f70-91d9-d55165a8eb32-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"8e01de5b-7e9d-4f70-91d9-d55165a8eb32\") " pod="openstack/ovsdbserver-nb-0" Oct 07 08:33:17 crc kubenswrapper[5025]: I1007 08:33:17.042738 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e01de5b-7e9d-4f70-91d9-d55165a8eb32-config\") pod \"ovsdbserver-nb-0\" (UID: \"8e01de5b-7e9d-4f70-91d9-d55165a8eb32\") " pod="openstack/ovsdbserver-nb-0" Oct 07 08:33:17 crc kubenswrapper[5025]: I1007 08:33:17.043934 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e01de5b-7e9d-4f70-91d9-d55165a8eb32-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"8e01de5b-7e9d-4f70-91d9-d55165a8eb32\") " pod="openstack/ovsdbserver-nb-0" Oct 07 08:33:17 crc kubenswrapper[5025]: I1007 08:33:17.044940 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e01de5b-7e9d-4f70-91d9-d55165a8eb32-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"8e01de5b-7e9d-4f70-91d9-d55165a8eb32\") " pod="openstack/ovsdbserver-nb-0" Oct 07 08:33:17 crc kubenswrapper[5025]: I1007 08:33:17.045225 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d64s2\" (UniqueName: 
\"kubernetes.io/projected/8e01de5b-7e9d-4f70-91d9-d55165a8eb32-kube-api-access-d64s2\") pod \"ovsdbserver-nb-0\" (UID: \"8e01de5b-7e9d-4f70-91d9-d55165a8eb32\") " pod="openstack/ovsdbserver-nb-0" Oct 07 08:33:17 crc kubenswrapper[5025]: I1007 08:33:17.045293 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"8e01de5b-7e9d-4f70-91d9-d55165a8eb32\") " pod="openstack/ovsdbserver-nb-0" Oct 07 08:33:17 crc kubenswrapper[5025]: I1007 08:33:17.045486 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e01de5b-7e9d-4f70-91d9-d55165a8eb32-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"8e01de5b-7e9d-4f70-91d9-d55165a8eb32\") " pod="openstack/ovsdbserver-nb-0" Oct 07 08:33:17 crc kubenswrapper[5025]: I1007 08:33:17.045825 5025 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"8e01de5b-7e9d-4f70-91d9-d55165a8eb32\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-nb-0" Oct 07 08:33:17 crc kubenswrapper[5025]: I1007 08:33:17.047226 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e01de5b-7e9d-4f70-91d9-d55165a8eb32-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"8e01de5b-7e9d-4f70-91d9-d55165a8eb32\") " pod="openstack/ovsdbserver-nb-0" Oct 07 08:33:17 crc kubenswrapper[5025]: I1007 08:33:17.048123 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e01de5b-7e9d-4f70-91d9-d55165a8eb32-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"8e01de5b-7e9d-4f70-91d9-d55165a8eb32\") " 
pod="openstack/ovsdbserver-nb-0" Oct 07 08:33:17 crc kubenswrapper[5025]: I1007 08:33:17.054651 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e01de5b-7e9d-4f70-91d9-d55165a8eb32-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"8e01de5b-7e9d-4f70-91d9-d55165a8eb32\") " pod="openstack/ovsdbserver-nb-0" Oct 07 08:33:17 crc kubenswrapper[5025]: I1007 08:33:17.068734 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d64s2\" (UniqueName: \"kubernetes.io/projected/8e01de5b-7e9d-4f70-91d9-d55165a8eb32-kube-api-access-d64s2\") pod \"ovsdbserver-nb-0\" (UID: \"8e01de5b-7e9d-4f70-91d9-d55165a8eb32\") " pod="openstack/ovsdbserver-nb-0" Oct 07 08:33:17 crc kubenswrapper[5025]: I1007 08:33:17.071039 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"8e01de5b-7e9d-4f70-91d9-d55165a8eb32\") " pod="openstack/ovsdbserver-nb-0" Oct 07 08:33:17 crc kubenswrapper[5025]: I1007 08:33:17.203604 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 07 08:33:19 crc kubenswrapper[5025]: E1007 08:33:19.864944 5025 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 07 08:33:19 crc kubenswrapper[5025]: E1007 08:33:19.865352 5025 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kz27n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&
SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-wftj8_openstack(f6e7ac34-8558-497d-be28-0d3ad7ff31ad): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 08:33:19 crc kubenswrapper[5025]: E1007 08:33:19.866549 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-wftj8" podUID="f6e7ac34-8558-497d-be28-0d3ad7ff31ad" Oct 07 08:33:19 crc kubenswrapper[5025]: E1007 08:33:19.884111 5025 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 07 08:33:19 crc kubenswrapper[5025]: E1007 08:33:19.884246 5025 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mfqqm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-bdh62_openstack(e3afe0ea-9725-42c0-afff-9f709579538b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 08:33:19 crc kubenswrapper[5025]: E1007 08:33:19.884632 5025 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 07 08:33:19 crc kubenswrapper[5025]: E1007 08:33:19.884718 5025 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cgnzk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:ni
l,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-dzm4t_openstack(26ea9406-be90-4524-88fa-b158bf964acc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 08:33:19 crc kubenswrapper[5025]: E1007 08:33:19.885746 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-bdh62" podUID="e3afe0ea-9725-42c0-afff-9f709579538b" Oct 07 08:33:19 crc kubenswrapper[5025]: E1007 08:33:19.885855 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-dzm4t" podUID="26ea9406-be90-4524-88fa-b158bf964acc" Oct 07 08:33:19 crc kubenswrapper[5025]: E1007 08:33:19.896668 5025 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 07 08:33:19 crc kubenswrapper[5025]: E1007 08:33:19.896793 5025 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c2kgh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-5jzqc_openstack(e570be52-2eb1-405e-af00-50e094030f46): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 08:33:19 crc kubenswrapper[5025]: E1007 08:33:19.898474 5025 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-5jzqc" podUID="e570be52-2eb1-405e-af00-50e094030f46" Oct 07 08:33:20 crc kubenswrapper[5025]: E1007 08:33:20.042480 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-5jzqc" podUID="e570be52-2eb1-405e-af00-50e094030f46" Oct 07 08:33:20 crc kubenswrapper[5025]: E1007 08:33:20.043353 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-wftj8" podUID="f6e7ac34-8558-497d-be28-0d3ad7ff31ad" Oct 07 08:33:21 crc kubenswrapper[5025]: I1007 08:33:21.055160 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-bdh62" event={"ID":"e3afe0ea-9725-42c0-afff-9f709579538b","Type":"ContainerDied","Data":"9b19a93af3757d75ef80640476d1e311bfa6586dc7d4df8392df8df156b49f95"} Oct 07 08:33:21 crc kubenswrapper[5025]: I1007 08:33:21.055593 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b19a93af3757d75ef80640476d1e311bfa6586dc7d4df8392df8df156b49f95" Oct 07 08:33:21 crc kubenswrapper[5025]: I1007 08:33:21.055200 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-dzm4t" Oct 07 08:33:21 crc kubenswrapper[5025]: I1007 08:33:21.057765 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-dzm4t" event={"ID":"26ea9406-be90-4524-88fa-b158bf964acc","Type":"ContainerDied","Data":"30136bc35d22a7c378152e02cec7c7c16b3ecc615c9b7837c73510ec9a6034ff"} Oct 07 08:33:21 crc kubenswrapper[5025]: I1007 08:33:21.075181 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-bdh62" Oct 07 08:33:21 crc kubenswrapper[5025]: I1007 08:33:21.110595 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26ea9406-be90-4524-88fa-b158bf964acc-config\") pod \"26ea9406-be90-4524-88fa-b158bf964acc\" (UID: \"26ea9406-be90-4524-88fa-b158bf964acc\") " Oct 07 08:33:21 crc kubenswrapper[5025]: I1007 08:33:21.110685 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/26ea9406-be90-4524-88fa-b158bf964acc-dns-svc\") pod \"26ea9406-be90-4524-88fa-b158bf964acc\" (UID: \"26ea9406-be90-4524-88fa-b158bf964acc\") " Oct 07 08:33:21 crc kubenswrapper[5025]: I1007 08:33:21.110724 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgnzk\" (UniqueName: \"kubernetes.io/projected/26ea9406-be90-4524-88fa-b158bf964acc-kube-api-access-cgnzk\") pod \"26ea9406-be90-4524-88fa-b158bf964acc\" (UID: \"26ea9406-be90-4524-88fa-b158bf964acc\") " Oct 07 08:33:21 crc kubenswrapper[5025]: I1007 08:33:21.111231 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26ea9406-be90-4524-88fa-b158bf964acc-config" (OuterVolumeSpecName: "config") pod "26ea9406-be90-4524-88fa-b158bf964acc" (UID: "26ea9406-be90-4524-88fa-b158bf964acc"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:33:21 crc kubenswrapper[5025]: I1007 08:33:21.111690 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26ea9406-be90-4524-88fa-b158bf964acc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "26ea9406-be90-4524-88fa-b158bf964acc" (UID: "26ea9406-be90-4524-88fa-b158bf964acc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:33:21 crc kubenswrapper[5025]: I1007 08:33:21.112392 5025 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26ea9406-be90-4524-88fa-b158bf964acc-config\") on node \"crc\" DevicePath \"\"" Oct 07 08:33:21 crc kubenswrapper[5025]: I1007 08:33:21.112423 5025 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/26ea9406-be90-4524-88fa-b158bf964acc-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 08:33:21 crc kubenswrapper[5025]: I1007 08:33:21.139281 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26ea9406-be90-4524-88fa-b158bf964acc-kube-api-access-cgnzk" (OuterVolumeSpecName: "kube-api-access-cgnzk") pod "26ea9406-be90-4524-88fa-b158bf964acc" (UID: "26ea9406-be90-4524-88fa-b158bf964acc"). InnerVolumeSpecName "kube-api-access-cgnzk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:33:21 crc kubenswrapper[5025]: I1007 08:33:21.214012 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfqqm\" (UniqueName: \"kubernetes.io/projected/e3afe0ea-9725-42c0-afff-9f709579538b-kube-api-access-mfqqm\") pod \"e3afe0ea-9725-42c0-afff-9f709579538b\" (UID: \"e3afe0ea-9725-42c0-afff-9f709579538b\") " Oct 07 08:33:21 crc kubenswrapper[5025]: I1007 08:33:21.214089 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3afe0ea-9725-42c0-afff-9f709579538b-config\") pod \"e3afe0ea-9725-42c0-afff-9f709579538b\" (UID: \"e3afe0ea-9725-42c0-afff-9f709579538b\") " Oct 07 08:33:21 crc kubenswrapper[5025]: I1007 08:33:21.214455 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgnzk\" (UniqueName: \"kubernetes.io/projected/26ea9406-be90-4524-88fa-b158bf964acc-kube-api-access-cgnzk\") on node \"crc\" DevicePath \"\"" Oct 07 08:33:21 crc kubenswrapper[5025]: I1007 08:33:21.215094 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3afe0ea-9725-42c0-afff-9f709579538b-config" (OuterVolumeSpecName: "config") pod "e3afe0ea-9725-42c0-afff-9f709579538b" (UID: "e3afe0ea-9725-42c0-afff-9f709579538b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:33:21 crc kubenswrapper[5025]: I1007 08:33:21.223913 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3afe0ea-9725-42c0-afff-9f709579538b-kube-api-access-mfqqm" (OuterVolumeSpecName: "kube-api-access-mfqqm") pod "e3afe0ea-9725-42c0-afff-9f709579538b" (UID: "e3afe0ea-9725-42c0-afff-9f709579538b"). InnerVolumeSpecName "kube-api-access-mfqqm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:33:21 crc kubenswrapper[5025]: I1007 08:33:21.242426 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 07 08:33:21 crc kubenswrapper[5025]: I1007 08:33:21.316161 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfqqm\" (UniqueName: \"kubernetes.io/projected/e3afe0ea-9725-42c0-afff-9f709579538b-kube-api-access-mfqqm\") on node \"crc\" DevicePath \"\"" Oct 07 08:33:21 crc kubenswrapper[5025]: I1007 08:33:21.316195 5025 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3afe0ea-9725-42c0-afff-9f709579538b-config\") on node \"crc\" DevicePath \"\"" Oct 07 08:33:21 crc kubenswrapper[5025]: I1007 08:33:21.482453 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 07 08:33:21 crc kubenswrapper[5025]: I1007 08:33:21.507897 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-bxmbm"] Oct 07 08:33:21 crc kubenswrapper[5025]: I1007 08:33:21.602188 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 08:33:21 crc kubenswrapper[5025]: I1007 08:33:21.621475 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 07 08:33:21 crc kubenswrapper[5025]: W1007 08:33:21.624417 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod007c50ab_22f3_4b11_a7e9_3890acbe7e03.slice/crio-3bf4a6b44cf327941278ecd9e4d8995e1457287a1f40777d3444505e068c5c94 WatchSource:0}: Error finding container 3bf4a6b44cf327941278ecd9e4d8995e1457287a1f40777d3444505e068c5c94: Status 404 returned error can't find the container with id 3bf4a6b44cf327941278ecd9e4d8995e1457287a1f40777d3444505e068c5c94 Oct 07 08:33:21 crc kubenswrapper[5025]: I1007 08:33:21.699946 5025 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 07 08:33:21 crc kubenswrapper[5025]: I1007 08:33:21.793531 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-lms8w"] Oct 07 08:33:21 crc kubenswrapper[5025]: W1007 08:33:21.867716 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8ac1818_6553_400f_91be_19b032aae626.slice/crio-86561299917d5ef8ffba02ecb55fb4e8d3c4885b4082318aebefbe817c0e689e WatchSource:0}: Error finding container 86561299917d5ef8ffba02ecb55fb4e8d3c4885b4082318aebefbe817c0e689e: Status 404 returned error can't find the container with id 86561299917d5ef8ffba02ecb55fb4e8d3c4885b4082318aebefbe817c0e689e Oct 07 08:33:21 crc kubenswrapper[5025]: W1007 08:33:21.967359 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76d5c1b3_b23a_4ee4_ac74_fd5c1c075a9b.slice/crio-38b34840d30687489771c62719a5b494f8356d6c64c64ddd50a27149c0dd60d7 WatchSource:0}: Error finding container 38b34840d30687489771c62719a5b494f8356d6c64c64ddd50a27149c0dd60d7: Status 404 returned error can't find the container with id 38b34840d30687489771c62719a5b494f8356d6c64c64ddd50a27149c0dd60d7 Oct 07 08:33:22 crc kubenswrapper[5025]: I1007 08:33:22.068255 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-lms8w" event={"ID":"76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b","Type":"ContainerStarted","Data":"38b34840d30687489771c62719a5b494f8356d6c64c64ddd50a27149c0dd60d7"} Oct 07 08:33:22 crc kubenswrapper[5025]: I1007 08:33:22.069437 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"77ea4bce-e477-4798-923e-ce17548441d6","Type":"ContainerStarted","Data":"d4bdc36363937919e754b21a3df0cb1680442793d62910b32e3e2b66f02b7441"} Oct 07 08:33:22 crc kubenswrapper[5025]: I1007 08:33:22.070509 5025 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-bxmbm" event={"ID":"d6889a88-68ef-4bf3-9e7e-78c6d84785ae","Type":"ContainerStarted","Data":"8792842dcf47fbb891bdbacc6049db8d02de8a61256b9cb490a00a4e294f90ad"} Oct 07 08:33:22 crc kubenswrapper[5025]: I1007 08:33:22.073650 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"582efaa6-0e81-462a-813f-7ba1cf9c6fc2","Type":"ContainerStarted","Data":"c1f90c6746a1f69ff4d80247adf14f633f0bcf5b531a2c02f27a420ab2cd3136"} Oct 07 08:33:22 crc kubenswrapper[5025]: I1007 08:33:22.074752 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e8ac1818-6553-400f-91be-19b032aae626","Type":"ContainerStarted","Data":"86561299917d5ef8ffba02ecb55fb4e8d3c4885b4082318aebefbe817c0e689e"} Oct 07 08:33:22 crc kubenswrapper[5025]: I1007 08:33:22.079461 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"19d1598f-6725-40cd-99bd-5ee1bf699225","Type":"ContainerStarted","Data":"306521de0744c0f811b42c62aad2c297f03a0e0caa58bd0db54e2287bd51da46"} Oct 07 08:33:22 crc kubenswrapper[5025]: I1007 08:33:22.080760 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"007c50ab-22f3-4b11-a7e9-3890acbe7e03","Type":"ContainerStarted","Data":"3bf4a6b44cf327941278ecd9e4d8995e1457287a1f40777d3444505e068c5c94"} Oct 07 08:33:22 crc kubenswrapper[5025]: I1007 08:33:22.080811 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-bdh62" Oct 07 08:33:22 crc kubenswrapper[5025]: I1007 08:33:22.080859 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-dzm4t" Oct 07 08:33:22 crc kubenswrapper[5025]: I1007 08:33:22.242979 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-dzm4t"] Oct 07 08:33:22 crc kubenswrapper[5025]: I1007 08:33:22.257349 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-dzm4t"] Oct 07 08:33:22 crc kubenswrapper[5025]: I1007 08:33:22.271495 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-bdh62"] Oct 07 08:33:22 crc kubenswrapper[5025]: I1007 08:33:22.275721 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-bdh62"] Oct 07 08:33:22 crc kubenswrapper[5025]: I1007 08:33:22.799081 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 07 08:33:23 crc kubenswrapper[5025]: I1007 08:33:23.093110 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"8e01de5b-7e9d-4f70-91d9-d55165a8eb32","Type":"ContainerStarted","Data":"3945860612bcb71718e4cac07ea29b55c58ff38a86f4e0bb00a3d7968363e875"} Oct 07 08:33:23 crc kubenswrapper[5025]: I1007 08:33:23.954696 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26ea9406-be90-4524-88fa-b158bf964acc" path="/var/lib/kubelet/pods/26ea9406-be90-4524-88fa-b158bf964acc/volumes" Oct 07 08:33:23 crc kubenswrapper[5025]: I1007 08:33:23.955085 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3afe0ea-9725-42c0-afff-9f709579538b" path="/var/lib/kubelet/pods/e3afe0ea-9725-42c0-afff-9f709579538b/volumes" Oct 07 08:33:25 crc kubenswrapper[5025]: I1007 08:33:25.934719 5025 patch_prober.go:28] interesting pod/machine-config-daemon-2dj2t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Oct 07 08:33:25 crc kubenswrapper[5025]: I1007 08:33:25.935130 5025 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 08:33:27 crc kubenswrapper[5025]: I1007 08:33:27.128068 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e5a4ae5a-0b64-481f-a54d-2263de0eae8e","Type":"ContainerStarted","Data":"058a5de955c114ee2c0505ce1d7a528c8b35245bc45298934de08a29873f5115"} Oct 07 08:33:27 crc kubenswrapper[5025]: I1007 08:33:27.129960 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d46577dd-b38b-4b80-ad57-577629e648b8","Type":"ContainerStarted","Data":"17e3123b66e2436655a1e556aca3840177699357ec8d9644216f4e1e376db84e"} Oct 07 08:33:34 crc kubenswrapper[5025]: I1007 08:33:34.178444 5025 generic.go:334] "Generic (PLEG): container finished" podID="e570be52-2eb1-405e-af00-50e094030f46" containerID="cbf4fc57eff4b48f454fceb1bb4503cd279a7c3c8dd7781140149d4ebf748d17" exitCode=0 Oct 07 08:33:34 crc kubenswrapper[5025]: I1007 08:33:34.179161 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-5jzqc" event={"ID":"e570be52-2eb1-405e-af00-50e094030f46","Type":"ContainerDied","Data":"cbf4fc57eff4b48f454fceb1bb4503cd279a7c3c8dd7781140149d4ebf748d17"} Oct 07 08:33:34 crc kubenswrapper[5025]: I1007 08:33:34.181322 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e8ac1818-6553-400f-91be-19b032aae626","Type":"ContainerStarted","Data":"bc8d2241ccc0693c6d61ae5731ed3c9ec865ccbf9da2480484d4167078e69af8"} Oct 07 08:33:34 crc kubenswrapper[5025]: I1007 08:33:34.184824 5025 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"19d1598f-6725-40cd-99bd-5ee1bf699225","Type":"ContainerStarted","Data":"7148fbdd1c0e248299c7313ecafd5bd390337303adc23a3d97fe357b1062f6e1"} Oct 07 08:33:34 crc kubenswrapper[5025]: I1007 08:33:34.185096 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 07 08:33:34 crc kubenswrapper[5025]: I1007 08:33:34.187526 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"007c50ab-22f3-4b11-a7e9-3890acbe7e03","Type":"ContainerStarted","Data":"aba1ffe1ebad18949637fe137e6f2b9ad04cbb81bbe6f3e97644dbcb4b80f98f"} Oct 07 08:33:34 crc kubenswrapper[5025]: I1007 08:33:34.190227 5025 generic.go:334] "Generic (PLEG): container finished" podID="76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b" containerID="25b1b4a37d5c5d5383c1c84c83a961dfa99f38923300c6f52f2fb768bcc4765f" exitCode=0 Oct 07 08:33:34 crc kubenswrapper[5025]: I1007 08:33:34.190299 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-lms8w" event={"ID":"76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b","Type":"ContainerDied","Data":"25b1b4a37d5c5d5383c1c84c83a961dfa99f38923300c6f52f2fb768bcc4765f"} Oct 07 08:33:34 crc kubenswrapper[5025]: I1007 08:33:34.192506 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"582efaa6-0e81-462a-813f-7ba1cf9c6fc2","Type":"ContainerStarted","Data":"186337599c994f3ad1b9bcddc6d6042bacd8e770be7e70f367b17c08ed07fea6"} Oct 07 08:33:34 crc kubenswrapper[5025]: I1007 08:33:34.263663 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=15.648181233 podStartE2EDuration="26.263641657s" podCreationTimestamp="2025-10-07 08:33:08 +0000 UTC" firstStartedPulling="2025-10-07 08:33:21.249525239 +0000 UTC m=+1008.058839383" lastFinishedPulling="2025-10-07 08:33:31.864985653 +0000 UTC m=+1018.674299807" 
observedRunningTime="2025-10-07 08:33:34.262993297 +0000 UTC m=+1021.072307461" watchObservedRunningTime="2025-10-07 08:33:34.263641657 +0000 UTC m=+1021.072955801" Oct 07 08:33:35 crc kubenswrapper[5025]: I1007 08:33:35.201576 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"77ea4bce-e477-4798-923e-ce17548441d6","Type":"ContainerStarted","Data":"a2079e2c81626912c745624ab9276c37b22464ca45f2a2017cb7b9516c2562e6"} Oct 07 08:33:35 crc kubenswrapper[5025]: I1007 08:33:35.201838 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 07 08:33:35 crc kubenswrapper[5025]: I1007 08:33:35.203699 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-bxmbm" event={"ID":"d6889a88-68ef-4bf3-9e7e-78c6d84785ae","Type":"ContainerStarted","Data":"628399cf9d75d695aa72a1837a9b4aa8d0c453aef85214b36b537198afb78e37"} Oct 07 08:33:35 crc kubenswrapper[5025]: I1007 08:33:35.203879 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-bxmbm" Oct 07 08:33:35 crc kubenswrapper[5025]: I1007 08:33:35.207491 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-5jzqc" event={"ID":"e570be52-2eb1-405e-af00-50e094030f46","Type":"ContainerStarted","Data":"d3583812b67d25c3cffec6299db26c85ab2b8ee3a8f43fb85a12a8e29d1ff65f"} Oct 07 08:33:35 crc kubenswrapper[5025]: I1007 08:33:35.207692 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-5jzqc" Oct 07 08:33:35 crc kubenswrapper[5025]: I1007 08:33:35.210035 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"8e01de5b-7e9d-4f70-91d9-d55165a8eb32","Type":"ContainerStarted","Data":"ec3a511c8f64d316ae9313f73bf8241c98c7dfe67b2e3fe3b4a4adad87d90535"} Oct 07 08:33:35 crc kubenswrapper[5025]: I1007 08:33:35.212502 5025 generic.go:334] 
"Generic (PLEG): container finished" podID="f6e7ac34-8558-497d-be28-0d3ad7ff31ad" containerID="98f0e23174b50f34880eb65a921f3f973ee435c1af868f07bb27fea79716351d" exitCode=0 Oct 07 08:33:35 crc kubenswrapper[5025]: I1007 08:33:35.212568 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-wftj8" event={"ID":"f6e7ac34-8558-497d-be28-0d3ad7ff31ad","Type":"ContainerDied","Data":"98f0e23174b50f34880eb65a921f3f973ee435c1af868f07bb27fea79716351d"} Oct 07 08:33:35 crc kubenswrapper[5025]: I1007 08:33:35.216130 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=13.816015114 podStartE2EDuration="26.216116509s" podCreationTimestamp="2025-10-07 08:33:09 +0000 UTC" firstStartedPulling="2025-10-07 08:33:21.617509268 +0000 UTC m=+1008.426823412" lastFinishedPulling="2025-10-07 08:33:34.017610663 +0000 UTC m=+1020.826924807" observedRunningTime="2025-10-07 08:33:35.213482387 +0000 UTC m=+1022.022796531" watchObservedRunningTime="2025-10-07 08:33:35.216116509 +0000 UTC m=+1022.025430653" Oct 07 08:33:35 crc kubenswrapper[5025]: I1007 08:33:35.225247 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-lms8w" event={"ID":"76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b","Type":"ContainerStarted","Data":"0929ddc4d9ba4610db985a4406a4161fc27bb318c68977125005e9b777d1e0ac"} Oct 07 08:33:35 crc kubenswrapper[5025]: I1007 08:33:35.225309 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-lms8w" event={"ID":"76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b","Type":"ContainerStarted","Data":"b49c2c24aaa3c06dfaa088d160cbeffb506d46373bed58aa7224f155e0405e49"} Oct 07 08:33:35 crc kubenswrapper[5025]: I1007 08:33:35.233519 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-5jzqc" podStartSLOduration=2.9846953640000002 podStartE2EDuration="32.233499642s" 
podCreationTimestamp="2025-10-07 08:33:03 +0000 UTC" firstStartedPulling="2025-10-07 08:33:04.16889965 +0000 UTC m=+990.978213784" lastFinishedPulling="2025-10-07 08:33:33.417703918 +0000 UTC m=+1020.227018062" observedRunningTime="2025-10-07 08:33:35.229802117 +0000 UTC m=+1022.039116261" watchObservedRunningTime="2025-10-07 08:33:35.233499642 +0000 UTC m=+1022.042813786" Oct 07 08:33:35 crc kubenswrapper[5025]: I1007 08:33:35.252599 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-bxmbm" podStartSLOduration=11.572825179 podStartE2EDuration="22.252581207s" podCreationTimestamp="2025-10-07 08:33:13 +0000 UTC" firstStartedPulling="2025-10-07 08:33:21.517163479 +0000 UTC m=+1008.326477623" lastFinishedPulling="2025-10-07 08:33:32.196919497 +0000 UTC m=+1019.006233651" observedRunningTime="2025-10-07 08:33:35.24565384 +0000 UTC m=+1022.054967994" watchObservedRunningTime="2025-10-07 08:33:35.252581207 +0000 UTC m=+1022.061895351" Oct 07 08:33:35 crc kubenswrapper[5025]: I1007 08:33:35.281404 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-lms8w" podStartSLOduration=11.132597571 podStartE2EDuration="21.281387746s" podCreationTimestamp="2025-10-07 08:33:14 +0000 UTC" firstStartedPulling="2025-10-07 08:33:21.970416048 +0000 UTC m=+1008.779730232" lastFinishedPulling="2025-10-07 08:33:32.119206263 +0000 UTC m=+1018.928520407" observedRunningTime="2025-10-07 08:33:35.27994561 +0000 UTC m=+1022.089259754" watchObservedRunningTime="2025-10-07 08:33:35.281387746 +0000 UTC m=+1022.090701890" Oct 07 08:33:36 crc kubenswrapper[5025]: I1007 08:33:36.240908 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-wftj8" event={"ID":"f6e7ac34-8558-497d-be28-0d3ad7ff31ad","Type":"ContainerStarted","Data":"579c177fa0ab77c658cff1051ba7bb57a7ffd9755a0d4ca5214700d342831209"} Oct 07 08:33:36 crc kubenswrapper[5025]: I1007 08:33:36.241281 
5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-lms8w" Oct 07 08:33:36 crc kubenswrapper[5025]: I1007 08:33:36.241299 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-lms8w" Oct 07 08:33:36 crc kubenswrapper[5025]: I1007 08:33:36.258599 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-wftj8" podStartSLOduration=4.129757091 podStartE2EDuration="33.258580658s" podCreationTimestamp="2025-10-07 08:33:03 +0000 UTC" firstStartedPulling="2025-10-07 08:33:04.890252101 +0000 UTC m=+991.699566245" lastFinishedPulling="2025-10-07 08:33:34.019075668 +0000 UTC m=+1020.828389812" observedRunningTime="2025-10-07 08:33:36.255576965 +0000 UTC m=+1023.064891109" watchObservedRunningTime="2025-10-07 08:33:36.258580658 +0000 UTC m=+1023.067894802" Oct 07 08:33:38 crc kubenswrapper[5025]: I1007 08:33:38.268063 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"8e01de5b-7e9d-4f70-91d9-d55165a8eb32","Type":"ContainerStarted","Data":"ed4525e997650f350f16b568eb27e0ffc3ffcbd1f1e7cb94fc621cd10eb9a382"} Oct 07 08:33:38 crc kubenswrapper[5025]: I1007 08:33:38.269683 5025 generic.go:334] "Generic (PLEG): container finished" podID="007c50ab-22f3-4b11-a7e9-3890acbe7e03" containerID="aba1ffe1ebad18949637fe137e6f2b9ad04cbb81bbe6f3e97644dbcb4b80f98f" exitCode=0 Oct 07 08:33:38 crc kubenswrapper[5025]: I1007 08:33:38.269753 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"007c50ab-22f3-4b11-a7e9-3890acbe7e03","Type":"ContainerDied","Data":"aba1ffe1ebad18949637fe137e6f2b9ad04cbb81bbe6f3e97644dbcb4b80f98f"} Oct 07 08:33:38 crc kubenswrapper[5025]: I1007 08:33:38.271324 5025 generic.go:334] "Generic (PLEG): container finished" podID="582efaa6-0e81-462a-813f-7ba1cf9c6fc2" 
containerID="186337599c994f3ad1b9bcddc6d6042bacd8e770be7e70f367b17c08ed07fea6" exitCode=0 Oct 07 08:33:38 crc kubenswrapper[5025]: I1007 08:33:38.271400 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"582efaa6-0e81-462a-813f-7ba1cf9c6fc2","Type":"ContainerDied","Data":"186337599c994f3ad1b9bcddc6d6042bacd8e770be7e70f367b17c08ed07fea6"} Oct 07 08:33:38 crc kubenswrapper[5025]: I1007 08:33:38.273793 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e8ac1818-6553-400f-91be-19b032aae626","Type":"ContainerStarted","Data":"e09b94886fdcccfceea04d4af844384d8eb419aec0bda3ba596e47528bb18abd"} Oct 07 08:33:38 crc kubenswrapper[5025]: I1007 08:33:38.301892 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=8.611224391 podStartE2EDuration="23.301869988s" podCreationTimestamp="2025-10-07 08:33:15 +0000 UTC" firstStartedPulling="2025-10-07 08:33:22.812356851 +0000 UTC m=+1009.621671035" lastFinishedPulling="2025-10-07 08:33:37.503002488 +0000 UTC m=+1024.312316632" observedRunningTime="2025-10-07 08:33:38.295121267 +0000 UTC m=+1025.104435431" watchObservedRunningTime="2025-10-07 08:33:38.301869988 +0000 UTC m=+1025.111184132" Oct 07 08:33:38 crc kubenswrapper[5025]: I1007 08:33:38.326295 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=9.678778725 podStartE2EDuration="25.326273539s" podCreationTimestamp="2025-10-07 08:33:13 +0000 UTC" firstStartedPulling="2025-10-07 08:33:21.870635875 +0000 UTC m=+1008.679950029" lastFinishedPulling="2025-10-07 08:33:37.518130699 +0000 UTC m=+1024.327444843" observedRunningTime="2025-10-07 08:33:38.321448888 +0000 UTC m=+1025.130763032" watchObservedRunningTime="2025-10-07 08:33:38.326273539 +0000 UTC m=+1025.135587683" Oct 07 08:33:38 crc kubenswrapper[5025]: I1007 08:33:38.766202 5025 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 07 08:33:38 crc kubenswrapper[5025]: I1007 08:33:38.804904 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 07 08:33:38 crc kubenswrapper[5025]: I1007 08:33:38.854263 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 07 08:33:39 crc kubenswrapper[5025]: I1007 08:33:39.288998 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"007c50ab-22f3-4b11-a7e9-3890acbe7e03","Type":"ContainerStarted","Data":"7aac254398470b37b4db38fadefb4db89196365fd7352f5a4e2ee90ac0aa37b8"} Oct 07 08:33:39 crc kubenswrapper[5025]: I1007 08:33:39.292022 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"582efaa6-0e81-462a-813f-7ba1cf9c6fc2","Type":"ContainerStarted","Data":"681d0f55b346cbe735938996d113e2740ed0c5031feeac188e0b54605836e911"} Oct 07 08:33:39 crc kubenswrapper[5025]: I1007 08:33:39.292534 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 07 08:33:39 crc kubenswrapper[5025]: I1007 08:33:39.310059 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-wftj8" Oct 07 08:33:39 crc kubenswrapper[5025]: I1007 08:33:39.312290 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=22.085947447 podStartE2EDuration="33.312272926s" podCreationTimestamp="2025-10-07 08:33:06 +0000 UTC" firstStartedPulling="2025-10-07 08:33:21.626607522 +0000 UTC m=+1008.435921666" lastFinishedPulling="2025-10-07 08:33:32.852933001 +0000 UTC m=+1019.662247145" observedRunningTime="2025-10-07 08:33:39.307005122 +0000 UTC m=+1026.116319296" watchObservedRunningTime="2025-10-07 08:33:39.312272926 +0000 UTC 
m=+1026.121587110" Oct 07 08:33:39 crc kubenswrapper[5025]: I1007 08:33:39.336416 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=21.654299647 podStartE2EDuration="32.336396439s" podCreationTimestamp="2025-10-07 08:33:07 +0000 UTC" firstStartedPulling="2025-10-07 08:33:21.514769814 +0000 UTC m=+1008.324083958" lastFinishedPulling="2025-10-07 08:33:32.196866606 +0000 UTC m=+1019.006180750" observedRunningTime="2025-10-07 08:33:39.329881356 +0000 UTC m=+1026.139195500" watchObservedRunningTime="2025-10-07 08:33:39.336396439 +0000 UTC m=+1026.145710603" Oct 07 08:33:39 crc kubenswrapper[5025]: I1007 08:33:39.347876 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 07 08:33:39 crc kubenswrapper[5025]: I1007 08:33:39.608473 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-wftj8"] Oct 07 08:33:39 crc kubenswrapper[5025]: I1007 08:33:39.639084 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-z8jm7"] Oct 07 08:33:39 crc kubenswrapper[5025]: I1007 08:33:39.640861 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-z8jm7" Oct 07 08:33:39 crc kubenswrapper[5025]: I1007 08:33:39.642834 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 07 08:33:39 crc kubenswrapper[5025]: I1007 08:33:39.654538 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-z8jm7"] Oct 07 08:33:39 crc kubenswrapper[5025]: I1007 08:33:39.721664 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t4qz\" (UniqueName: \"kubernetes.io/projected/3d21b8ac-abd6-4aed-bf55-26aed66f02db-kube-api-access-2t4qz\") pod \"dnsmasq-dns-6bc7876d45-z8jm7\" (UID: \"3d21b8ac-abd6-4aed-bf55-26aed66f02db\") " pod="openstack/dnsmasq-dns-6bc7876d45-z8jm7" Oct 07 08:33:39 crc kubenswrapper[5025]: I1007 08:33:39.721807 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d21b8ac-abd6-4aed-bf55-26aed66f02db-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-z8jm7\" (UID: \"3d21b8ac-abd6-4aed-bf55-26aed66f02db\") " pod="openstack/dnsmasq-dns-6bc7876d45-z8jm7" Oct 07 08:33:39 crc kubenswrapper[5025]: I1007 08:33:39.721852 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d21b8ac-abd6-4aed-bf55-26aed66f02db-config\") pod \"dnsmasq-dns-6bc7876d45-z8jm7\" (UID: \"3d21b8ac-abd6-4aed-bf55-26aed66f02db\") " pod="openstack/dnsmasq-dns-6bc7876d45-z8jm7" Oct 07 08:33:39 crc kubenswrapper[5025]: I1007 08:33:39.721890 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d21b8ac-abd6-4aed-bf55-26aed66f02db-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-z8jm7\" (UID: \"3d21b8ac-abd6-4aed-bf55-26aed66f02db\") " 
pod="openstack/dnsmasq-dns-6bc7876d45-z8jm7" Oct 07 08:33:39 crc kubenswrapper[5025]: I1007 08:33:39.737066 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-cb69l"] Oct 07 08:33:39 crc kubenswrapper[5025]: I1007 08:33:39.738688 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-cb69l" Oct 07 08:33:39 crc kubenswrapper[5025]: I1007 08:33:39.750311 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-cb69l"] Oct 07 08:33:39 crc kubenswrapper[5025]: I1007 08:33:39.750579 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 07 08:33:39 crc kubenswrapper[5025]: I1007 08:33:39.828440 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2t4qz\" (UniqueName: \"kubernetes.io/projected/3d21b8ac-abd6-4aed-bf55-26aed66f02db-kube-api-access-2t4qz\") pod \"dnsmasq-dns-6bc7876d45-z8jm7\" (UID: \"3d21b8ac-abd6-4aed-bf55-26aed66f02db\") " pod="openstack/dnsmasq-dns-6bc7876d45-z8jm7" Oct 07 08:33:39 crc kubenswrapper[5025]: I1007 08:33:39.828518 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f76b8e5-9257-4a0f-8067-ac36ccbe6711-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-cb69l\" (UID: \"7f76b8e5-9257-4a0f-8067-ac36ccbe6711\") " pod="openstack/ovn-controller-metrics-cb69l" Oct 07 08:33:39 crc kubenswrapper[5025]: I1007 08:33:39.828589 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f76b8e5-9257-4a0f-8067-ac36ccbe6711-config\") pod \"ovn-controller-metrics-cb69l\" (UID: \"7f76b8e5-9257-4a0f-8067-ac36ccbe6711\") " pod="openstack/ovn-controller-metrics-cb69l" Oct 07 08:33:39 crc kubenswrapper[5025]: I1007 
08:33:39.828621 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/7f76b8e5-9257-4a0f-8067-ac36ccbe6711-ovs-rundir\") pod \"ovn-controller-metrics-cb69l\" (UID: \"7f76b8e5-9257-4a0f-8067-ac36ccbe6711\") " pod="openstack/ovn-controller-metrics-cb69l" Oct 07 08:33:39 crc kubenswrapper[5025]: I1007 08:33:39.828647 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d21b8ac-abd6-4aed-bf55-26aed66f02db-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-z8jm7\" (UID: \"3d21b8ac-abd6-4aed-bf55-26aed66f02db\") " pod="openstack/dnsmasq-dns-6bc7876d45-z8jm7" Oct 07 08:33:39 crc kubenswrapper[5025]: I1007 08:33:39.828670 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d21b8ac-abd6-4aed-bf55-26aed66f02db-config\") pod \"dnsmasq-dns-6bc7876d45-z8jm7\" (UID: \"3d21b8ac-abd6-4aed-bf55-26aed66f02db\") " pod="openstack/dnsmasq-dns-6bc7876d45-z8jm7" Oct 07 08:33:39 crc kubenswrapper[5025]: I1007 08:33:39.828703 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f76b8e5-9257-4a0f-8067-ac36ccbe6711-combined-ca-bundle\") pod \"ovn-controller-metrics-cb69l\" (UID: \"7f76b8e5-9257-4a0f-8067-ac36ccbe6711\") " pod="openstack/ovn-controller-metrics-cb69l" Oct 07 08:33:39 crc kubenswrapper[5025]: I1007 08:33:39.828738 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d21b8ac-abd6-4aed-bf55-26aed66f02db-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-z8jm7\" (UID: \"3d21b8ac-abd6-4aed-bf55-26aed66f02db\") " pod="openstack/dnsmasq-dns-6bc7876d45-z8jm7" Oct 07 08:33:39 crc kubenswrapper[5025]: I1007 08:33:39.828775 5025 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/7f76b8e5-9257-4a0f-8067-ac36ccbe6711-ovn-rundir\") pod \"ovn-controller-metrics-cb69l\" (UID: \"7f76b8e5-9257-4a0f-8067-ac36ccbe6711\") " pod="openstack/ovn-controller-metrics-cb69l" Oct 07 08:33:39 crc kubenswrapper[5025]: I1007 08:33:39.828800 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh8ww\" (UniqueName: \"kubernetes.io/projected/7f76b8e5-9257-4a0f-8067-ac36ccbe6711-kube-api-access-zh8ww\") pod \"ovn-controller-metrics-cb69l\" (UID: \"7f76b8e5-9257-4a0f-8067-ac36ccbe6711\") " pod="openstack/ovn-controller-metrics-cb69l" Oct 07 08:33:39 crc kubenswrapper[5025]: I1007 08:33:39.830040 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d21b8ac-abd6-4aed-bf55-26aed66f02db-config\") pod \"dnsmasq-dns-6bc7876d45-z8jm7\" (UID: \"3d21b8ac-abd6-4aed-bf55-26aed66f02db\") " pod="openstack/dnsmasq-dns-6bc7876d45-z8jm7" Oct 07 08:33:39 crc kubenswrapper[5025]: I1007 08:33:39.830217 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d21b8ac-abd6-4aed-bf55-26aed66f02db-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-z8jm7\" (UID: \"3d21b8ac-abd6-4aed-bf55-26aed66f02db\") " pod="openstack/dnsmasq-dns-6bc7876d45-z8jm7" Oct 07 08:33:39 crc kubenswrapper[5025]: I1007 08:33:39.830510 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d21b8ac-abd6-4aed-bf55-26aed66f02db-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-z8jm7\" (UID: \"3d21b8ac-abd6-4aed-bf55-26aed66f02db\") " pod="openstack/dnsmasq-dns-6bc7876d45-z8jm7" Oct 07 08:33:39 crc kubenswrapper[5025]: I1007 08:33:39.853391 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-2t4qz\" (UniqueName: \"kubernetes.io/projected/3d21b8ac-abd6-4aed-bf55-26aed66f02db-kube-api-access-2t4qz\") pod \"dnsmasq-dns-6bc7876d45-z8jm7\" (UID: \"3d21b8ac-abd6-4aed-bf55-26aed66f02db\") " pod="openstack/dnsmasq-dns-6bc7876d45-z8jm7" Oct 07 08:33:39 crc kubenswrapper[5025]: I1007 08:33:39.930418 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f76b8e5-9257-4a0f-8067-ac36ccbe6711-config\") pod \"ovn-controller-metrics-cb69l\" (UID: \"7f76b8e5-9257-4a0f-8067-ac36ccbe6711\") " pod="openstack/ovn-controller-metrics-cb69l" Oct 07 08:33:39 crc kubenswrapper[5025]: I1007 08:33:39.930466 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/7f76b8e5-9257-4a0f-8067-ac36ccbe6711-ovs-rundir\") pod \"ovn-controller-metrics-cb69l\" (UID: \"7f76b8e5-9257-4a0f-8067-ac36ccbe6711\") " pod="openstack/ovn-controller-metrics-cb69l" Oct 07 08:33:39 crc kubenswrapper[5025]: I1007 08:33:39.930512 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f76b8e5-9257-4a0f-8067-ac36ccbe6711-combined-ca-bundle\") pod \"ovn-controller-metrics-cb69l\" (UID: \"7f76b8e5-9257-4a0f-8067-ac36ccbe6711\") " pod="openstack/ovn-controller-metrics-cb69l" Oct 07 08:33:39 crc kubenswrapper[5025]: I1007 08:33:39.930594 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/7f76b8e5-9257-4a0f-8067-ac36ccbe6711-ovn-rundir\") pod \"ovn-controller-metrics-cb69l\" (UID: \"7f76b8e5-9257-4a0f-8067-ac36ccbe6711\") " pod="openstack/ovn-controller-metrics-cb69l" Oct 07 08:33:39 crc kubenswrapper[5025]: I1007 08:33:39.930614 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh8ww\" (UniqueName: 
\"kubernetes.io/projected/7f76b8e5-9257-4a0f-8067-ac36ccbe6711-kube-api-access-zh8ww\") pod \"ovn-controller-metrics-cb69l\" (UID: \"7f76b8e5-9257-4a0f-8067-ac36ccbe6711\") " pod="openstack/ovn-controller-metrics-cb69l" Oct 07 08:33:39 crc kubenswrapper[5025]: I1007 08:33:39.930689 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f76b8e5-9257-4a0f-8067-ac36ccbe6711-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-cb69l\" (UID: \"7f76b8e5-9257-4a0f-8067-ac36ccbe6711\") " pod="openstack/ovn-controller-metrics-cb69l" Oct 07 08:33:39 crc kubenswrapper[5025]: I1007 08:33:39.930918 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/7f76b8e5-9257-4a0f-8067-ac36ccbe6711-ovs-rundir\") pod \"ovn-controller-metrics-cb69l\" (UID: \"7f76b8e5-9257-4a0f-8067-ac36ccbe6711\") " pod="openstack/ovn-controller-metrics-cb69l" Oct 07 08:33:39 crc kubenswrapper[5025]: I1007 08:33:39.931307 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/7f76b8e5-9257-4a0f-8067-ac36ccbe6711-ovn-rundir\") pod \"ovn-controller-metrics-cb69l\" (UID: \"7f76b8e5-9257-4a0f-8067-ac36ccbe6711\") " pod="openstack/ovn-controller-metrics-cb69l" Oct 07 08:33:39 crc kubenswrapper[5025]: I1007 08:33:39.931496 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f76b8e5-9257-4a0f-8067-ac36ccbe6711-config\") pod \"ovn-controller-metrics-cb69l\" (UID: \"7f76b8e5-9257-4a0f-8067-ac36ccbe6711\") " pod="openstack/ovn-controller-metrics-cb69l" Oct 07 08:33:39 crc kubenswrapper[5025]: I1007 08:33:39.942132 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f76b8e5-9257-4a0f-8067-ac36ccbe6711-combined-ca-bundle\") pod 
\"ovn-controller-metrics-cb69l\" (UID: \"7f76b8e5-9257-4a0f-8067-ac36ccbe6711\") " pod="openstack/ovn-controller-metrics-cb69l" Oct 07 08:33:39 crc kubenswrapper[5025]: I1007 08:33:39.948631 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f76b8e5-9257-4a0f-8067-ac36ccbe6711-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-cb69l\" (UID: \"7f76b8e5-9257-4a0f-8067-ac36ccbe6711\") " pod="openstack/ovn-controller-metrics-cb69l" Oct 07 08:33:39 crc kubenswrapper[5025]: I1007 08:33:39.959961 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-z8jm7" Oct 07 08:33:39 crc kubenswrapper[5025]: I1007 08:33:39.968084 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh8ww\" (UniqueName: \"kubernetes.io/projected/7f76b8e5-9257-4a0f-8067-ac36ccbe6711-kube-api-access-zh8ww\") pod \"ovn-controller-metrics-cb69l\" (UID: \"7f76b8e5-9257-4a0f-8067-ac36ccbe6711\") " pod="openstack/ovn-controller-metrics-cb69l" Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.056252 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-cb69l" Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.079901 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-5jzqc"] Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.084748 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-5jzqc" podUID="e570be52-2eb1-405e-af00-50e094030f46" containerName="dnsmasq-dns" containerID="cri-o://d3583812b67d25c3cffec6299db26c85ab2b8ee3a8f43fb85a12a8e29d1ff65f" gracePeriod=10 Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.088258 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-666b6646f7-5jzqc" Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.108105 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-xw2wv"] Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.109317 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-xw2wv" Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.111466 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.130297 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-xw2wv"] Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.134399 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgmx7\" (UniqueName: \"kubernetes.io/projected/b144a7bc-c1a2-4777-bca4-d0c36483851f-kube-api-access-sgmx7\") pod \"dnsmasq-dns-8554648995-xw2wv\" (UID: \"b144a7bc-c1a2-4777-bca4-d0c36483851f\") " pod="openstack/dnsmasq-dns-8554648995-xw2wv" Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.134490 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b144a7bc-c1a2-4777-bca4-d0c36483851f-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-xw2wv\" (UID: \"b144a7bc-c1a2-4777-bca4-d0c36483851f\") " pod="openstack/dnsmasq-dns-8554648995-xw2wv" Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.134569 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b144a7bc-c1a2-4777-bca4-d0c36483851f-config\") pod \"dnsmasq-dns-8554648995-xw2wv\" (UID: \"b144a7bc-c1a2-4777-bca4-d0c36483851f\") " pod="openstack/dnsmasq-dns-8554648995-xw2wv" Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.134616 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b144a7bc-c1a2-4777-bca4-d0c36483851f-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-xw2wv\" (UID: \"b144a7bc-c1a2-4777-bca4-d0c36483851f\") " 
pod="openstack/dnsmasq-dns-8554648995-xw2wv" Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.134661 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b144a7bc-c1a2-4777-bca4-d0c36483851f-dns-svc\") pod \"dnsmasq-dns-8554648995-xw2wv\" (UID: \"b144a7bc-c1a2-4777-bca4-d0c36483851f\") " pod="openstack/dnsmasq-dns-8554648995-xw2wv" Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.215166 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.236046 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b144a7bc-c1a2-4777-bca4-d0c36483851f-dns-svc\") pod \"dnsmasq-dns-8554648995-xw2wv\" (UID: \"b144a7bc-c1a2-4777-bca4-d0c36483851f\") " pod="openstack/dnsmasq-dns-8554648995-xw2wv" Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.236111 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgmx7\" (UniqueName: \"kubernetes.io/projected/b144a7bc-c1a2-4777-bca4-d0c36483851f-kube-api-access-sgmx7\") pod \"dnsmasq-dns-8554648995-xw2wv\" (UID: \"b144a7bc-c1a2-4777-bca4-d0c36483851f\") " pod="openstack/dnsmasq-dns-8554648995-xw2wv" Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.236143 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b144a7bc-c1a2-4777-bca4-d0c36483851f-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-xw2wv\" (UID: \"b144a7bc-c1a2-4777-bca4-d0c36483851f\") " pod="openstack/dnsmasq-dns-8554648995-xw2wv" Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.236187 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b144a7bc-c1a2-4777-bca4-d0c36483851f-config\") pod \"dnsmasq-dns-8554648995-xw2wv\" (UID: \"b144a7bc-c1a2-4777-bca4-d0c36483851f\") " pod="openstack/dnsmasq-dns-8554648995-xw2wv" Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.236241 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b144a7bc-c1a2-4777-bca4-d0c36483851f-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-xw2wv\" (UID: \"b144a7bc-c1a2-4777-bca4-d0c36483851f\") " pod="openstack/dnsmasq-dns-8554648995-xw2wv" Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.238968 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b144a7bc-c1a2-4777-bca4-d0c36483851f-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-xw2wv\" (UID: \"b144a7bc-c1a2-4777-bca4-d0c36483851f\") " pod="openstack/dnsmasq-dns-8554648995-xw2wv" Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.239514 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b144a7bc-c1a2-4777-bca4-d0c36483851f-dns-svc\") pod \"dnsmasq-dns-8554648995-xw2wv\" (UID: \"b144a7bc-c1a2-4777-bca4-d0c36483851f\") " pod="openstack/dnsmasq-dns-8554648995-xw2wv" Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.240239 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b144a7bc-c1a2-4777-bca4-d0c36483851f-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-xw2wv\" (UID: \"b144a7bc-c1a2-4777-bca4-d0c36483851f\") " pod="openstack/dnsmasq-dns-8554648995-xw2wv" Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.240776 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b144a7bc-c1a2-4777-bca4-d0c36483851f-config\") pod \"dnsmasq-dns-8554648995-xw2wv\" (UID: 
\"b144a7bc-c1a2-4777-bca4-d0c36483851f\") " pod="openstack/dnsmasq-dns-8554648995-xw2wv" Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.267783 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgmx7\" (UniqueName: \"kubernetes.io/projected/b144a7bc-c1a2-4777-bca4-d0c36483851f-kube-api-access-sgmx7\") pod \"dnsmasq-dns-8554648995-xw2wv\" (UID: \"b144a7bc-c1a2-4777-bca4-d0c36483851f\") " pod="openstack/dnsmasq-dns-8554648995-xw2wv" Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.324929 5025 generic.go:334] "Generic (PLEG): container finished" podID="e570be52-2eb1-405e-af00-50e094030f46" containerID="d3583812b67d25c3cffec6299db26c85ab2b8ee3a8f43fb85a12a8e29d1ff65f" exitCode=0 Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.326658 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-5jzqc" event={"ID":"e570be52-2eb1-405e-af00-50e094030f46","Type":"ContainerDied","Data":"d3583812b67d25c3cffec6299db26c85ab2b8ee3a8f43fb85a12a8e29d1ff65f"} Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.326800 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-wftj8" podUID="f6e7ac34-8558-497d-be28-0d3ad7ff31ad" containerName="dnsmasq-dns" containerID="cri-o://579c177fa0ab77c658cff1051ba7bb57a7ffd9755a0d4ca5214700d342831209" gracePeriod=10 Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.334462 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-wftj8" Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.372414 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-z8jm7"] Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.419447 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-rtggt"] Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.434344 5025 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-rtggt" Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.455249 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-rtggt"] Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.460761 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-xw2wv" Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.550791 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ec748e5-296b-4c72-979f-bf49a6dceb02-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-rtggt\" (UID: \"9ec748e5-296b-4c72-979f-bf49a6dceb02\") " pod="openstack/dnsmasq-dns-b8fbc5445-rtggt" Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.550832 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ec748e5-296b-4c72-979f-bf49a6dceb02-config\") pod \"dnsmasq-dns-b8fbc5445-rtggt\" (UID: \"9ec748e5-296b-4c72-979f-bf49a6dceb02\") " pod="openstack/dnsmasq-dns-b8fbc5445-rtggt" Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.550933 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ec748e5-296b-4c72-979f-bf49a6dceb02-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-rtggt\" (UID: \"9ec748e5-296b-4c72-979f-bf49a6dceb02\") " pod="openstack/dnsmasq-dns-b8fbc5445-rtggt" Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.550988 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ec748e5-296b-4c72-979f-bf49a6dceb02-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-rtggt\" (UID: \"9ec748e5-296b-4c72-979f-bf49a6dceb02\") " 
pod="openstack/dnsmasq-dns-b8fbc5445-rtggt" Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.551012 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcmxv\" (UniqueName: \"kubernetes.io/projected/9ec748e5-296b-4c72-979f-bf49a6dceb02-kube-api-access-gcmxv\") pod \"dnsmasq-dns-b8fbc5445-rtggt\" (UID: \"9ec748e5-296b-4c72-979f-bf49a6dceb02\") " pod="openstack/dnsmasq-dns-b8fbc5445-rtggt" Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.652699 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcmxv\" (UniqueName: \"kubernetes.io/projected/9ec748e5-296b-4c72-979f-bf49a6dceb02-kube-api-access-gcmxv\") pod \"dnsmasq-dns-b8fbc5445-rtggt\" (UID: \"9ec748e5-296b-4c72-979f-bf49a6dceb02\") " pod="openstack/dnsmasq-dns-b8fbc5445-rtggt" Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.652822 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ec748e5-296b-4c72-979f-bf49a6dceb02-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-rtggt\" (UID: \"9ec748e5-296b-4c72-979f-bf49a6dceb02\") " pod="openstack/dnsmasq-dns-b8fbc5445-rtggt" Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.652843 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ec748e5-296b-4c72-979f-bf49a6dceb02-config\") pod \"dnsmasq-dns-b8fbc5445-rtggt\" (UID: \"9ec748e5-296b-4c72-979f-bf49a6dceb02\") " pod="openstack/dnsmasq-dns-b8fbc5445-rtggt" Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.652864 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ec748e5-296b-4c72-979f-bf49a6dceb02-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-rtggt\" (UID: \"9ec748e5-296b-4c72-979f-bf49a6dceb02\") " 
pod="openstack/dnsmasq-dns-b8fbc5445-rtggt" Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.652896 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ec748e5-296b-4c72-979f-bf49a6dceb02-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-rtggt\" (UID: \"9ec748e5-296b-4c72-979f-bf49a6dceb02\") " pod="openstack/dnsmasq-dns-b8fbc5445-rtggt" Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.653769 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ec748e5-296b-4c72-979f-bf49a6dceb02-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-rtggt\" (UID: \"9ec748e5-296b-4c72-979f-bf49a6dceb02\") " pod="openstack/dnsmasq-dns-b8fbc5445-rtggt" Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.654636 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ec748e5-296b-4c72-979f-bf49a6dceb02-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-rtggt\" (UID: \"9ec748e5-296b-4c72-979f-bf49a6dceb02\") " pod="openstack/dnsmasq-dns-b8fbc5445-rtggt" Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.655597 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ec748e5-296b-4c72-979f-bf49a6dceb02-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-rtggt\" (UID: \"9ec748e5-296b-4c72-979f-bf49a6dceb02\") " pod="openstack/dnsmasq-dns-b8fbc5445-rtggt" Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.655703 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ec748e5-296b-4c72-979f-bf49a6dceb02-config\") pod \"dnsmasq-dns-b8fbc5445-rtggt\" (UID: \"9ec748e5-296b-4c72-979f-bf49a6dceb02\") " pod="openstack/dnsmasq-dns-b8fbc5445-rtggt" Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.675114 5025 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-gcmxv\" (UniqueName: \"kubernetes.io/projected/9ec748e5-296b-4c72-979f-bf49a6dceb02-kube-api-access-gcmxv\") pod \"dnsmasq-dns-b8fbc5445-rtggt\" (UID: \"9ec748e5-296b-4c72-979f-bf49a6dceb02\") " pod="openstack/dnsmasq-dns-b8fbc5445-rtggt" Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.707893 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-5jzqc" Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.744727 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-wftj8" Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.790867 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-rtggt" Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.859435 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2kgh\" (UniqueName: \"kubernetes.io/projected/e570be52-2eb1-405e-af00-50e094030f46-kube-api-access-c2kgh\") pod \"e570be52-2eb1-405e-af00-50e094030f46\" (UID: \"e570be52-2eb1-405e-af00-50e094030f46\") " Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.859475 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e570be52-2eb1-405e-af00-50e094030f46-config\") pod \"e570be52-2eb1-405e-af00-50e094030f46\" (UID: \"e570be52-2eb1-405e-af00-50e094030f46\") " Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.859503 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e570be52-2eb1-405e-af00-50e094030f46-dns-svc\") pod \"e570be52-2eb1-405e-af00-50e094030f46\" (UID: \"e570be52-2eb1-405e-af00-50e094030f46\") " Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.859662 5025 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-kz27n\" (UniqueName: \"kubernetes.io/projected/f6e7ac34-8558-497d-be28-0d3ad7ff31ad-kube-api-access-kz27n\") pod \"f6e7ac34-8558-497d-be28-0d3ad7ff31ad\" (UID: \"f6e7ac34-8558-497d-be28-0d3ad7ff31ad\") " Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.859732 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f6e7ac34-8558-497d-be28-0d3ad7ff31ad-dns-svc\") pod \"f6e7ac34-8558-497d-be28-0d3ad7ff31ad\" (UID: \"f6e7ac34-8558-497d-be28-0d3ad7ff31ad\") " Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.859745 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6e7ac34-8558-497d-be28-0d3ad7ff31ad-config\") pod \"f6e7ac34-8558-497d-be28-0d3ad7ff31ad\" (UID: \"f6e7ac34-8558-497d-be28-0d3ad7ff31ad\") " Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.867953 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e570be52-2eb1-405e-af00-50e094030f46-kube-api-access-c2kgh" (OuterVolumeSpecName: "kube-api-access-c2kgh") pod "e570be52-2eb1-405e-af00-50e094030f46" (UID: "e570be52-2eb1-405e-af00-50e094030f46"). InnerVolumeSpecName "kube-api-access-c2kgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.873767 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6e7ac34-8558-497d-be28-0d3ad7ff31ad-kube-api-access-kz27n" (OuterVolumeSpecName: "kube-api-access-kz27n") pod "f6e7ac34-8558-497d-be28-0d3ad7ff31ad" (UID: "f6e7ac34-8558-497d-be28-0d3ad7ff31ad"). InnerVolumeSpecName "kube-api-access-kz27n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.923885 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e570be52-2eb1-405e-af00-50e094030f46-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e570be52-2eb1-405e-af00-50e094030f46" (UID: "e570be52-2eb1-405e-af00-50e094030f46"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.928777 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e570be52-2eb1-405e-af00-50e094030f46-config" (OuterVolumeSpecName: "config") pod "e570be52-2eb1-405e-af00-50e094030f46" (UID: "e570be52-2eb1-405e-af00-50e094030f46"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.939533 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6e7ac34-8558-497d-be28-0d3ad7ff31ad-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f6e7ac34-8558-497d-be28-0d3ad7ff31ad" (UID: "f6e7ac34-8558-497d-be28-0d3ad7ff31ad"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.946258 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6e7ac34-8558-497d-be28-0d3ad7ff31ad-config" (OuterVolumeSpecName: "config") pod "f6e7ac34-8558-497d-be28-0d3ad7ff31ad" (UID: "f6e7ac34-8558-497d-be28-0d3ad7ff31ad"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.962354 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-z8jm7"] Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.962416 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kz27n\" (UniqueName: \"kubernetes.io/projected/f6e7ac34-8558-497d-be28-0d3ad7ff31ad-kube-api-access-kz27n\") on node \"crc\" DevicePath \"\"" Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.962453 5025 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f6e7ac34-8558-497d-be28-0d3ad7ff31ad-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.962462 5025 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6e7ac34-8558-497d-be28-0d3ad7ff31ad-config\") on node \"crc\" DevicePath \"\"" Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.962473 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2kgh\" (UniqueName: \"kubernetes.io/projected/e570be52-2eb1-405e-af00-50e094030f46-kube-api-access-c2kgh\") on node \"crc\" DevicePath \"\"" Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.962481 5025 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e570be52-2eb1-405e-af00-50e094030f46-config\") on node \"crc\" DevicePath \"\"" Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.962489 5025 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e570be52-2eb1-405e-af00-50e094030f46-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 08:33:40 crc kubenswrapper[5025]: I1007 08:33:40.972982 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-cb69l"] Oct 07 08:33:41 crc 
kubenswrapper[5025]: I1007 08:33:41.056651 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-xw2wv"] Oct 07 08:33:41 crc kubenswrapper[5025]: W1007 08:33:41.070740 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb144a7bc_c1a2_4777_bca4_d0c36483851f.slice/crio-05845f2d16fb8d9c2e155c711a9c49bc69c6641c46939cad7dee7c049bf68f28 WatchSource:0}: Error finding container 05845f2d16fb8d9c2e155c711a9c49bc69c6641c46939cad7dee7c049bf68f28: Status 404 returned error can't find the container with id 05845f2d16fb8d9c2e155c711a9c49bc69c6641c46939cad7dee7c049bf68f28 Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.204750 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.262080 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.274027 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-rtggt"] Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.332610 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-rtggt" event={"ID":"9ec748e5-296b-4c72-979f-bf49a6dceb02","Type":"ContainerStarted","Data":"5f370a52cfb12c373a8ae20d389d6de6f6bb04b43413b6e1ff0492a186ed3b05"} Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.334202 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-cb69l" event={"ID":"7f76b8e5-9257-4a0f-8067-ac36ccbe6711","Type":"ContainerStarted","Data":"c6b426dcd9f533a44c9fe8cd3f13790da46148d4c171d8d1980f367d0b388d20"} Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.334290 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-cb69l" 
event={"ID":"7f76b8e5-9257-4a0f-8067-ac36ccbe6711","Type":"ContainerStarted","Data":"6d30afa32207026021934fd9f8733ffe56b970bdddee85e44462c5208c9fbeed"} Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.337029 5025 generic.go:334] "Generic (PLEG): container finished" podID="f6e7ac34-8558-497d-be28-0d3ad7ff31ad" containerID="579c177fa0ab77c658cff1051ba7bb57a7ffd9755a0d4ca5214700d342831209" exitCode=0 Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.337081 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-wftj8" event={"ID":"f6e7ac34-8558-497d-be28-0d3ad7ff31ad","Type":"ContainerDied","Data":"579c177fa0ab77c658cff1051ba7bb57a7ffd9755a0d4ca5214700d342831209"} Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.337100 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-wftj8" event={"ID":"f6e7ac34-8558-497d-be28-0d3ad7ff31ad","Type":"ContainerDied","Data":"882c4573a7c8c74b74d65d9c4d404435118c576c414c37f851ad67ca85766fc6"} Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.337112 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-wftj8" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.337116 5025 scope.go:117] "RemoveContainer" containerID="579c177fa0ab77c658cff1051ba7bb57a7ffd9755a0d4ca5214700d342831209" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.338699 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-z8jm7" event={"ID":"3d21b8ac-abd6-4aed-bf55-26aed66f02db","Type":"ContainerStarted","Data":"914822833fd2ce915331471b81cb8b2a07846b7fe60d4b4e271ff7572d9819c4"} Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.338734 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-z8jm7" event={"ID":"3d21b8ac-abd6-4aed-bf55-26aed66f02db","Type":"ContainerStarted","Data":"926d801363d7a3b41fd6f8b20d743387ff54f558d95834fe2db800c2bef9c977"} Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.338824 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bc7876d45-z8jm7" podUID="3d21b8ac-abd6-4aed-bf55-26aed66f02db" containerName="init" containerID="cri-o://914822833fd2ce915331471b81cb8b2a07846b7fe60d4b4e271ff7572d9819c4" gracePeriod=10 Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.341949 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-xw2wv" event={"ID":"b144a7bc-c1a2-4777-bca4-d0c36483851f","Type":"ContainerStarted","Data":"ba1867fcb4ec3ebf369c24bcb9ef7121472d19bf0389318e8ea1d129d69f4036"} Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.341988 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-xw2wv" event={"ID":"b144a7bc-c1a2-4777-bca4-d0c36483851f","Type":"ContainerStarted","Data":"05845f2d16fb8d9c2e155c711a9c49bc69c6641c46939cad7dee7c049bf68f28"} Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.345316 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-666b6646f7-5jzqc" event={"ID":"e570be52-2eb1-405e-af00-50e094030f46","Type":"ContainerDied","Data":"e16dac15495996a57eacf62936a863d3ffde7ad84e60a657eb24f03f39846313"} Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.345427 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-5jzqc" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.346925 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.387886 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.434633 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Oct 07 08:33:41 crc kubenswrapper[5025]: E1007 08:33:41.435028 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e570be52-2eb1-405e-af00-50e094030f46" containerName="init" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.435051 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="e570be52-2eb1-405e-af00-50e094030f46" containerName="init" Oct 07 08:33:41 crc kubenswrapper[5025]: E1007 08:33:41.435067 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6e7ac34-8558-497d-be28-0d3ad7ff31ad" containerName="dnsmasq-dns" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.435074 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6e7ac34-8558-497d-be28-0d3ad7ff31ad" containerName="dnsmasq-dns" Oct 07 08:33:41 crc kubenswrapper[5025]: E1007 08:33:41.435085 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6e7ac34-8558-497d-be28-0d3ad7ff31ad" containerName="init" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.435093 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6e7ac34-8558-497d-be28-0d3ad7ff31ad" 
containerName="init" Oct 07 08:33:41 crc kubenswrapper[5025]: E1007 08:33:41.435103 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e570be52-2eb1-405e-af00-50e094030f46" containerName="dnsmasq-dns" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.435110 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="e570be52-2eb1-405e-af00-50e094030f46" containerName="dnsmasq-dns" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.435352 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6e7ac34-8558-497d-be28-0d3ad7ff31ad" containerName="dnsmasq-dns" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.435384 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="e570be52-2eb1-405e-af00-50e094030f46" containerName="dnsmasq-dns" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.442127 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.444736 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.444873 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.444881 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.444820 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-jx22g" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.466152 5025 scope.go:117] "RemoveContainer" containerID="98f0e23174b50f34880eb65a921f3f973ee435c1af868f07bb27fea79716351d" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.472809 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 07 08:33:41 crc 
kubenswrapper[5025]: I1007 08:33:41.500112 5025 scope.go:117] "RemoveContainer" containerID="579c177fa0ab77c658cff1051ba7bb57a7ffd9755a0d4ca5214700d342831209" Oct 07 08:33:41 crc kubenswrapper[5025]: E1007 08:33:41.500433 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"579c177fa0ab77c658cff1051ba7bb57a7ffd9755a0d4ca5214700d342831209\": container with ID starting with 579c177fa0ab77c658cff1051ba7bb57a7ffd9755a0d4ca5214700d342831209 not found: ID does not exist" containerID="579c177fa0ab77c658cff1051ba7bb57a7ffd9755a0d4ca5214700d342831209" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.500461 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"579c177fa0ab77c658cff1051ba7bb57a7ffd9755a0d4ca5214700d342831209"} err="failed to get container status \"579c177fa0ab77c658cff1051ba7bb57a7ffd9755a0d4ca5214700d342831209\": rpc error: code = NotFound desc = could not find container \"579c177fa0ab77c658cff1051ba7bb57a7ffd9755a0d4ca5214700d342831209\": container with ID starting with 579c177fa0ab77c658cff1051ba7bb57a7ffd9755a0d4ca5214700d342831209 not found: ID does not exist" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.500480 5025 scope.go:117] "RemoveContainer" containerID="98f0e23174b50f34880eb65a921f3f973ee435c1af868f07bb27fea79716351d" Oct 07 08:33:41 crc kubenswrapper[5025]: E1007 08:33:41.500674 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98f0e23174b50f34880eb65a921f3f973ee435c1af868f07bb27fea79716351d\": container with ID starting with 98f0e23174b50f34880eb65a921f3f973ee435c1af868f07bb27fea79716351d not found: ID does not exist" containerID="98f0e23174b50f34880eb65a921f3f973ee435c1af868f07bb27fea79716351d" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.500702 5025 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"98f0e23174b50f34880eb65a921f3f973ee435c1af868f07bb27fea79716351d"} err="failed to get container status \"98f0e23174b50f34880eb65a921f3f973ee435c1af868f07bb27fea79716351d\": rpc error: code = NotFound desc = could not find container \"98f0e23174b50f34880eb65a921f3f973ee435c1af868f07bb27fea79716351d\": container with ID starting with 98f0e23174b50f34880eb65a921f3f973ee435c1af868f07bb27fea79716351d not found: ID does not exist" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.500737 5025 scope.go:117] "RemoveContainer" containerID="d3583812b67d25c3cffec6299db26c85ab2b8ee3a8f43fb85a12a8e29d1ff65f" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.513990 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-5jzqc"] Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.531779 5025 scope.go:117] "RemoveContainer" containerID="cbf4fc57eff4b48f454fceb1bb4503cd279a7c3c8dd7781140149d4ebf748d17" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.533687 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-5jzqc"] Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.541706 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-wftj8"] Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.584717 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/9d7aa421-d9db-4ba6-882a-432d2e8b840f-cache\") pod \"swift-storage-0\" (UID: \"9d7aa421-d9db-4ba6-882a-432d2e8b840f\") " pod="openstack/swift-storage-0" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.584872 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9d7aa421-d9db-4ba6-882a-432d2e8b840f-etc-swift\") pod \"swift-storage-0\" (UID: 
\"9d7aa421-d9db-4ba6-882a-432d2e8b840f\") " pod="openstack/swift-storage-0" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.584918 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26x8x\" (UniqueName: \"kubernetes.io/projected/9d7aa421-d9db-4ba6-882a-432d2e8b840f-kube-api-access-26x8x\") pod \"swift-storage-0\" (UID: \"9d7aa421-d9db-4ba6-882a-432d2e8b840f\") " pod="openstack/swift-storage-0" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.584953 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/9d7aa421-d9db-4ba6-882a-432d2e8b840f-lock\") pod \"swift-storage-0\" (UID: \"9d7aa421-d9db-4ba6-882a-432d2e8b840f\") " pod="openstack/swift-storage-0" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.584993 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"9d7aa421-d9db-4ba6-882a-432d2e8b840f\") " pod="openstack/swift-storage-0" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.586319 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-wftj8"] Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.625081 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.628208 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.630981 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.633843 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.634359 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-x9rbr" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.634841 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.642957 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.687133 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/9d7aa421-d9db-4ba6-882a-432d2e8b840f-lock\") pod \"swift-storage-0\" (UID: \"9d7aa421-d9db-4ba6-882a-432d2e8b840f\") " pod="openstack/swift-storage-0" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.687521 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"9d7aa421-d9db-4ba6-882a-432d2e8b840f\") " pod="openstack/swift-storage-0" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.687570 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/9d7aa421-d9db-4ba6-882a-432d2e8b840f-cache\") pod \"swift-storage-0\" (UID: \"9d7aa421-d9db-4ba6-882a-432d2e8b840f\") " pod="openstack/swift-storage-0" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.687718 5025 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9d7aa421-d9db-4ba6-882a-432d2e8b840f-etc-swift\") pod \"swift-storage-0\" (UID: \"9d7aa421-d9db-4ba6-882a-432d2e8b840f\") " pod="openstack/swift-storage-0" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.687766 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26x8x\" (UniqueName: \"kubernetes.io/projected/9d7aa421-d9db-4ba6-882a-432d2e8b840f-kube-api-access-26x8x\") pod \"swift-storage-0\" (UID: \"9d7aa421-d9db-4ba6-882a-432d2e8b840f\") " pod="openstack/swift-storage-0" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.690108 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/9d7aa421-d9db-4ba6-882a-432d2e8b840f-lock\") pod \"swift-storage-0\" (UID: \"9d7aa421-d9db-4ba6-882a-432d2e8b840f\") " pod="openstack/swift-storage-0" Oct 07 08:33:41 crc kubenswrapper[5025]: E1007 08:33:41.690532 5025 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.690537 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/9d7aa421-d9db-4ba6-882a-432d2e8b840f-cache\") pod \"swift-storage-0\" (UID: \"9d7aa421-d9db-4ba6-882a-432d2e8b840f\") " pod="openstack/swift-storage-0" Oct 07 08:33:41 crc kubenswrapper[5025]: E1007 08:33:41.690584 5025 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 07 08:33:41 crc kubenswrapper[5025]: E1007 08:33:41.690658 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d7aa421-d9db-4ba6-882a-432d2e8b840f-etc-swift podName:9d7aa421-d9db-4ba6-882a-432d2e8b840f nodeName:}" failed. 
No retries permitted until 2025-10-07 08:33:42.190631029 +0000 UTC m=+1028.999945163 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9d7aa421-d9db-4ba6-882a-432d2e8b840f-etc-swift") pod "swift-storage-0" (UID: "9d7aa421-d9db-4ba6-882a-432d2e8b840f") : configmap "swift-ring-files" not found Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.695867 5025 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"9d7aa421-d9db-4ba6-882a-432d2e8b840f\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/swift-storage-0" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.708201 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26x8x\" (UniqueName: \"kubernetes.io/projected/9d7aa421-d9db-4ba6-882a-432d2e8b840f-kube-api-access-26x8x\") pod \"swift-storage-0\" (UID: \"9d7aa421-d9db-4ba6-882a-432d2e8b840f\") " pod="openstack/swift-storage-0" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.720524 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-z8jm7" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.745585 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"9d7aa421-d9db-4ba6-882a-432d2e8b840f\") " pod="openstack/swift-storage-0" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.789136 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2t4qz\" (UniqueName: \"kubernetes.io/projected/3d21b8ac-abd6-4aed-bf55-26aed66f02db-kube-api-access-2t4qz\") pod \"3d21b8ac-abd6-4aed-bf55-26aed66f02db\" (UID: \"3d21b8ac-abd6-4aed-bf55-26aed66f02db\") " Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.789219 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d21b8ac-abd6-4aed-bf55-26aed66f02db-dns-svc\") pod \"3d21b8ac-abd6-4aed-bf55-26aed66f02db\" (UID: \"3d21b8ac-abd6-4aed-bf55-26aed66f02db\") " Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.789272 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d21b8ac-abd6-4aed-bf55-26aed66f02db-config\") pod \"3d21b8ac-abd6-4aed-bf55-26aed66f02db\" (UID: \"3d21b8ac-abd6-4aed-bf55-26aed66f02db\") " Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.789358 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d21b8ac-abd6-4aed-bf55-26aed66f02db-ovsdbserver-sb\") pod \"3d21b8ac-abd6-4aed-bf55-26aed66f02db\" (UID: \"3d21b8ac-abd6-4aed-bf55-26aed66f02db\") " Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.789718 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/dd481497-9604-41b6-918a-a8ec9fd6ad92-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"dd481497-9604-41b6-918a-a8ec9fd6ad92\") " pod="openstack/ovn-northd-0" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.789772 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd481497-9604-41b6-918a-a8ec9fd6ad92-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"dd481497-9604-41b6-918a-a8ec9fd6ad92\") " pod="openstack/ovn-northd-0" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.789793 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd481497-9604-41b6-918a-a8ec9fd6ad92-scripts\") pod \"ovn-northd-0\" (UID: \"dd481497-9604-41b6-918a-a8ec9fd6ad92\") " pod="openstack/ovn-northd-0" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.789817 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dd481497-9604-41b6-918a-a8ec9fd6ad92-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"dd481497-9604-41b6-918a-a8ec9fd6ad92\") " pod="openstack/ovn-northd-0" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.789839 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khlvp\" (UniqueName: \"kubernetes.io/projected/dd481497-9604-41b6-918a-a8ec9fd6ad92-kube-api-access-khlvp\") pod \"ovn-northd-0\" (UID: \"dd481497-9604-41b6-918a-a8ec9fd6ad92\") " pod="openstack/ovn-northd-0" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.789869 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd481497-9604-41b6-918a-a8ec9fd6ad92-ovn-northd-tls-certs\") pod \"ovn-northd-0\" 
(UID: \"dd481497-9604-41b6-918a-a8ec9fd6ad92\") " pod="openstack/ovn-northd-0" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.789910 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd481497-9604-41b6-918a-a8ec9fd6ad92-config\") pod \"ovn-northd-0\" (UID: \"dd481497-9604-41b6-918a-a8ec9fd6ad92\") " pod="openstack/ovn-northd-0" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.796748 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d21b8ac-abd6-4aed-bf55-26aed66f02db-kube-api-access-2t4qz" (OuterVolumeSpecName: "kube-api-access-2t4qz") pod "3d21b8ac-abd6-4aed-bf55-26aed66f02db" (UID: "3d21b8ac-abd6-4aed-bf55-26aed66f02db"). InnerVolumeSpecName "kube-api-access-2t4qz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.818245 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d21b8ac-abd6-4aed-bf55-26aed66f02db-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3d21b8ac-abd6-4aed-bf55-26aed66f02db" (UID: "3d21b8ac-abd6-4aed-bf55-26aed66f02db"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.825795 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d21b8ac-abd6-4aed-bf55-26aed66f02db-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3d21b8ac-abd6-4aed-bf55-26aed66f02db" (UID: "3d21b8ac-abd6-4aed-bf55-26aed66f02db"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.827721 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d21b8ac-abd6-4aed-bf55-26aed66f02db-config" (OuterVolumeSpecName: "config") pod "3d21b8ac-abd6-4aed-bf55-26aed66f02db" (UID: "3d21b8ac-abd6-4aed-bf55-26aed66f02db"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.891890 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd481497-9604-41b6-918a-a8ec9fd6ad92-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"dd481497-9604-41b6-918a-a8ec9fd6ad92\") " pod="openstack/ovn-northd-0" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.891968 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd481497-9604-41b6-918a-a8ec9fd6ad92-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"dd481497-9604-41b6-918a-a8ec9fd6ad92\") " pod="openstack/ovn-northd-0" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.891998 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd481497-9604-41b6-918a-a8ec9fd6ad92-scripts\") pod \"ovn-northd-0\" (UID: \"dd481497-9604-41b6-918a-a8ec9fd6ad92\") " pod="openstack/ovn-northd-0" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.892049 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dd481497-9604-41b6-918a-a8ec9fd6ad92-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"dd481497-9604-41b6-918a-a8ec9fd6ad92\") " pod="openstack/ovn-northd-0" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.892876 5025 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dd481497-9604-41b6-918a-a8ec9fd6ad92-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"dd481497-9604-41b6-918a-a8ec9fd6ad92\") " pod="openstack/ovn-northd-0" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.893160 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd481497-9604-41b6-918a-a8ec9fd6ad92-scripts\") pod \"ovn-northd-0\" (UID: \"dd481497-9604-41b6-918a-a8ec9fd6ad92\") " pod="openstack/ovn-northd-0" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.892081 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khlvp\" (UniqueName: \"kubernetes.io/projected/dd481497-9604-41b6-918a-a8ec9fd6ad92-kube-api-access-khlvp\") pod \"ovn-northd-0\" (UID: \"dd481497-9604-41b6-918a-a8ec9fd6ad92\") " pod="openstack/ovn-northd-0" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.893292 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd481497-9604-41b6-918a-a8ec9fd6ad92-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"dd481497-9604-41b6-918a-a8ec9fd6ad92\") " pod="openstack/ovn-northd-0" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.893767 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd481497-9604-41b6-918a-a8ec9fd6ad92-config\") pod \"ovn-northd-0\" (UID: \"dd481497-9604-41b6-918a-a8ec9fd6ad92\") " pod="openstack/ovn-northd-0" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.893832 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2t4qz\" (UniqueName: \"kubernetes.io/projected/3d21b8ac-abd6-4aed-bf55-26aed66f02db-kube-api-access-2t4qz\") on node \"crc\" DevicePath \"\"" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 
08:33:41.893844 5025 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d21b8ac-abd6-4aed-bf55-26aed66f02db-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.893855 5025 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d21b8ac-abd6-4aed-bf55-26aed66f02db-config\") on node \"crc\" DevicePath \"\"" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.893891 5025 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d21b8ac-abd6-4aed-bf55-26aed66f02db-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.895078 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd481497-9604-41b6-918a-a8ec9fd6ad92-config\") pod \"ovn-northd-0\" (UID: \"dd481497-9604-41b6-918a-a8ec9fd6ad92\") " pod="openstack/ovn-northd-0" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.896977 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd481497-9604-41b6-918a-a8ec9fd6ad92-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"dd481497-9604-41b6-918a-a8ec9fd6ad92\") " pod="openstack/ovn-northd-0" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.901214 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd481497-9604-41b6-918a-a8ec9fd6ad92-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"dd481497-9604-41b6-918a-a8ec9fd6ad92\") " pod="openstack/ovn-northd-0" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.901263 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/dd481497-9604-41b6-918a-a8ec9fd6ad92-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"dd481497-9604-41b6-918a-a8ec9fd6ad92\") " pod="openstack/ovn-northd-0" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.919268 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khlvp\" (UniqueName: \"kubernetes.io/projected/dd481497-9604-41b6-918a-a8ec9fd6ad92-kube-api-access-khlvp\") pod \"ovn-northd-0\" (UID: \"dd481497-9604-41b6-918a-a8ec9fd6ad92\") " pod="openstack/ovn-northd-0" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.924014 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e570be52-2eb1-405e-af00-50e094030f46" path="/var/lib/kubelet/pods/e570be52-2eb1-405e-af00-50e094030f46/volumes" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.924618 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6e7ac34-8558-497d-be28-0d3ad7ff31ad" path="/var/lib/kubelet/pods/f6e7ac34-8558-497d-be28-0d3ad7ff31ad/volumes" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.952466 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-czj54"] Oct 07 08:33:41 crc kubenswrapper[5025]: E1007 08:33:41.952885 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d21b8ac-abd6-4aed-bf55-26aed66f02db" containerName="init" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.952902 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d21b8ac-abd6-4aed-bf55-26aed66f02db" containerName="init" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.953053 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d21b8ac-abd6-4aed-bf55-26aed66f02db" containerName="init" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.953618 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-czj54" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.955859 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.956442 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Oct 07 08:33:41 crc kubenswrapper[5025]: I1007 08:33:41.956915 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Oct 07 08:33:42 crc kubenswrapper[5025]: I1007 08:33:42.012392 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 07 08:33:42 crc kubenswrapper[5025]: I1007 08:33:42.023818 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-czj54"] Oct 07 08:33:42 crc kubenswrapper[5025]: I1007 08:33:42.097383 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4520ec77-3c49-4527-a716-90a88f6ec243-combined-ca-bundle\") pod \"swift-ring-rebalance-czj54\" (UID: \"4520ec77-3c49-4527-a716-90a88f6ec243\") " pod="openstack/swift-ring-rebalance-czj54" Oct 07 08:33:42 crc kubenswrapper[5025]: I1007 08:33:42.097435 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4520ec77-3c49-4527-a716-90a88f6ec243-dispersionconf\") pod \"swift-ring-rebalance-czj54\" (UID: \"4520ec77-3c49-4527-a716-90a88f6ec243\") " pod="openstack/swift-ring-rebalance-czj54" Oct 07 08:33:42 crc kubenswrapper[5025]: I1007 08:33:42.097488 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cvhs\" (UniqueName: 
\"kubernetes.io/projected/4520ec77-3c49-4527-a716-90a88f6ec243-kube-api-access-8cvhs\") pod \"swift-ring-rebalance-czj54\" (UID: \"4520ec77-3c49-4527-a716-90a88f6ec243\") " pod="openstack/swift-ring-rebalance-czj54" Oct 07 08:33:42 crc kubenswrapper[5025]: I1007 08:33:42.097515 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4520ec77-3c49-4527-a716-90a88f6ec243-swiftconf\") pod \"swift-ring-rebalance-czj54\" (UID: \"4520ec77-3c49-4527-a716-90a88f6ec243\") " pod="openstack/swift-ring-rebalance-czj54" Oct 07 08:33:42 crc kubenswrapper[5025]: I1007 08:33:42.097577 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4520ec77-3c49-4527-a716-90a88f6ec243-etc-swift\") pod \"swift-ring-rebalance-czj54\" (UID: \"4520ec77-3c49-4527-a716-90a88f6ec243\") " pod="openstack/swift-ring-rebalance-czj54" Oct 07 08:33:42 crc kubenswrapper[5025]: I1007 08:33:42.097597 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4520ec77-3c49-4527-a716-90a88f6ec243-scripts\") pod \"swift-ring-rebalance-czj54\" (UID: \"4520ec77-3c49-4527-a716-90a88f6ec243\") " pod="openstack/swift-ring-rebalance-czj54" Oct 07 08:33:42 crc kubenswrapper[5025]: I1007 08:33:42.097616 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4520ec77-3c49-4527-a716-90a88f6ec243-ring-data-devices\") pod \"swift-ring-rebalance-czj54\" (UID: \"4520ec77-3c49-4527-a716-90a88f6ec243\") " pod="openstack/swift-ring-rebalance-czj54" Oct 07 08:33:42 crc kubenswrapper[5025]: I1007 08:33:42.200604 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/4520ec77-3c49-4527-a716-90a88f6ec243-etc-swift\") pod \"swift-ring-rebalance-czj54\" (UID: \"4520ec77-3c49-4527-a716-90a88f6ec243\") " pod="openstack/swift-ring-rebalance-czj54" Oct 07 08:33:42 crc kubenswrapper[5025]: I1007 08:33:42.201111 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4520ec77-3c49-4527-a716-90a88f6ec243-etc-swift\") pod \"swift-ring-rebalance-czj54\" (UID: \"4520ec77-3c49-4527-a716-90a88f6ec243\") " pod="openstack/swift-ring-rebalance-czj54" Oct 07 08:33:42 crc kubenswrapper[5025]: I1007 08:33:42.201154 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4520ec77-3c49-4527-a716-90a88f6ec243-scripts\") pod \"swift-ring-rebalance-czj54\" (UID: \"4520ec77-3c49-4527-a716-90a88f6ec243\") " pod="openstack/swift-ring-rebalance-czj54" Oct 07 08:33:42 crc kubenswrapper[5025]: I1007 08:33:42.201186 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4520ec77-3c49-4527-a716-90a88f6ec243-ring-data-devices\") pod \"swift-ring-rebalance-czj54\" (UID: \"4520ec77-3c49-4527-a716-90a88f6ec243\") " pod="openstack/swift-ring-rebalance-czj54" Oct 07 08:33:42 crc kubenswrapper[5025]: I1007 08:33:42.201965 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9d7aa421-d9db-4ba6-882a-432d2e8b840f-etc-swift\") pod \"swift-storage-0\" (UID: \"9d7aa421-d9db-4ba6-882a-432d2e8b840f\") " pod="openstack/swift-storage-0" Oct 07 08:33:42 crc kubenswrapper[5025]: I1007 08:33:42.202054 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4520ec77-3c49-4527-a716-90a88f6ec243-combined-ca-bundle\") pod \"swift-ring-rebalance-czj54\" (UID: 
\"4520ec77-3c49-4527-a716-90a88f6ec243\") " pod="openstack/swift-ring-rebalance-czj54" Oct 07 08:33:42 crc kubenswrapper[5025]: I1007 08:33:42.202098 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4520ec77-3c49-4527-a716-90a88f6ec243-dispersionconf\") pod \"swift-ring-rebalance-czj54\" (UID: \"4520ec77-3c49-4527-a716-90a88f6ec243\") " pod="openstack/swift-ring-rebalance-czj54" Oct 07 08:33:42 crc kubenswrapper[5025]: I1007 08:33:42.202147 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cvhs\" (UniqueName: \"kubernetes.io/projected/4520ec77-3c49-4527-a716-90a88f6ec243-kube-api-access-8cvhs\") pod \"swift-ring-rebalance-czj54\" (UID: \"4520ec77-3c49-4527-a716-90a88f6ec243\") " pod="openstack/swift-ring-rebalance-czj54" Oct 07 08:33:42 crc kubenswrapper[5025]: I1007 08:33:42.202193 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4520ec77-3c49-4527-a716-90a88f6ec243-swiftconf\") pod \"swift-ring-rebalance-czj54\" (UID: \"4520ec77-3c49-4527-a716-90a88f6ec243\") " pod="openstack/swift-ring-rebalance-czj54" Oct 07 08:33:42 crc kubenswrapper[5025]: I1007 08:33:42.201907 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4520ec77-3c49-4527-a716-90a88f6ec243-scripts\") pod \"swift-ring-rebalance-czj54\" (UID: \"4520ec77-3c49-4527-a716-90a88f6ec243\") " pod="openstack/swift-ring-rebalance-czj54" Oct 07 08:33:42 crc kubenswrapper[5025]: I1007 08:33:42.201745 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4520ec77-3c49-4527-a716-90a88f6ec243-ring-data-devices\") pod \"swift-ring-rebalance-czj54\" (UID: \"4520ec77-3c49-4527-a716-90a88f6ec243\") " pod="openstack/swift-ring-rebalance-czj54" Oct 07 
08:33:42 crc kubenswrapper[5025]: E1007 08:33:42.203292 5025 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 07 08:33:42 crc kubenswrapper[5025]: E1007 08:33:42.203310 5025 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 07 08:33:42 crc kubenswrapper[5025]: E1007 08:33:42.203348 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d7aa421-d9db-4ba6-882a-432d2e8b840f-etc-swift podName:9d7aa421-d9db-4ba6-882a-432d2e8b840f nodeName:}" failed. No retries permitted until 2025-10-07 08:33:43.203335622 +0000 UTC m=+1030.012649766 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9d7aa421-d9db-4ba6-882a-432d2e8b840f-etc-swift") pod "swift-storage-0" (UID: "9d7aa421-d9db-4ba6-882a-432d2e8b840f") : configmap "swift-ring-files" not found Oct 07 08:33:42 crc kubenswrapper[5025]: I1007 08:33:42.210235 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4520ec77-3c49-4527-a716-90a88f6ec243-swiftconf\") pod \"swift-ring-rebalance-czj54\" (UID: \"4520ec77-3c49-4527-a716-90a88f6ec243\") " pod="openstack/swift-ring-rebalance-czj54" Oct 07 08:33:42 crc kubenswrapper[5025]: I1007 08:33:42.214182 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4520ec77-3c49-4527-a716-90a88f6ec243-combined-ca-bundle\") pod \"swift-ring-rebalance-czj54\" (UID: \"4520ec77-3c49-4527-a716-90a88f6ec243\") " pod="openstack/swift-ring-rebalance-czj54" Oct 07 08:33:42 crc kubenswrapper[5025]: I1007 08:33:42.226164 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4520ec77-3c49-4527-a716-90a88f6ec243-dispersionconf\") 
pod \"swift-ring-rebalance-czj54\" (UID: \"4520ec77-3c49-4527-a716-90a88f6ec243\") " pod="openstack/swift-ring-rebalance-czj54" Oct 07 08:33:42 crc kubenswrapper[5025]: I1007 08:33:42.243427 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cvhs\" (UniqueName: \"kubernetes.io/projected/4520ec77-3c49-4527-a716-90a88f6ec243-kube-api-access-8cvhs\") pod \"swift-ring-rebalance-czj54\" (UID: \"4520ec77-3c49-4527-a716-90a88f6ec243\") " pod="openstack/swift-ring-rebalance-czj54" Oct 07 08:33:42 crc kubenswrapper[5025]: I1007 08:33:42.297253 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-czj54" Oct 07 08:33:42 crc kubenswrapper[5025]: I1007 08:33:42.366468 5025 generic.go:334] "Generic (PLEG): container finished" podID="9ec748e5-296b-4c72-979f-bf49a6dceb02" containerID="f294791cfaa038ec16871085a092ef3c33828bea6c8105be9a7cad946bafd253" exitCode=0 Oct 07 08:33:42 crc kubenswrapper[5025]: I1007 08:33:42.366569 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-rtggt" event={"ID":"9ec748e5-296b-4c72-979f-bf49a6dceb02","Type":"ContainerDied","Data":"f294791cfaa038ec16871085a092ef3c33828bea6c8105be9a7cad946bafd253"} Oct 07 08:33:42 crc kubenswrapper[5025]: I1007 08:33:42.377555 5025 generic.go:334] "Generic (PLEG): container finished" podID="3d21b8ac-abd6-4aed-bf55-26aed66f02db" containerID="914822833fd2ce915331471b81cb8b2a07846b7fe60d4b4e271ff7572d9819c4" exitCode=0 Oct 07 08:33:42 crc kubenswrapper[5025]: I1007 08:33:42.377640 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-z8jm7" event={"ID":"3d21b8ac-abd6-4aed-bf55-26aed66f02db","Type":"ContainerDied","Data":"914822833fd2ce915331471b81cb8b2a07846b7fe60d4b4e271ff7572d9819c4"} Oct 07 08:33:42 crc kubenswrapper[5025]: I1007 08:33:42.377675 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-z8jm7" 
event={"ID":"3d21b8ac-abd6-4aed-bf55-26aed66f02db","Type":"ContainerDied","Data":"926d801363d7a3b41fd6f8b20d743387ff54f558d95834fe2db800c2bef9c977"} Oct 07 08:33:42 crc kubenswrapper[5025]: I1007 08:33:42.377684 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-z8jm7" Oct 07 08:33:42 crc kubenswrapper[5025]: I1007 08:33:42.377695 5025 scope.go:117] "RemoveContainer" containerID="914822833fd2ce915331471b81cb8b2a07846b7fe60d4b4e271ff7572d9819c4" Oct 07 08:33:42 crc kubenswrapper[5025]: I1007 08:33:42.387576 5025 generic.go:334] "Generic (PLEG): container finished" podID="b144a7bc-c1a2-4777-bca4-d0c36483851f" containerID="ba1867fcb4ec3ebf369c24bcb9ef7121472d19bf0389318e8ea1d129d69f4036" exitCode=0 Oct 07 08:33:42 crc kubenswrapper[5025]: I1007 08:33:42.387707 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-xw2wv" event={"ID":"b144a7bc-c1a2-4777-bca4-d0c36483851f","Type":"ContainerDied","Data":"ba1867fcb4ec3ebf369c24bcb9ef7121472d19bf0389318e8ea1d129d69f4036"} Oct 07 08:33:42 crc kubenswrapper[5025]: I1007 08:33:42.429924 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-cb69l" podStartSLOduration=3.42990519 podStartE2EDuration="3.42990519s" podCreationTimestamp="2025-10-07 08:33:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:33:42.429865688 +0000 UTC m=+1029.239179832" watchObservedRunningTime="2025-10-07 08:33:42.42990519 +0000 UTC m=+1029.239219334" Oct 07 08:33:42 crc kubenswrapper[5025]: I1007 08:33:42.473865 5025 scope.go:117] "RemoveContainer" containerID="914822833fd2ce915331471b81cb8b2a07846b7fe60d4b4e271ff7572d9819c4" Oct 07 08:33:42 crc kubenswrapper[5025]: E1007 08:33:42.475591 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"914822833fd2ce915331471b81cb8b2a07846b7fe60d4b4e271ff7572d9819c4\": container with ID starting with 914822833fd2ce915331471b81cb8b2a07846b7fe60d4b4e271ff7572d9819c4 not found: ID does not exist" containerID="914822833fd2ce915331471b81cb8b2a07846b7fe60d4b4e271ff7572d9819c4" Oct 07 08:33:42 crc kubenswrapper[5025]: I1007 08:33:42.475624 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"914822833fd2ce915331471b81cb8b2a07846b7fe60d4b4e271ff7572d9819c4"} err="failed to get container status \"914822833fd2ce915331471b81cb8b2a07846b7fe60d4b4e271ff7572d9819c4\": rpc error: code = NotFound desc = could not find container \"914822833fd2ce915331471b81cb8b2a07846b7fe60d4b4e271ff7572d9819c4\": container with ID starting with 914822833fd2ce915331471b81cb8b2a07846b7fe60d4b4e271ff7572d9819c4 not found: ID does not exist" Oct 07 08:33:42 crc kubenswrapper[5025]: I1007 08:33:42.479030 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-z8jm7"] Oct 07 08:33:42 crc kubenswrapper[5025]: I1007 08:33:42.485934 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-z8jm7"] Oct 07 08:33:42 crc kubenswrapper[5025]: I1007 08:33:42.588628 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 07 08:33:42 crc kubenswrapper[5025]: E1007 08:33:42.671640 5025 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Oct 07 08:33:42 crc kubenswrapper[5025]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/b144a7bc-c1a2-4777-bca4-d0c36483851f/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 07 08:33:42 crc kubenswrapper[5025]: > podSandboxID="05845f2d16fb8d9c2e155c711a9c49bc69c6641c46939cad7dee7c049bf68f28" Oct 07 08:33:42 crc kubenswrapper[5025]: E1007 08:33:42.671812 5025 kuberuntime_manager.go:1274] 
"Unhandled Error" err=< Oct 07 08:33:42 crc kubenswrapper[5025]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n654h99h64ch5dbh6dh555h587h64bh5cfh647h5fdh57ch679h9h597h5f5hbch59bh54fh575h566h667h586h5f5h65ch5bch57h68h65ch58bh694h5cfq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sgmx7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-8554648995-xw2wv_openstack(b144a7bc-c1a2-4777-bca4-d0c36483851f): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/b144a7bc-c1a2-4777-bca4-d0c36483851f/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 07 08:33:42 crc kubenswrapper[5025]: > logger="UnhandledError" Oct 07 08:33:42 crc kubenswrapper[5025]: E1007 08:33:42.673226 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/b144a7bc-c1a2-4777-bca4-d0c36483851f/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-8554648995-xw2wv" podUID="b144a7bc-c1a2-4777-bca4-d0c36483851f" Oct 07 08:33:42 crc kubenswrapper[5025]: I1007 08:33:42.874271 5025 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-czj54"] Oct 07 08:33:42 crc kubenswrapper[5025]: W1007 08:33:42.882990 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4520ec77_3c49_4527_a716_90a88f6ec243.slice/crio-e0ebdda49d1e96ab7d58629c6b91113688bce2b1d8ccc5bccf83c9b335bf24e4 WatchSource:0}: Error finding container e0ebdda49d1e96ab7d58629c6b91113688bce2b1d8ccc5bccf83c9b335bf24e4: Status 404 returned error can't find the container with id e0ebdda49d1e96ab7d58629c6b91113688bce2b1d8ccc5bccf83c9b335bf24e4 Oct 07 08:33:43 crc kubenswrapper[5025]: E1007 08:33:43.220669 5025 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 07 08:33:43 crc kubenswrapper[5025]: E1007 08:33:43.220979 5025 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 07 08:33:43 crc kubenswrapper[5025]: E1007 08:33:43.221030 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d7aa421-d9db-4ba6-882a-432d2e8b840f-etc-swift podName:9d7aa421-d9db-4ba6-882a-432d2e8b840f nodeName:}" failed. No retries permitted until 2025-10-07 08:33:45.221015078 +0000 UTC m=+1032.030329222 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9d7aa421-d9db-4ba6-882a-432d2e8b840f-etc-swift") pod "swift-storage-0" (UID: "9d7aa421-d9db-4ba6-882a-432d2e8b840f") : configmap "swift-ring-files" not found Oct 07 08:33:43 crc kubenswrapper[5025]: I1007 08:33:43.221385 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9d7aa421-d9db-4ba6-882a-432d2e8b840f-etc-swift\") pod \"swift-storage-0\" (UID: \"9d7aa421-d9db-4ba6-882a-432d2e8b840f\") " pod="openstack/swift-storage-0" Oct 07 08:33:43 crc kubenswrapper[5025]: I1007 08:33:43.406251 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-czj54" event={"ID":"4520ec77-3c49-4527-a716-90a88f6ec243","Type":"ContainerStarted","Data":"e0ebdda49d1e96ab7d58629c6b91113688bce2b1d8ccc5bccf83c9b335bf24e4"} Oct 07 08:33:43 crc kubenswrapper[5025]: I1007 08:33:43.410859 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-rtggt" event={"ID":"9ec748e5-296b-4c72-979f-bf49a6dceb02","Type":"ContainerStarted","Data":"74f7937aae91a27bf6b4f33470428498e55ace843ddec617a1ada5e8d5040a1e"} Oct 07 08:33:43 crc kubenswrapper[5025]: I1007 08:33:43.410921 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-rtggt" Oct 07 08:33:43 crc kubenswrapper[5025]: I1007 08:33:43.414606 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"dd481497-9604-41b6-918a-a8ec9fd6ad92","Type":"ContainerStarted","Data":"676f8f36d75f61b1116c182d98be523441bd0a5a39ef78e376a8037326db1f1e"} Oct 07 08:33:43 crc kubenswrapper[5025]: I1007 08:33:43.430966 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-rtggt" podStartSLOduration=3.430947356 podStartE2EDuration="3.430947356s" podCreationTimestamp="2025-10-07 08:33:40 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:33:43.429694878 +0000 UTC m=+1030.239009032" watchObservedRunningTime="2025-10-07 08:33:43.430947356 +0000 UTC m=+1030.240261500" Oct 07 08:33:43 crc kubenswrapper[5025]: I1007 08:33:43.939078 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d21b8ac-abd6-4aed-bf55-26aed66f02db" path="/var/lib/kubelet/pods/3d21b8ac-abd6-4aed-bf55-26aed66f02db/volumes" Oct 07 08:33:44 crc kubenswrapper[5025]: I1007 08:33:44.436081 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-xw2wv" event={"ID":"b144a7bc-c1a2-4777-bca4-d0c36483851f","Type":"ContainerStarted","Data":"52d97614357a83b19267bf6691880647644bab472ff475627d1838da5adf4d9b"} Oct 07 08:33:44 crc kubenswrapper[5025]: I1007 08:33:44.436754 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-xw2wv" Oct 07 08:33:44 crc kubenswrapper[5025]: I1007 08:33:44.438630 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"dd481497-9604-41b6-918a-a8ec9fd6ad92","Type":"ContainerStarted","Data":"abdf43c41da1eadb0bcc52432e2289bb3f13803711e150dbffe1f1e1034c1ede"} Oct 07 08:33:44 crc kubenswrapper[5025]: I1007 08:33:44.438960 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"dd481497-9604-41b6-918a-a8ec9fd6ad92","Type":"ContainerStarted","Data":"ec2f854063009f0fa5535dfd83f2d60df934fb8c26cc1fa50754cfaf068846d7"} Oct 07 08:33:44 crc kubenswrapper[5025]: I1007 08:33:44.438990 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 07 08:33:44 crc kubenswrapper[5025]: I1007 08:33:44.458938 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-xw2wv" podStartSLOduration=4.458922353 
podStartE2EDuration="4.458922353s" podCreationTimestamp="2025-10-07 08:33:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:33:44.456488067 +0000 UTC m=+1031.265802231" watchObservedRunningTime="2025-10-07 08:33:44.458922353 +0000 UTC m=+1031.268236497" Oct 07 08:33:44 crc kubenswrapper[5025]: I1007 08:33:44.486363 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.426742195 podStartE2EDuration="3.486341569s" podCreationTimestamp="2025-10-07 08:33:41 +0000 UTC" firstStartedPulling="2025-10-07 08:33:42.618232564 +0000 UTC m=+1029.427546708" lastFinishedPulling="2025-10-07 08:33:43.677831928 +0000 UTC m=+1030.487146082" observedRunningTime="2025-10-07 08:33:44.476694668 +0000 UTC m=+1031.286008812" watchObservedRunningTime="2025-10-07 08:33:44.486341569 +0000 UTC m=+1031.295655713" Oct 07 08:33:45 crc kubenswrapper[5025]: I1007 08:33:45.262254 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9d7aa421-d9db-4ba6-882a-432d2e8b840f-etc-swift\") pod \"swift-storage-0\" (UID: \"9d7aa421-d9db-4ba6-882a-432d2e8b840f\") " pod="openstack/swift-storage-0" Oct 07 08:33:45 crc kubenswrapper[5025]: E1007 08:33:45.262589 5025 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 07 08:33:45 crc kubenswrapper[5025]: E1007 08:33:45.262612 5025 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 07 08:33:45 crc kubenswrapper[5025]: E1007 08:33:45.262666 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d7aa421-d9db-4ba6-882a-432d2e8b840f-etc-swift podName:9d7aa421-d9db-4ba6-882a-432d2e8b840f nodeName:}" failed. 
No retries permitted until 2025-10-07 08:33:49.262646396 +0000 UTC m=+1036.071960540 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9d7aa421-d9db-4ba6-882a-432d2e8b840f-etc-swift") pod "swift-storage-0" (UID: "9d7aa421-d9db-4ba6-882a-432d2e8b840f") : configmap "swift-ring-files" not found Oct 07 08:33:48 crc kubenswrapper[5025]: I1007 08:33:48.329125 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 07 08:33:48 crc kubenswrapper[5025]: I1007 08:33:48.329624 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 07 08:33:48 crc kubenswrapper[5025]: I1007 08:33:48.408366 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 07 08:33:48 crc kubenswrapper[5025]: I1007 08:33:48.459058 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 07 08:33:48 crc kubenswrapper[5025]: I1007 08:33:48.459120 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 07 08:33:48 crc kubenswrapper[5025]: I1007 08:33:48.468263 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-czj54" event={"ID":"4520ec77-3c49-4527-a716-90a88f6ec243","Type":"ContainerStarted","Data":"623ad4cfbe31a78123038509fd96a2cd137909f0a4c517859ab31753d48f5f79"} Oct 07 08:33:48 crc kubenswrapper[5025]: I1007 08:33:48.512295 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 07 08:33:48 crc kubenswrapper[5025]: I1007 08:33:48.524899 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 07 08:33:49 crc kubenswrapper[5025]: I1007 08:33:49.045152 5025 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/glance-db-create-lcwqv"] Oct 07 08:33:49 crc kubenswrapper[5025]: I1007 08:33:49.046387 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-lcwqv" Oct 07 08:33:49 crc kubenswrapper[5025]: I1007 08:33:49.067858 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-lcwqv"] Oct 07 08:33:49 crc kubenswrapper[5025]: I1007 08:33:49.162838 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2j762\" (UniqueName: \"kubernetes.io/projected/afea2d9f-2278-427e-a2aa-5f034f08a1bd-kube-api-access-2j762\") pod \"glance-db-create-lcwqv\" (UID: \"afea2d9f-2278-427e-a2aa-5f034f08a1bd\") " pod="openstack/glance-db-create-lcwqv" Oct 07 08:33:49 crc kubenswrapper[5025]: I1007 08:33:49.264327 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2j762\" (UniqueName: \"kubernetes.io/projected/afea2d9f-2278-427e-a2aa-5f034f08a1bd-kube-api-access-2j762\") pod \"glance-db-create-lcwqv\" (UID: \"afea2d9f-2278-427e-a2aa-5f034f08a1bd\") " pod="openstack/glance-db-create-lcwqv" Oct 07 08:33:49 crc kubenswrapper[5025]: I1007 08:33:49.264443 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9d7aa421-d9db-4ba6-882a-432d2e8b840f-etc-swift\") pod \"swift-storage-0\" (UID: \"9d7aa421-d9db-4ba6-882a-432d2e8b840f\") " pod="openstack/swift-storage-0" Oct 07 08:33:49 crc kubenswrapper[5025]: E1007 08:33:49.264600 5025 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 07 08:33:49 crc kubenswrapper[5025]: E1007 08:33:49.264613 5025 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 07 08:33:49 crc kubenswrapper[5025]: E1007 08:33:49.264647 
5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d7aa421-d9db-4ba6-882a-432d2e8b840f-etc-swift podName:9d7aa421-d9db-4ba6-882a-432d2e8b840f nodeName:}" failed. No retries permitted until 2025-10-07 08:33:57.264635326 +0000 UTC m=+1044.073949470 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9d7aa421-d9db-4ba6-882a-432d2e8b840f-etc-swift") pod "swift-storage-0" (UID: "9d7aa421-d9db-4ba6-882a-432d2e8b840f") : configmap "swift-ring-files" not found Oct 07 08:33:49 crc kubenswrapper[5025]: I1007 08:33:49.288321 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2j762\" (UniqueName: \"kubernetes.io/projected/afea2d9f-2278-427e-a2aa-5f034f08a1bd-kube-api-access-2j762\") pod \"glance-db-create-lcwqv\" (UID: \"afea2d9f-2278-427e-a2aa-5f034f08a1bd\") " pod="openstack/glance-db-create-lcwqv" Oct 07 08:33:49 crc kubenswrapper[5025]: I1007 08:33:49.360510 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-lcwqv" Oct 07 08:33:49 crc kubenswrapper[5025]: I1007 08:33:49.578788 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 07 08:33:49 crc kubenswrapper[5025]: I1007 08:33:49.616403 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-czj54" podStartSLOduration=3.905326639 podStartE2EDuration="8.616383948s" podCreationTimestamp="2025-10-07 08:33:41 +0000 UTC" firstStartedPulling="2025-10-07 08:33:42.886838914 +0000 UTC m=+1029.696153058" lastFinishedPulling="2025-10-07 08:33:47.597896223 +0000 UTC m=+1034.407210367" observedRunningTime="2025-10-07 08:33:49.531689976 +0000 UTC m=+1036.341004120" watchObservedRunningTime="2025-10-07 08:33:49.616383948 +0000 UTC m=+1036.425698092" Oct 07 08:33:49 crc kubenswrapper[5025]: I1007 08:33:49.831307 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-lcwqv"] Oct 07 08:33:50 crc kubenswrapper[5025]: I1007 08:33:50.462681 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8554648995-xw2wv" Oct 07 08:33:50 crc kubenswrapper[5025]: I1007 08:33:50.524978 5025 generic.go:334] "Generic (PLEG): container finished" podID="afea2d9f-2278-427e-a2aa-5f034f08a1bd" containerID="ba6d5fae88b902587605c5a5955ad73f4e04cfb1fb6234c0f017dfcfe8efd280" exitCode=0 Oct 07 08:33:50 crc kubenswrapper[5025]: I1007 08:33:50.525077 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-lcwqv" event={"ID":"afea2d9f-2278-427e-a2aa-5f034f08a1bd","Type":"ContainerDied","Data":"ba6d5fae88b902587605c5a5955ad73f4e04cfb1fb6234c0f017dfcfe8efd280"} Oct 07 08:33:50 crc kubenswrapper[5025]: I1007 08:33:50.525138 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-lcwqv" 
event={"ID":"afea2d9f-2278-427e-a2aa-5f034f08a1bd","Type":"ContainerStarted","Data":"5fabab24b43444f4f18440b28e13b4169554390f38f1d81a8a3581a0b553dc3f"} Oct 07 08:33:50 crc kubenswrapper[5025]: I1007 08:33:50.792166 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-rtggt" Oct 07 08:33:50 crc kubenswrapper[5025]: I1007 08:33:50.853790 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-xw2wv"] Oct 07 08:33:50 crc kubenswrapper[5025]: I1007 08:33:50.859166 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-xw2wv" podUID="b144a7bc-c1a2-4777-bca4-d0c36483851f" containerName="dnsmasq-dns" containerID="cri-o://52d97614357a83b19267bf6691880647644bab472ff475627d1838da5adf4d9b" gracePeriod=10 Oct 07 08:33:51 crc kubenswrapper[5025]: I1007 08:33:51.312076 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-xw2wv" Oct 07 08:33:51 crc kubenswrapper[5025]: I1007 08:33:51.498674 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b144a7bc-c1a2-4777-bca4-d0c36483851f-ovsdbserver-nb\") pod \"b144a7bc-c1a2-4777-bca4-d0c36483851f\" (UID: \"b144a7bc-c1a2-4777-bca4-d0c36483851f\") " Oct 07 08:33:51 crc kubenswrapper[5025]: I1007 08:33:51.499663 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b144a7bc-c1a2-4777-bca4-d0c36483851f-dns-svc\") pod \"b144a7bc-c1a2-4777-bca4-d0c36483851f\" (UID: \"b144a7bc-c1a2-4777-bca4-d0c36483851f\") " Oct 07 08:33:51 crc kubenswrapper[5025]: I1007 08:33:51.499789 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b144a7bc-c1a2-4777-bca4-d0c36483851f-ovsdbserver-sb\") 
pod \"b144a7bc-c1a2-4777-bca4-d0c36483851f\" (UID: \"b144a7bc-c1a2-4777-bca4-d0c36483851f\") " Oct 07 08:33:51 crc kubenswrapper[5025]: I1007 08:33:51.499967 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgmx7\" (UniqueName: \"kubernetes.io/projected/b144a7bc-c1a2-4777-bca4-d0c36483851f-kube-api-access-sgmx7\") pod \"b144a7bc-c1a2-4777-bca4-d0c36483851f\" (UID: \"b144a7bc-c1a2-4777-bca4-d0c36483851f\") " Oct 07 08:33:51 crc kubenswrapper[5025]: I1007 08:33:51.500130 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b144a7bc-c1a2-4777-bca4-d0c36483851f-config\") pod \"b144a7bc-c1a2-4777-bca4-d0c36483851f\" (UID: \"b144a7bc-c1a2-4777-bca4-d0c36483851f\") " Oct 07 08:33:51 crc kubenswrapper[5025]: I1007 08:33:51.514847 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b144a7bc-c1a2-4777-bca4-d0c36483851f-kube-api-access-sgmx7" (OuterVolumeSpecName: "kube-api-access-sgmx7") pod "b144a7bc-c1a2-4777-bca4-d0c36483851f" (UID: "b144a7bc-c1a2-4777-bca4-d0c36483851f"). InnerVolumeSpecName "kube-api-access-sgmx7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:33:51 crc kubenswrapper[5025]: I1007 08:33:51.537242 5025 generic.go:334] "Generic (PLEG): container finished" podID="b144a7bc-c1a2-4777-bca4-d0c36483851f" containerID="52d97614357a83b19267bf6691880647644bab472ff475627d1838da5adf4d9b" exitCode=0 Oct 07 08:33:51 crc kubenswrapper[5025]: I1007 08:33:51.537296 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-xw2wv" event={"ID":"b144a7bc-c1a2-4777-bca4-d0c36483851f","Type":"ContainerDied","Data":"52d97614357a83b19267bf6691880647644bab472ff475627d1838da5adf4d9b"} Oct 07 08:33:51 crc kubenswrapper[5025]: I1007 08:33:51.537828 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-xw2wv" event={"ID":"b144a7bc-c1a2-4777-bca4-d0c36483851f","Type":"ContainerDied","Data":"05845f2d16fb8d9c2e155c711a9c49bc69c6641c46939cad7dee7c049bf68f28"} Oct 07 08:33:51 crc kubenswrapper[5025]: I1007 08:33:51.537309 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-xw2wv" Oct 07 08:33:51 crc kubenswrapper[5025]: I1007 08:33:51.537861 5025 scope.go:117] "RemoveContainer" containerID="52d97614357a83b19267bf6691880647644bab472ff475627d1838da5adf4d9b" Oct 07 08:33:51 crc kubenswrapper[5025]: I1007 08:33:51.550805 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b144a7bc-c1a2-4777-bca4-d0c36483851f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b144a7bc-c1a2-4777-bca4-d0c36483851f" (UID: "b144a7bc-c1a2-4777-bca4-d0c36483851f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:33:51 crc kubenswrapper[5025]: I1007 08:33:51.555748 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b144a7bc-c1a2-4777-bca4-d0c36483851f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b144a7bc-c1a2-4777-bca4-d0c36483851f" (UID: "b144a7bc-c1a2-4777-bca4-d0c36483851f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:33:51 crc kubenswrapper[5025]: I1007 08:33:51.556293 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b144a7bc-c1a2-4777-bca4-d0c36483851f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b144a7bc-c1a2-4777-bca4-d0c36483851f" (UID: "b144a7bc-c1a2-4777-bca4-d0c36483851f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:33:51 crc kubenswrapper[5025]: I1007 08:33:51.571968 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b144a7bc-c1a2-4777-bca4-d0c36483851f-config" (OuterVolumeSpecName: "config") pod "b144a7bc-c1a2-4777-bca4-d0c36483851f" (UID: "b144a7bc-c1a2-4777-bca4-d0c36483851f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:33:51 crc kubenswrapper[5025]: I1007 08:33:51.601828 5025 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b144a7bc-c1a2-4777-bca4-d0c36483851f-config\") on node \"crc\" DevicePath \"\"" Oct 07 08:33:51 crc kubenswrapper[5025]: I1007 08:33:51.601860 5025 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b144a7bc-c1a2-4777-bca4-d0c36483851f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 08:33:51 crc kubenswrapper[5025]: I1007 08:33:51.601869 5025 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b144a7bc-c1a2-4777-bca4-d0c36483851f-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 08:33:51 crc kubenswrapper[5025]: I1007 08:33:51.601877 5025 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b144a7bc-c1a2-4777-bca4-d0c36483851f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 08:33:51 crc kubenswrapper[5025]: I1007 08:33:51.601887 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgmx7\" (UniqueName: \"kubernetes.io/projected/b144a7bc-c1a2-4777-bca4-d0c36483851f-kube-api-access-sgmx7\") on node \"crc\" DevicePath \"\"" Oct 07 08:33:51 crc kubenswrapper[5025]: I1007 08:33:51.642389 5025 scope.go:117] "RemoveContainer" containerID="ba1867fcb4ec3ebf369c24bcb9ef7121472d19bf0389318e8ea1d129d69f4036" Oct 07 08:33:51 crc kubenswrapper[5025]: I1007 08:33:51.664337 5025 scope.go:117] "RemoveContainer" containerID="52d97614357a83b19267bf6691880647644bab472ff475627d1838da5adf4d9b" Oct 07 08:33:51 crc kubenswrapper[5025]: E1007 08:33:51.665404 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52d97614357a83b19267bf6691880647644bab472ff475627d1838da5adf4d9b\": 
container with ID starting with 52d97614357a83b19267bf6691880647644bab472ff475627d1838da5adf4d9b not found: ID does not exist" containerID="52d97614357a83b19267bf6691880647644bab472ff475627d1838da5adf4d9b" Oct 07 08:33:51 crc kubenswrapper[5025]: I1007 08:33:51.665440 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52d97614357a83b19267bf6691880647644bab472ff475627d1838da5adf4d9b"} err="failed to get container status \"52d97614357a83b19267bf6691880647644bab472ff475627d1838da5adf4d9b\": rpc error: code = NotFound desc = could not find container \"52d97614357a83b19267bf6691880647644bab472ff475627d1838da5adf4d9b\": container with ID starting with 52d97614357a83b19267bf6691880647644bab472ff475627d1838da5adf4d9b not found: ID does not exist" Oct 07 08:33:51 crc kubenswrapper[5025]: I1007 08:33:51.665467 5025 scope.go:117] "RemoveContainer" containerID="ba1867fcb4ec3ebf369c24bcb9ef7121472d19bf0389318e8ea1d129d69f4036" Oct 07 08:33:51 crc kubenswrapper[5025]: E1007 08:33:51.668315 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba1867fcb4ec3ebf369c24bcb9ef7121472d19bf0389318e8ea1d129d69f4036\": container with ID starting with ba1867fcb4ec3ebf369c24bcb9ef7121472d19bf0389318e8ea1d129d69f4036 not found: ID does not exist" containerID="ba1867fcb4ec3ebf369c24bcb9ef7121472d19bf0389318e8ea1d129d69f4036" Oct 07 08:33:51 crc kubenswrapper[5025]: I1007 08:33:51.668356 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba1867fcb4ec3ebf369c24bcb9ef7121472d19bf0389318e8ea1d129d69f4036"} err="failed to get container status \"ba1867fcb4ec3ebf369c24bcb9ef7121472d19bf0389318e8ea1d129d69f4036\": rpc error: code = NotFound desc = could not find container \"ba1867fcb4ec3ebf369c24bcb9ef7121472d19bf0389318e8ea1d129d69f4036\": container with ID starting with 
ba1867fcb4ec3ebf369c24bcb9ef7121472d19bf0389318e8ea1d129d69f4036 not found: ID does not exist" Oct 07 08:33:51 crc kubenswrapper[5025]: I1007 08:33:51.758577 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-lcwqv" Oct 07 08:33:51 crc kubenswrapper[5025]: I1007 08:33:51.893399 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-xw2wv"] Oct 07 08:33:51 crc kubenswrapper[5025]: I1007 08:33:51.893456 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-xw2wv"] Oct 07 08:33:51 crc kubenswrapper[5025]: I1007 08:33:51.906065 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2j762\" (UniqueName: \"kubernetes.io/projected/afea2d9f-2278-427e-a2aa-5f034f08a1bd-kube-api-access-2j762\") pod \"afea2d9f-2278-427e-a2aa-5f034f08a1bd\" (UID: \"afea2d9f-2278-427e-a2aa-5f034f08a1bd\") " Oct 07 08:33:51 crc kubenswrapper[5025]: I1007 08:33:51.908860 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afea2d9f-2278-427e-a2aa-5f034f08a1bd-kube-api-access-2j762" (OuterVolumeSpecName: "kube-api-access-2j762") pod "afea2d9f-2278-427e-a2aa-5f034f08a1bd" (UID: "afea2d9f-2278-427e-a2aa-5f034f08a1bd"). InnerVolumeSpecName "kube-api-access-2j762". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:33:51 crc kubenswrapper[5025]: I1007 08:33:51.924710 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b144a7bc-c1a2-4777-bca4-d0c36483851f" path="/var/lib/kubelet/pods/b144a7bc-c1a2-4777-bca4-d0c36483851f/volumes" Oct 07 08:33:52 crc kubenswrapper[5025]: I1007 08:33:52.008304 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2j762\" (UniqueName: \"kubernetes.io/projected/afea2d9f-2278-427e-a2aa-5f034f08a1bd-kube-api-access-2j762\") on node \"crc\" DevicePath \"\"" Oct 07 08:33:52 crc kubenswrapper[5025]: I1007 08:33:52.545610 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-lcwqv" Oct 07 08:33:52 crc kubenswrapper[5025]: I1007 08:33:52.545685 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-lcwqv" event={"ID":"afea2d9f-2278-427e-a2aa-5f034f08a1bd","Type":"ContainerDied","Data":"5fabab24b43444f4f18440b28e13b4169554390f38f1d81a8a3581a0b553dc3f"} Oct 07 08:33:52 crc kubenswrapper[5025]: I1007 08:33:52.545972 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5fabab24b43444f4f18440b28e13b4169554390f38f1d81a8a3581a0b553dc3f" Oct 07 08:33:55 crc kubenswrapper[5025]: I1007 08:33:55.576456 5025 generic.go:334] "Generic (PLEG): container finished" podID="4520ec77-3c49-4527-a716-90a88f6ec243" containerID="623ad4cfbe31a78123038509fd96a2cd137909f0a4c517859ab31753d48f5f79" exitCode=0 Oct 07 08:33:55 crc kubenswrapper[5025]: I1007 08:33:55.576757 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-czj54" event={"ID":"4520ec77-3c49-4527-a716-90a88f6ec243","Type":"ContainerDied","Data":"623ad4cfbe31a78123038509fd96a2cd137909f0a4c517859ab31753d48f5f79"} Oct 07 08:33:55 crc kubenswrapper[5025]: I1007 08:33:55.934594 5025 patch_prober.go:28] interesting pod/machine-config-daemon-2dj2t 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 08:33:55 crc kubenswrapper[5025]: I1007 08:33:55.934700 5025 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 08:33:55 crc kubenswrapper[5025]: I1007 08:33:55.951853 5025 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" Oct 07 08:33:55 crc kubenswrapper[5025]: I1007 08:33:55.953794 5025 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c91f90218018dc6579e30a1d0a59af84fe0152b7bc9f23fa2787bc6ffa336d31"} pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 08:33:55 crc kubenswrapper[5025]: I1007 08:33:55.954126 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" containerID="cri-o://c91f90218018dc6579e30a1d0a59af84fe0152b7bc9f23fa2787bc6ffa336d31" gracePeriod=600 Oct 07 08:33:56 crc kubenswrapper[5025]: I1007 08:33:56.601041 5025 generic.go:334] "Generic (PLEG): container finished" podID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerID="c91f90218018dc6579e30a1d0a59af84fe0152b7bc9f23fa2787bc6ffa336d31" exitCode=0 Oct 07 08:33:56 crc kubenswrapper[5025]: I1007 08:33:56.601214 5025 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" event={"ID":"b4849c41-22e1-400e-8e11-096da49ef1b2","Type":"ContainerDied","Data":"c91f90218018dc6579e30a1d0a59af84fe0152b7bc9f23fa2787bc6ffa336d31"} Oct 07 08:33:56 crc kubenswrapper[5025]: I1007 08:33:56.601780 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" event={"ID":"b4849c41-22e1-400e-8e11-096da49ef1b2","Type":"ContainerStarted","Data":"1b24ca81daef9a1bfdd6aba62c02272733ea9a1c7e3d4e5bdd90b9e74defc3eb"} Oct 07 08:33:56 crc kubenswrapper[5025]: I1007 08:33:56.601826 5025 scope.go:117] "RemoveContainer" containerID="e73f6c9680aced0c67b362a72f57135c1bc51cd7c5425767d6e7d7481054ac9b" Oct 07 08:33:56 crc kubenswrapper[5025]: I1007 08:33:56.905192 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-czj54" Oct 07 08:33:57 crc kubenswrapper[5025]: I1007 08:33:57.005682 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4520ec77-3c49-4527-a716-90a88f6ec243-dispersionconf\") pod \"4520ec77-3c49-4527-a716-90a88f6ec243\" (UID: \"4520ec77-3c49-4527-a716-90a88f6ec243\") " Oct 07 08:33:57 crc kubenswrapper[5025]: I1007 08:33:57.005730 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cvhs\" (UniqueName: \"kubernetes.io/projected/4520ec77-3c49-4527-a716-90a88f6ec243-kube-api-access-8cvhs\") pod \"4520ec77-3c49-4527-a716-90a88f6ec243\" (UID: \"4520ec77-3c49-4527-a716-90a88f6ec243\") " Oct 07 08:33:57 crc kubenswrapper[5025]: I1007 08:33:57.005760 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4520ec77-3c49-4527-a716-90a88f6ec243-combined-ca-bundle\") pod \"4520ec77-3c49-4527-a716-90a88f6ec243\" (UID: 
\"4520ec77-3c49-4527-a716-90a88f6ec243\") " Oct 07 08:33:57 crc kubenswrapper[5025]: I1007 08:33:57.005783 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4520ec77-3c49-4527-a716-90a88f6ec243-swiftconf\") pod \"4520ec77-3c49-4527-a716-90a88f6ec243\" (UID: \"4520ec77-3c49-4527-a716-90a88f6ec243\") " Oct 07 08:33:57 crc kubenswrapper[5025]: I1007 08:33:57.005813 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4520ec77-3c49-4527-a716-90a88f6ec243-ring-data-devices\") pod \"4520ec77-3c49-4527-a716-90a88f6ec243\" (UID: \"4520ec77-3c49-4527-a716-90a88f6ec243\") " Oct 07 08:33:57 crc kubenswrapper[5025]: I1007 08:33:57.005854 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4520ec77-3c49-4527-a716-90a88f6ec243-scripts\") pod \"4520ec77-3c49-4527-a716-90a88f6ec243\" (UID: \"4520ec77-3c49-4527-a716-90a88f6ec243\") " Oct 07 08:33:57 crc kubenswrapper[5025]: I1007 08:33:57.005900 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4520ec77-3c49-4527-a716-90a88f6ec243-etc-swift\") pod \"4520ec77-3c49-4527-a716-90a88f6ec243\" (UID: \"4520ec77-3c49-4527-a716-90a88f6ec243\") " Oct 07 08:33:57 crc kubenswrapper[5025]: I1007 08:33:57.007531 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4520ec77-3c49-4527-a716-90a88f6ec243-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "4520ec77-3c49-4527-a716-90a88f6ec243" (UID: "4520ec77-3c49-4527-a716-90a88f6ec243"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:33:57 crc kubenswrapper[5025]: I1007 08:33:57.007831 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4520ec77-3c49-4527-a716-90a88f6ec243-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "4520ec77-3c49-4527-a716-90a88f6ec243" (UID: "4520ec77-3c49-4527-a716-90a88f6ec243"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:33:57 crc kubenswrapper[5025]: I1007 08:33:57.011433 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4520ec77-3c49-4527-a716-90a88f6ec243-kube-api-access-8cvhs" (OuterVolumeSpecName: "kube-api-access-8cvhs") pod "4520ec77-3c49-4527-a716-90a88f6ec243" (UID: "4520ec77-3c49-4527-a716-90a88f6ec243"). InnerVolumeSpecName "kube-api-access-8cvhs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:33:57 crc kubenswrapper[5025]: I1007 08:33:57.018739 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4520ec77-3c49-4527-a716-90a88f6ec243-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "4520ec77-3c49-4527-a716-90a88f6ec243" (UID: "4520ec77-3c49-4527-a716-90a88f6ec243"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:33:57 crc kubenswrapper[5025]: I1007 08:33:57.029129 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4520ec77-3c49-4527-a716-90a88f6ec243-scripts" (OuterVolumeSpecName: "scripts") pod "4520ec77-3c49-4527-a716-90a88f6ec243" (UID: "4520ec77-3c49-4527-a716-90a88f6ec243"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:33:57 crc kubenswrapper[5025]: I1007 08:33:57.030340 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4520ec77-3c49-4527-a716-90a88f6ec243-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "4520ec77-3c49-4527-a716-90a88f6ec243" (UID: "4520ec77-3c49-4527-a716-90a88f6ec243"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:33:57 crc kubenswrapper[5025]: I1007 08:33:57.055660 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4520ec77-3c49-4527-a716-90a88f6ec243-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4520ec77-3c49-4527-a716-90a88f6ec243" (UID: "4520ec77-3c49-4527-a716-90a88f6ec243"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:33:57 crc kubenswrapper[5025]: I1007 08:33:57.076020 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 07 08:33:57 crc kubenswrapper[5025]: I1007 08:33:57.108701 5025 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4520ec77-3c49-4527-a716-90a88f6ec243-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 07 08:33:57 crc kubenswrapper[5025]: I1007 08:33:57.108751 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cvhs\" (UniqueName: \"kubernetes.io/projected/4520ec77-3c49-4527-a716-90a88f6ec243-kube-api-access-8cvhs\") on node \"crc\" DevicePath \"\"" Oct 07 08:33:57 crc kubenswrapper[5025]: I1007 08:33:57.108774 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4520ec77-3c49-4527-a716-90a88f6ec243-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:33:57 crc kubenswrapper[5025]: I1007 08:33:57.108792 5025 
reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4520ec77-3c49-4527-a716-90a88f6ec243-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 07 08:33:57 crc kubenswrapper[5025]: I1007 08:33:57.108808 5025 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4520ec77-3c49-4527-a716-90a88f6ec243-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 07 08:33:57 crc kubenswrapper[5025]: I1007 08:33:57.108824 5025 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4520ec77-3c49-4527-a716-90a88f6ec243-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 08:33:57 crc kubenswrapper[5025]: I1007 08:33:57.108839 5025 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4520ec77-3c49-4527-a716-90a88f6ec243-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 07 08:33:57 crc kubenswrapper[5025]: I1007 08:33:57.313134 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9d7aa421-d9db-4ba6-882a-432d2e8b840f-etc-swift\") pod \"swift-storage-0\" (UID: \"9d7aa421-d9db-4ba6-882a-432d2e8b840f\") " pod="openstack/swift-storage-0" Oct 07 08:33:57 crc kubenswrapper[5025]: I1007 08:33:57.318401 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9d7aa421-d9db-4ba6-882a-432d2e8b840f-etc-swift\") pod \"swift-storage-0\" (UID: \"9d7aa421-d9db-4ba6-882a-432d2e8b840f\") " pod="openstack/swift-storage-0" Oct 07 08:33:57 crc kubenswrapper[5025]: I1007 08:33:57.377297 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 07 08:33:57 crc kubenswrapper[5025]: I1007 08:33:57.617323 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-czj54" event={"ID":"4520ec77-3c49-4527-a716-90a88f6ec243","Type":"ContainerDied","Data":"e0ebdda49d1e96ab7d58629c6b91113688bce2b1d8ccc5bccf83c9b335bf24e4"} Oct 07 08:33:57 crc kubenswrapper[5025]: I1007 08:33:57.617683 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-czj54" Oct 07 08:33:57 crc kubenswrapper[5025]: I1007 08:33:57.617704 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0ebdda49d1e96ab7d58629c6b91113688bce2b1d8ccc5bccf83c9b335bf24e4" Oct 07 08:33:57 crc kubenswrapper[5025]: I1007 08:33:57.954023 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 07 08:33:58 crc kubenswrapper[5025]: I1007 08:33:58.261345 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-gzznk"] Oct 07 08:33:58 crc kubenswrapper[5025]: E1007 08:33:58.262223 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4520ec77-3c49-4527-a716-90a88f6ec243" containerName="swift-ring-rebalance" Oct 07 08:33:58 crc kubenswrapper[5025]: I1007 08:33:58.262255 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="4520ec77-3c49-4527-a716-90a88f6ec243" containerName="swift-ring-rebalance" Oct 07 08:33:58 crc kubenswrapper[5025]: E1007 08:33:58.262267 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afea2d9f-2278-427e-a2aa-5f034f08a1bd" containerName="mariadb-database-create" Oct 07 08:33:58 crc kubenswrapper[5025]: I1007 08:33:58.262276 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="afea2d9f-2278-427e-a2aa-5f034f08a1bd" containerName="mariadb-database-create" Oct 07 08:33:58 crc kubenswrapper[5025]: E1007 08:33:58.262288 5025 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="b144a7bc-c1a2-4777-bca4-d0c36483851f" containerName="init" Oct 07 08:33:58 crc kubenswrapper[5025]: I1007 08:33:58.262297 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="b144a7bc-c1a2-4777-bca4-d0c36483851f" containerName="init" Oct 07 08:33:58 crc kubenswrapper[5025]: E1007 08:33:58.262322 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b144a7bc-c1a2-4777-bca4-d0c36483851f" containerName="dnsmasq-dns" Oct 07 08:33:58 crc kubenswrapper[5025]: I1007 08:33:58.262330 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="b144a7bc-c1a2-4777-bca4-d0c36483851f" containerName="dnsmasq-dns" Oct 07 08:33:58 crc kubenswrapper[5025]: I1007 08:33:58.262533 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="4520ec77-3c49-4527-a716-90a88f6ec243" containerName="swift-ring-rebalance" Oct 07 08:33:58 crc kubenswrapper[5025]: I1007 08:33:58.262574 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="b144a7bc-c1a2-4777-bca4-d0c36483851f" containerName="dnsmasq-dns" Oct 07 08:33:58 crc kubenswrapper[5025]: I1007 08:33:58.262592 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="afea2d9f-2278-427e-a2aa-5f034f08a1bd" containerName="mariadb-database-create" Oct 07 08:33:58 crc kubenswrapper[5025]: I1007 08:33:58.263250 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-gzznk" Oct 07 08:33:58 crc kubenswrapper[5025]: I1007 08:33:58.278524 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-gzznk"] Oct 07 08:33:58 crc kubenswrapper[5025]: I1007 08:33:58.429332 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vrqk\" (UniqueName: \"kubernetes.io/projected/9a69558c-23d2-4755-95a7-0497a3baec0c-kube-api-access-6vrqk\") pod \"keystone-db-create-gzznk\" (UID: \"9a69558c-23d2-4755-95a7-0497a3baec0c\") " pod="openstack/keystone-db-create-gzznk" Oct 07 08:33:58 crc kubenswrapper[5025]: I1007 08:33:58.530693 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vrqk\" (UniqueName: \"kubernetes.io/projected/9a69558c-23d2-4755-95a7-0497a3baec0c-kube-api-access-6vrqk\") pod \"keystone-db-create-gzznk\" (UID: \"9a69558c-23d2-4755-95a7-0497a3baec0c\") " pod="openstack/keystone-db-create-gzznk" Oct 07 08:33:58 crc kubenswrapper[5025]: I1007 08:33:58.551428 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vrqk\" (UniqueName: \"kubernetes.io/projected/9a69558c-23d2-4755-95a7-0497a3baec0c-kube-api-access-6vrqk\") pod \"keystone-db-create-gzznk\" (UID: \"9a69558c-23d2-4755-95a7-0497a3baec0c\") " pod="openstack/keystone-db-create-gzznk" Oct 07 08:33:58 crc kubenswrapper[5025]: I1007 08:33:58.580118 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-gzznk" Oct 07 08:33:58 crc kubenswrapper[5025]: I1007 08:33:58.593179 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-znsxf"] Oct 07 08:33:58 crc kubenswrapper[5025]: I1007 08:33:58.594244 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-znsxf" Oct 07 08:33:58 crc kubenswrapper[5025]: I1007 08:33:58.600683 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-znsxf"] Oct 07 08:33:58 crc kubenswrapper[5025]: I1007 08:33:58.627881 5025 generic.go:334] "Generic (PLEG): container finished" podID="d46577dd-b38b-4b80-ad57-577629e648b8" containerID="17e3123b66e2436655a1e556aca3840177699357ec8d9644216f4e1e376db84e" exitCode=0 Oct 07 08:33:58 crc kubenswrapper[5025]: I1007 08:33:58.627967 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d46577dd-b38b-4b80-ad57-577629e648b8","Type":"ContainerDied","Data":"17e3123b66e2436655a1e556aca3840177699357ec8d9644216f4e1e376db84e"} Oct 07 08:33:58 crc kubenswrapper[5025]: I1007 08:33:58.632723 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vntp8\" (UniqueName: \"kubernetes.io/projected/62224827-cf6b-4de5-b315-92293847f02f-kube-api-access-vntp8\") pod \"placement-db-create-znsxf\" (UID: \"62224827-cf6b-4de5-b315-92293847f02f\") " pod="openstack/placement-db-create-znsxf" Oct 07 08:33:58 crc kubenswrapper[5025]: I1007 08:33:58.644909 5025 generic.go:334] "Generic (PLEG): container finished" podID="e5a4ae5a-0b64-481f-a54d-2263de0eae8e" containerID="058a5de955c114ee2c0505ce1d7a528c8b35245bc45298934de08a29873f5115" exitCode=0 Oct 07 08:33:58 crc kubenswrapper[5025]: I1007 08:33:58.644975 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e5a4ae5a-0b64-481f-a54d-2263de0eae8e","Type":"ContainerDied","Data":"058a5de955c114ee2c0505ce1d7a528c8b35245bc45298934de08a29873f5115"} Oct 07 08:33:58 crc kubenswrapper[5025]: I1007 08:33:58.645885 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"9d7aa421-d9db-4ba6-882a-432d2e8b840f","Type":"ContainerStarted","Data":"06959559e1f7691a3fdc68c518f2fed5c35975781aa36606814617ec5e2bc943"} Oct 07 08:33:58 crc kubenswrapper[5025]: I1007 08:33:58.734025 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vntp8\" (UniqueName: \"kubernetes.io/projected/62224827-cf6b-4de5-b315-92293847f02f-kube-api-access-vntp8\") pod \"placement-db-create-znsxf\" (UID: \"62224827-cf6b-4de5-b315-92293847f02f\") " pod="openstack/placement-db-create-znsxf" Oct 07 08:33:58 crc kubenswrapper[5025]: I1007 08:33:58.755479 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vntp8\" (UniqueName: \"kubernetes.io/projected/62224827-cf6b-4de5-b315-92293847f02f-kube-api-access-vntp8\") pod \"placement-db-create-znsxf\" (UID: \"62224827-cf6b-4de5-b315-92293847f02f\") " pod="openstack/placement-db-create-znsxf" Oct 07 08:33:58 crc kubenswrapper[5025]: I1007 08:33:58.758228 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-znsxf" Oct 07 08:33:59 crc kubenswrapper[5025]: I1007 08:33:59.007157 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-znsxf"] Oct 07 08:33:59 crc kubenswrapper[5025]: I1007 08:33:59.049322 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-gzznk"] Oct 07 08:33:59 crc kubenswrapper[5025]: I1007 08:33:59.068111 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-4544-account-create-5pwmj"] Oct 07 08:33:59 crc kubenswrapper[5025]: I1007 08:33:59.069196 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-4544-account-create-5pwmj" Oct 07 08:33:59 crc kubenswrapper[5025]: I1007 08:33:59.077199 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 07 08:33:59 crc kubenswrapper[5025]: I1007 08:33:59.079485 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-4544-account-create-5pwmj"] Oct 07 08:33:59 crc kubenswrapper[5025]: W1007 08:33:59.200046 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62224827_cf6b_4de5_b315_92293847f02f.slice/crio-cf2a6ea214841482a2d33b7238cf82f8fdc29f194a15e4eee1d8b7b57ba51199 WatchSource:0}: Error finding container cf2a6ea214841482a2d33b7238cf82f8fdc29f194a15e4eee1d8b7b57ba51199: Status 404 returned error can't find the container with id cf2a6ea214841482a2d33b7238cf82f8fdc29f194a15e4eee1d8b7b57ba51199 Oct 07 08:33:59 crc kubenswrapper[5025]: W1007 08:33:59.200520 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a69558c_23d2_4755_95a7_0497a3baec0c.slice/crio-20b7147d88c0ad8437fc5c3870f210aa4cfd1bcd7592f4650a85ddb3e2bd0349 WatchSource:0}: Error finding container 20b7147d88c0ad8437fc5c3870f210aa4cfd1bcd7592f4650a85ddb3e2bd0349: Status 404 returned error can't find the container with id 20b7147d88c0ad8437fc5c3870f210aa4cfd1bcd7592f4650a85ddb3e2bd0349 Oct 07 08:33:59 crc kubenswrapper[5025]: I1007 08:33:59.242371 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wrk6\" (UniqueName: \"kubernetes.io/projected/792101f8-2389-423a-b5a9-f10d2e141f0d-kube-api-access-8wrk6\") pod \"glance-4544-account-create-5pwmj\" (UID: \"792101f8-2389-423a-b5a9-f10d2e141f0d\") " pod="openstack/glance-4544-account-create-5pwmj" Oct 07 08:33:59 crc kubenswrapper[5025]: I1007 08:33:59.344236 5025 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wrk6\" (UniqueName: \"kubernetes.io/projected/792101f8-2389-423a-b5a9-f10d2e141f0d-kube-api-access-8wrk6\") pod \"glance-4544-account-create-5pwmj\" (UID: \"792101f8-2389-423a-b5a9-f10d2e141f0d\") " pod="openstack/glance-4544-account-create-5pwmj" Oct 07 08:33:59 crc kubenswrapper[5025]: I1007 08:33:59.369913 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wrk6\" (UniqueName: \"kubernetes.io/projected/792101f8-2389-423a-b5a9-f10d2e141f0d-kube-api-access-8wrk6\") pod \"glance-4544-account-create-5pwmj\" (UID: \"792101f8-2389-423a-b5a9-f10d2e141f0d\") " pod="openstack/glance-4544-account-create-5pwmj" Oct 07 08:33:59 crc kubenswrapper[5025]: I1007 08:33:59.384706 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-4544-account-create-5pwmj" Oct 07 08:33:59 crc kubenswrapper[5025]: I1007 08:33:59.657372 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e5a4ae5a-0b64-481f-a54d-2263de0eae8e","Type":"ContainerStarted","Data":"36236520dcf8fa30b3e1930eff40a80e74047bab4d5ce7c7754f7977352da9ef"} Oct 07 08:33:59 crc kubenswrapper[5025]: I1007 08:33:59.657960 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 07 08:33:59 crc kubenswrapper[5025]: I1007 08:33:59.659863 5025 generic.go:334] "Generic (PLEG): container finished" podID="9a69558c-23d2-4755-95a7-0497a3baec0c" containerID="6562b6087c3ad21549c75a1f9f5c20930c4a9243c932457cb82207595e4442e2" exitCode=0 Oct 07 08:33:59 crc kubenswrapper[5025]: I1007 08:33:59.659959 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-gzznk" event={"ID":"9a69558c-23d2-4755-95a7-0497a3baec0c","Type":"ContainerDied","Data":"6562b6087c3ad21549c75a1f9f5c20930c4a9243c932457cb82207595e4442e2"} Oct 07 08:33:59 crc 
kubenswrapper[5025]: I1007 08:33:59.660005 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-gzznk" event={"ID":"9a69558c-23d2-4755-95a7-0497a3baec0c","Type":"ContainerStarted","Data":"20b7147d88c0ad8437fc5c3870f210aa4cfd1bcd7592f4650a85ddb3e2bd0349"} Oct 07 08:33:59 crc kubenswrapper[5025]: I1007 08:33:59.662636 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9d7aa421-d9db-4ba6-882a-432d2e8b840f","Type":"ContainerStarted","Data":"4fd99f99ec9c844da88a9f393a6696daa800bfa88f16e03ae533ceadefd9d877"} Oct 07 08:33:59 crc kubenswrapper[5025]: I1007 08:33:59.664888 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d46577dd-b38b-4b80-ad57-577629e648b8","Type":"ContainerStarted","Data":"73c121fe654fa077d5329867a73a7be6e5bdaa3741b8e1893d602e776c660431"} Oct 07 08:33:59 crc kubenswrapper[5025]: I1007 08:33:59.665207 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 07 08:33:59 crc kubenswrapper[5025]: I1007 08:33:59.668374 5025 generic.go:334] "Generic (PLEG): container finished" podID="62224827-cf6b-4de5-b315-92293847f02f" containerID="bba8208e2d1a5187922ed405a4067ff412a5ef5ece681e295db2104ffdac07b4" exitCode=0 Oct 07 08:33:59 crc kubenswrapper[5025]: I1007 08:33:59.668417 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-znsxf" event={"ID":"62224827-cf6b-4de5-b315-92293847f02f","Type":"ContainerDied","Data":"bba8208e2d1a5187922ed405a4067ff412a5ef5ece681e295db2104ffdac07b4"} Oct 07 08:33:59 crc kubenswrapper[5025]: I1007 08:33:59.668439 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-znsxf" event={"ID":"62224827-cf6b-4de5-b315-92293847f02f","Type":"ContainerStarted","Data":"cf2a6ea214841482a2d33b7238cf82f8fdc29f194a15e4eee1d8b7b57ba51199"} Oct 07 08:33:59 crc kubenswrapper[5025]: I1007 
08:33:59.690442 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=44.844244291 podStartE2EDuration="55.690420582s" podCreationTimestamp="2025-10-07 08:33:04 +0000 UTC" firstStartedPulling="2025-10-07 08:33:10.236169524 +0000 UTC m=+997.045483668" lastFinishedPulling="2025-10-07 08:33:21.082345815 +0000 UTC m=+1007.891659959" observedRunningTime="2025-10-07 08:33:59.684807047 +0000 UTC m=+1046.494121191" watchObservedRunningTime="2025-10-07 08:33:59.690420582 +0000 UTC m=+1046.499734736" Oct 07 08:33:59 crc kubenswrapper[5025]: I1007 08:33:59.726645 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=41.373457079 podStartE2EDuration="56.726624262s" podCreationTimestamp="2025-10-07 08:33:03 +0000 UTC" firstStartedPulling="2025-10-07 08:33:05.673919017 +0000 UTC m=+992.483233161" lastFinishedPulling="2025-10-07 08:33:21.0270862 +0000 UTC m=+1007.836400344" observedRunningTime="2025-10-07 08:33:59.722683079 +0000 UTC m=+1046.531997223" watchObservedRunningTime="2025-10-07 08:33:59.726624262 +0000 UTC m=+1046.535938406" Oct 07 08:33:59 crc kubenswrapper[5025]: I1007 08:33:59.829513 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-4544-account-create-5pwmj"] Oct 07 08:34:00 crc kubenswrapper[5025]: I1007 08:34:00.678692 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9d7aa421-d9db-4ba6-882a-432d2e8b840f","Type":"ContainerStarted","Data":"8c0c209e0771ee6bccd63b38c1d1aaf2d92866662d731f3cc5eccf0667f3306a"} Oct 07 08:34:00 crc kubenswrapper[5025]: I1007 08:34:00.679000 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9d7aa421-d9db-4ba6-882a-432d2e8b840f","Type":"ContainerStarted","Data":"9640e93cbab8bc8d914777065ae812b65e462076e84f96dabb2f52c7a65f973a"} Oct 07 08:34:00 crc 
kubenswrapper[5025]: I1007 08:34:00.681728 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-4544-account-create-5pwmj" event={"ID":"792101f8-2389-423a-b5a9-f10d2e141f0d","Type":"ContainerStarted","Data":"fbf155f64e72192c229eb5829876b2b2a52e4defa95588a3abec0cd6525eb31d"} Oct 07 08:34:00 crc kubenswrapper[5025]: I1007 08:34:00.681800 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-4544-account-create-5pwmj" event={"ID":"792101f8-2389-423a-b5a9-f10d2e141f0d","Type":"ContainerStarted","Data":"70deb9ff01a049372593fc2e7a28c2a703ba5383fbe79f324742696c7f648e2e"} Oct 07 08:34:01 crc kubenswrapper[5025]: I1007 08:34:01.085107 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-znsxf" Oct 07 08:34:01 crc kubenswrapper[5025]: I1007 08:34:01.089406 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-gzznk" Oct 07 08:34:01 crc kubenswrapper[5025]: I1007 08:34:01.173647 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vrqk\" (UniqueName: \"kubernetes.io/projected/9a69558c-23d2-4755-95a7-0497a3baec0c-kube-api-access-6vrqk\") pod \"9a69558c-23d2-4755-95a7-0497a3baec0c\" (UID: \"9a69558c-23d2-4755-95a7-0497a3baec0c\") " Oct 07 08:34:01 crc kubenswrapper[5025]: I1007 08:34:01.173750 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vntp8\" (UniqueName: \"kubernetes.io/projected/62224827-cf6b-4de5-b315-92293847f02f-kube-api-access-vntp8\") pod \"62224827-cf6b-4de5-b315-92293847f02f\" (UID: \"62224827-cf6b-4de5-b315-92293847f02f\") " Oct 07 08:34:01 crc kubenswrapper[5025]: I1007 08:34:01.179389 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a69558c-23d2-4755-95a7-0497a3baec0c-kube-api-access-6vrqk" (OuterVolumeSpecName: "kube-api-access-6vrqk") 
pod "9a69558c-23d2-4755-95a7-0497a3baec0c" (UID: "9a69558c-23d2-4755-95a7-0497a3baec0c"). InnerVolumeSpecName "kube-api-access-6vrqk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:34:01 crc kubenswrapper[5025]: I1007 08:34:01.189069 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62224827-cf6b-4de5-b315-92293847f02f-kube-api-access-vntp8" (OuterVolumeSpecName: "kube-api-access-vntp8") pod "62224827-cf6b-4de5-b315-92293847f02f" (UID: "62224827-cf6b-4de5-b315-92293847f02f"). InnerVolumeSpecName "kube-api-access-vntp8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:34:01 crc kubenswrapper[5025]: I1007 08:34:01.276121 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vrqk\" (UniqueName: \"kubernetes.io/projected/9a69558c-23d2-4755-95a7-0497a3baec0c-kube-api-access-6vrqk\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:01 crc kubenswrapper[5025]: I1007 08:34:01.276152 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vntp8\" (UniqueName: \"kubernetes.io/projected/62224827-cf6b-4de5-b315-92293847f02f-kube-api-access-vntp8\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:01 crc kubenswrapper[5025]: I1007 08:34:01.729459 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9d7aa421-d9db-4ba6-882a-432d2e8b840f","Type":"ContainerStarted","Data":"15d39914545a4a069a9ea2c3664be41c34cbfd52c1e69df82087b4f1a1940cdd"} Oct 07 08:34:01 crc kubenswrapper[5025]: I1007 08:34:01.731281 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-znsxf" Oct 07 08:34:01 crc kubenswrapper[5025]: I1007 08:34:01.732002 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-znsxf" event={"ID":"62224827-cf6b-4de5-b315-92293847f02f","Type":"ContainerDied","Data":"cf2a6ea214841482a2d33b7238cf82f8fdc29f194a15e4eee1d8b7b57ba51199"} Oct 07 08:34:01 crc kubenswrapper[5025]: I1007 08:34:01.732602 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf2a6ea214841482a2d33b7238cf82f8fdc29f194a15e4eee1d8b7b57ba51199" Oct 07 08:34:01 crc kubenswrapper[5025]: I1007 08:34:01.733401 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-gzznk" Oct 07 08:34:01 crc kubenswrapper[5025]: I1007 08:34:01.733386 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-gzznk" event={"ID":"9a69558c-23d2-4755-95a7-0497a3baec0c","Type":"ContainerDied","Data":"20b7147d88c0ad8437fc5c3870f210aa4cfd1bcd7592f4650a85ddb3e2bd0349"} Oct 07 08:34:01 crc kubenswrapper[5025]: I1007 08:34:01.733623 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20b7147d88c0ad8437fc5c3870f210aa4cfd1bcd7592f4650a85ddb3e2bd0349" Oct 07 08:34:01 crc kubenswrapper[5025]: I1007 08:34:01.768186 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-4544-account-create-5pwmj" podStartSLOduration=2.768167036 podStartE2EDuration="2.768167036s" podCreationTimestamp="2025-10-07 08:33:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:34:01.766192424 +0000 UTC m=+1048.575506568" watchObservedRunningTime="2025-10-07 08:34:01.768167036 +0000 UTC m=+1048.577481180" Oct 07 08:34:02 crc kubenswrapper[5025]: I1007 08:34:02.742500 5025 generic.go:334] "Generic (PLEG): container 
finished" podID="792101f8-2389-423a-b5a9-f10d2e141f0d" containerID="fbf155f64e72192c229eb5829876b2b2a52e4defa95588a3abec0cd6525eb31d" exitCode=0 Oct 07 08:34:02 crc kubenswrapper[5025]: I1007 08:34:02.742619 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-4544-account-create-5pwmj" event={"ID":"792101f8-2389-423a-b5a9-f10d2e141f0d","Type":"ContainerDied","Data":"fbf155f64e72192c229eb5829876b2b2a52e4defa95588a3abec0cd6525eb31d"} Oct 07 08:34:04 crc kubenswrapper[5025]: I1007 08:34:04.062688 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-4544-account-create-5pwmj" Oct 07 08:34:04 crc kubenswrapper[5025]: I1007 08:34:04.122066 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wrk6\" (UniqueName: \"kubernetes.io/projected/792101f8-2389-423a-b5a9-f10d2e141f0d-kube-api-access-8wrk6\") pod \"792101f8-2389-423a-b5a9-f10d2e141f0d\" (UID: \"792101f8-2389-423a-b5a9-f10d2e141f0d\") " Oct 07 08:34:04 crc kubenswrapper[5025]: I1007 08:34:04.131741 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/792101f8-2389-423a-b5a9-f10d2e141f0d-kube-api-access-8wrk6" (OuterVolumeSpecName: "kube-api-access-8wrk6") pod "792101f8-2389-423a-b5a9-f10d2e141f0d" (UID: "792101f8-2389-423a-b5a9-f10d2e141f0d"). InnerVolumeSpecName "kube-api-access-8wrk6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:34:04 crc kubenswrapper[5025]: I1007 08:34:04.224633 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wrk6\" (UniqueName: \"kubernetes.io/projected/792101f8-2389-423a-b5a9-f10d2e141f0d-kube-api-access-8wrk6\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:04 crc kubenswrapper[5025]: I1007 08:34:04.427689 5025 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-bxmbm" podUID="d6889a88-68ef-4bf3-9e7e-78c6d84785ae" containerName="ovn-controller" probeResult="failure" output=< Oct 07 08:34:04 crc kubenswrapper[5025]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 07 08:34:04 crc kubenswrapper[5025]: > Oct 07 08:34:04 crc kubenswrapper[5025]: I1007 08:34:04.503111 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-lms8w" Oct 07 08:34:04 crc kubenswrapper[5025]: I1007 08:34:04.503176 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-lms8w" Oct 07 08:34:04 crc kubenswrapper[5025]: I1007 08:34:04.741650 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-bxmbm-config-rshcn"] Oct 07 08:34:04 crc kubenswrapper[5025]: E1007 08:34:04.742906 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a69558c-23d2-4755-95a7-0497a3baec0c" containerName="mariadb-database-create" Oct 07 08:34:04 crc kubenswrapper[5025]: I1007 08:34:04.742931 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a69558c-23d2-4755-95a7-0497a3baec0c" containerName="mariadb-database-create" Oct 07 08:34:04 crc kubenswrapper[5025]: E1007 08:34:04.742947 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62224827-cf6b-4de5-b315-92293847f02f" containerName="mariadb-database-create" Oct 07 08:34:04 crc kubenswrapper[5025]: I1007 08:34:04.742955 5025 
state_mem.go:107] "Deleted CPUSet assignment" podUID="62224827-cf6b-4de5-b315-92293847f02f" containerName="mariadb-database-create" Oct 07 08:34:04 crc kubenswrapper[5025]: E1007 08:34:04.742981 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="792101f8-2389-423a-b5a9-f10d2e141f0d" containerName="mariadb-account-create" Oct 07 08:34:04 crc kubenswrapper[5025]: I1007 08:34:04.742992 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="792101f8-2389-423a-b5a9-f10d2e141f0d" containerName="mariadb-account-create" Oct 07 08:34:04 crc kubenswrapper[5025]: I1007 08:34:04.757266 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="62224827-cf6b-4de5-b315-92293847f02f" containerName="mariadb-database-create" Oct 07 08:34:04 crc kubenswrapper[5025]: I1007 08:34:04.757348 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="792101f8-2389-423a-b5a9-f10d2e141f0d" containerName="mariadb-account-create" Oct 07 08:34:04 crc kubenswrapper[5025]: I1007 08:34:04.757389 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a69558c-23d2-4755-95a7-0497a3baec0c" containerName="mariadb-database-create" Oct 07 08:34:04 crc kubenswrapper[5025]: I1007 08:34:04.758631 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-bxmbm-config-rshcn" Oct 07 08:34:04 crc kubenswrapper[5025]: I1007 08:34:04.766835 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 07 08:34:04 crc kubenswrapper[5025]: I1007 08:34:04.777481 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-bxmbm-config-rshcn"] Oct 07 08:34:04 crc kubenswrapper[5025]: I1007 08:34:04.822654 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9d7aa421-d9db-4ba6-882a-432d2e8b840f","Type":"ContainerStarted","Data":"76ba63d82ad08b0153c89278b305db830f608f9bed7c534bb409b72c53c3a06c"} Oct 07 08:34:04 crc kubenswrapper[5025]: I1007 08:34:04.822715 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9d7aa421-d9db-4ba6-882a-432d2e8b840f","Type":"ContainerStarted","Data":"266ae7663ff2f3c8ca4fdece1db841dc278426b4fca4509d0d0233a384d325dd"} Oct 07 08:34:04 crc kubenswrapper[5025]: I1007 08:34:04.822729 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9d7aa421-d9db-4ba6-882a-432d2e8b840f","Type":"ContainerStarted","Data":"49443cb71b40bc6abe49e9f54690d77bd06b9d3970bc17a852e8a4e30f9ba6b7"} Oct 07 08:34:04 crc kubenswrapper[5025]: I1007 08:34:04.822739 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9d7aa421-d9db-4ba6-882a-432d2e8b840f","Type":"ContainerStarted","Data":"c93808ebcf734eeb951e224e6dd45872e982d63f46ec041bfc4d33193be2b705"} Oct 07 08:34:04 crc kubenswrapper[5025]: I1007 08:34:04.830665 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-4544-account-create-5pwmj" event={"ID":"792101f8-2389-423a-b5a9-f10d2e141f0d","Type":"ContainerDied","Data":"70deb9ff01a049372593fc2e7a28c2a703ba5383fbe79f324742696c7f648e2e"} Oct 07 08:34:04 crc kubenswrapper[5025]: I1007 
08:34:04.830728 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70deb9ff01a049372593fc2e7a28c2a703ba5383fbe79f324742696c7f648e2e" Oct 07 08:34:04 crc kubenswrapper[5025]: I1007 08:34:04.830834 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-4544-account-create-5pwmj" Oct 07 08:34:04 crc kubenswrapper[5025]: I1007 08:34:04.841166 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm45h\" (UniqueName: \"kubernetes.io/projected/8c859244-5504-4860-b857-42183b22f02a-kube-api-access-mm45h\") pod \"ovn-controller-bxmbm-config-rshcn\" (UID: \"8c859244-5504-4860-b857-42183b22f02a\") " pod="openstack/ovn-controller-bxmbm-config-rshcn" Oct 07 08:34:04 crc kubenswrapper[5025]: I1007 08:34:04.841244 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8c859244-5504-4860-b857-42183b22f02a-additional-scripts\") pod \"ovn-controller-bxmbm-config-rshcn\" (UID: \"8c859244-5504-4860-b857-42183b22f02a\") " pod="openstack/ovn-controller-bxmbm-config-rshcn" Oct 07 08:34:04 crc kubenswrapper[5025]: I1007 08:34:04.841280 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c859244-5504-4860-b857-42183b22f02a-scripts\") pod \"ovn-controller-bxmbm-config-rshcn\" (UID: \"8c859244-5504-4860-b857-42183b22f02a\") " pod="openstack/ovn-controller-bxmbm-config-rshcn" Oct 07 08:34:04 crc kubenswrapper[5025]: I1007 08:34:04.841678 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8c859244-5504-4860-b857-42183b22f02a-var-log-ovn\") pod \"ovn-controller-bxmbm-config-rshcn\" (UID: \"8c859244-5504-4860-b857-42183b22f02a\") " 
pod="openstack/ovn-controller-bxmbm-config-rshcn" Oct 07 08:34:04 crc kubenswrapper[5025]: I1007 08:34:04.841914 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8c859244-5504-4860-b857-42183b22f02a-var-run-ovn\") pod \"ovn-controller-bxmbm-config-rshcn\" (UID: \"8c859244-5504-4860-b857-42183b22f02a\") " pod="openstack/ovn-controller-bxmbm-config-rshcn" Oct 07 08:34:04 crc kubenswrapper[5025]: I1007 08:34:04.842309 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8c859244-5504-4860-b857-42183b22f02a-var-run\") pod \"ovn-controller-bxmbm-config-rshcn\" (UID: \"8c859244-5504-4860-b857-42183b22f02a\") " pod="openstack/ovn-controller-bxmbm-config-rshcn" Oct 07 08:34:04 crc kubenswrapper[5025]: I1007 08:34:04.943305 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8c859244-5504-4860-b857-42183b22f02a-var-run\") pod \"ovn-controller-bxmbm-config-rshcn\" (UID: \"8c859244-5504-4860-b857-42183b22f02a\") " pod="openstack/ovn-controller-bxmbm-config-rshcn" Oct 07 08:34:04 crc kubenswrapper[5025]: I1007 08:34:04.943364 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm45h\" (UniqueName: \"kubernetes.io/projected/8c859244-5504-4860-b857-42183b22f02a-kube-api-access-mm45h\") pod \"ovn-controller-bxmbm-config-rshcn\" (UID: \"8c859244-5504-4860-b857-42183b22f02a\") " pod="openstack/ovn-controller-bxmbm-config-rshcn" Oct 07 08:34:04 crc kubenswrapper[5025]: I1007 08:34:04.943388 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8c859244-5504-4860-b857-42183b22f02a-additional-scripts\") pod \"ovn-controller-bxmbm-config-rshcn\" (UID: 
\"8c859244-5504-4860-b857-42183b22f02a\") " pod="openstack/ovn-controller-bxmbm-config-rshcn" Oct 07 08:34:04 crc kubenswrapper[5025]: I1007 08:34:04.943407 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c859244-5504-4860-b857-42183b22f02a-scripts\") pod \"ovn-controller-bxmbm-config-rshcn\" (UID: \"8c859244-5504-4860-b857-42183b22f02a\") " pod="openstack/ovn-controller-bxmbm-config-rshcn" Oct 07 08:34:04 crc kubenswrapper[5025]: I1007 08:34:04.943463 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8c859244-5504-4860-b857-42183b22f02a-var-log-ovn\") pod \"ovn-controller-bxmbm-config-rshcn\" (UID: \"8c859244-5504-4860-b857-42183b22f02a\") " pod="openstack/ovn-controller-bxmbm-config-rshcn" Oct 07 08:34:04 crc kubenswrapper[5025]: I1007 08:34:04.943489 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8c859244-5504-4860-b857-42183b22f02a-var-run-ovn\") pod \"ovn-controller-bxmbm-config-rshcn\" (UID: \"8c859244-5504-4860-b857-42183b22f02a\") " pod="openstack/ovn-controller-bxmbm-config-rshcn" Oct 07 08:34:04 crc kubenswrapper[5025]: I1007 08:34:04.943696 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8c859244-5504-4860-b857-42183b22f02a-var-run-ovn\") pod \"ovn-controller-bxmbm-config-rshcn\" (UID: \"8c859244-5504-4860-b857-42183b22f02a\") " pod="openstack/ovn-controller-bxmbm-config-rshcn" Oct 07 08:34:04 crc kubenswrapper[5025]: I1007 08:34:04.943705 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8c859244-5504-4860-b857-42183b22f02a-var-run\") pod \"ovn-controller-bxmbm-config-rshcn\" (UID: \"8c859244-5504-4860-b857-42183b22f02a\") " 
pod="openstack/ovn-controller-bxmbm-config-rshcn" Oct 07 08:34:04 crc kubenswrapper[5025]: I1007 08:34:04.943752 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8c859244-5504-4860-b857-42183b22f02a-var-log-ovn\") pod \"ovn-controller-bxmbm-config-rshcn\" (UID: \"8c859244-5504-4860-b857-42183b22f02a\") " pod="openstack/ovn-controller-bxmbm-config-rshcn" Oct 07 08:34:04 crc kubenswrapper[5025]: I1007 08:34:04.944473 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8c859244-5504-4860-b857-42183b22f02a-additional-scripts\") pod \"ovn-controller-bxmbm-config-rshcn\" (UID: \"8c859244-5504-4860-b857-42183b22f02a\") " pod="openstack/ovn-controller-bxmbm-config-rshcn" Oct 07 08:34:04 crc kubenswrapper[5025]: I1007 08:34:04.945677 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c859244-5504-4860-b857-42183b22f02a-scripts\") pod \"ovn-controller-bxmbm-config-rshcn\" (UID: \"8c859244-5504-4860-b857-42183b22f02a\") " pod="openstack/ovn-controller-bxmbm-config-rshcn" Oct 07 08:34:04 crc kubenswrapper[5025]: I1007 08:34:04.968749 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm45h\" (UniqueName: \"kubernetes.io/projected/8c859244-5504-4860-b857-42183b22f02a-kube-api-access-mm45h\") pod \"ovn-controller-bxmbm-config-rshcn\" (UID: \"8c859244-5504-4860-b857-42183b22f02a\") " pod="openstack/ovn-controller-bxmbm-config-rshcn" Oct 07 08:34:05 crc kubenswrapper[5025]: I1007 08:34:05.101673 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-bxmbm-config-rshcn" Oct 07 08:34:05 crc kubenswrapper[5025]: I1007 08:34:05.425749 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-bxmbm-config-rshcn"] Oct 07 08:34:05 crc kubenswrapper[5025]: I1007 08:34:05.838607 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-bxmbm-config-rshcn" event={"ID":"8c859244-5504-4860-b857-42183b22f02a","Type":"ContainerStarted","Data":"4ae8406675d0d507373d46e28498fc2f978edf6ad339490e4cd0940feb93a36a"} Oct 07 08:34:05 crc kubenswrapper[5025]: I1007 08:34:05.838903 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-bxmbm-config-rshcn" event={"ID":"8c859244-5504-4860-b857-42183b22f02a","Type":"ContainerStarted","Data":"bd6670b9885c57ec219e025f3b8198720d7c0c4674abd4bd8bd5131cbb72f755"} Oct 07 08:34:05 crc kubenswrapper[5025]: I1007 08:34:05.855383 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-bxmbm-config-rshcn" podStartSLOduration=1.855363164 podStartE2EDuration="1.855363164s" podCreationTimestamp="2025-10-07 08:34:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:34:05.853165435 +0000 UTC m=+1052.662479599" watchObservedRunningTime="2025-10-07 08:34:05.855363164 +0000 UTC m=+1052.664677308" Oct 07 08:34:06 crc kubenswrapper[5025]: I1007 08:34:06.850570 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9d7aa421-d9db-4ba6-882a-432d2e8b840f","Type":"ContainerStarted","Data":"aff21227028210b4f08d1a87f2fe84e1d162230abc5f64cc544641bfc6120b15"} Oct 07 08:34:06 crc kubenswrapper[5025]: I1007 08:34:06.850915 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"9d7aa421-d9db-4ba6-882a-432d2e8b840f","Type":"ContainerStarted","Data":"5f05c90b85b312fbeb24954f3c57716c7cd5932e4f7ceedff1daaffd861ae631"} Oct 07 08:34:06 crc kubenswrapper[5025]: I1007 08:34:06.852398 5025 generic.go:334] "Generic (PLEG): container finished" podID="8c859244-5504-4860-b857-42183b22f02a" containerID="4ae8406675d0d507373d46e28498fc2f978edf6ad339490e4cd0940feb93a36a" exitCode=0 Oct 07 08:34:06 crc kubenswrapper[5025]: I1007 08:34:06.852426 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-bxmbm-config-rshcn" event={"ID":"8c859244-5504-4860-b857-42183b22f02a","Type":"ContainerDied","Data":"4ae8406675d0d507373d46e28498fc2f978edf6ad339490e4cd0940feb93a36a"} Oct 07 08:34:07 crc kubenswrapper[5025]: I1007 08:34:07.869888 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9d7aa421-d9db-4ba6-882a-432d2e8b840f","Type":"ContainerStarted","Data":"9d3a9df9333411c34fcd1b1b33c246c5dc54f969d8f7b94b07b7feea250b14fb"} Oct 07 08:34:07 crc kubenswrapper[5025]: I1007 08:34:07.870309 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9d7aa421-d9db-4ba6-882a-432d2e8b840f","Type":"ContainerStarted","Data":"a89cc0bbd617b24a619834b36ce895c374cd0e5df0288f882c9728d9252cd609"} Oct 07 08:34:07 crc kubenswrapper[5025]: I1007 08:34:07.870339 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9d7aa421-d9db-4ba6-882a-432d2e8b840f","Type":"ContainerStarted","Data":"17e6f1244bd23756c57aa4df7196524e152e5a068f7ec5464a9d47135f222dc7"} Oct 07 08:34:07 crc kubenswrapper[5025]: I1007 08:34:07.870364 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9d7aa421-d9db-4ba6-882a-432d2e8b840f","Type":"ContainerStarted","Data":"99ce4f2eebf24b560f31691c8f1a2da1a5df6b8c8ed3d6626051ed204aea839a"} Oct 07 08:34:08 crc kubenswrapper[5025]: I1007 08:34:08.191703 
5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-bxmbm-config-rshcn" Oct 07 08:34:08 crc kubenswrapper[5025]: I1007 08:34:08.305851 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c859244-5504-4860-b857-42183b22f02a-scripts\") pod \"8c859244-5504-4860-b857-42183b22f02a\" (UID: \"8c859244-5504-4860-b857-42183b22f02a\") " Oct 07 08:34:08 crc kubenswrapper[5025]: I1007 08:34:08.305941 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8c859244-5504-4860-b857-42183b22f02a-additional-scripts\") pod \"8c859244-5504-4860-b857-42183b22f02a\" (UID: \"8c859244-5504-4860-b857-42183b22f02a\") " Oct 07 08:34:08 crc kubenswrapper[5025]: I1007 08:34:08.305975 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8c859244-5504-4860-b857-42183b22f02a-var-run-ovn\") pod \"8c859244-5504-4860-b857-42183b22f02a\" (UID: \"8c859244-5504-4860-b857-42183b22f02a\") " Oct 07 08:34:08 crc kubenswrapper[5025]: I1007 08:34:08.306026 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mm45h\" (UniqueName: \"kubernetes.io/projected/8c859244-5504-4860-b857-42183b22f02a-kube-api-access-mm45h\") pod \"8c859244-5504-4860-b857-42183b22f02a\" (UID: \"8c859244-5504-4860-b857-42183b22f02a\") " Oct 07 08:34:08 crc kubenswrapper[5025]: I1007 08:34:08.306066 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8c859244-5504-4860-b857-42183b22f02a-var-run\") pod \"8c859244-5504-4860-b857-42183b22f02a\" (UID: \"8c859244-5504-4860-b857-42183b22f02a\") " Oct 07 08:34:08 crc kubenswrapper[5025]: I1007 08:34:08.306083 5025 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8c859244-5504-4860-b857-42183b22f02a-var-log-ovn\") pod \"8c859244-5504-4860-b857-42183b22f02a\" (UID: \"8c859244-5504-4860-b857-42183b22f02a\") " Oct 07 08:34:08 crc kubenswrapper[5025]: I1007 08:34:08.306452 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8c859244-5504-4860-b857-42183b22f02a-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "8c859244-5504-4860-b857-42183b22f02a" (UID: "8c859244-5504-4860-b857-42183b22f02a"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 08:34:08 crc kubenswrapper[5025]: I1007 08:34:08.307516 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c859244-5504-4860-b857-42183b22f02a-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "8c859244-5504-4860-b857-42183b22f02a" (UID: "8c859244-5504-4860-b857-42183b22f02a"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:34:08 crc kubenswrapper[5025]: I1007 08:34:08.307546 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c859244-5504-4860-b857-42183b22f02a-scripts" (OuterVolumeSpecName: "scripts") pod "8c859244-5504-4860-b857-42183b22f02a" (UID: "8c859244-5504-4860-b857-42183b22f02a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:34:08 crc kubenswrapper[5025]: I1007 08:34:08.307620 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8c859244-5504-4860-b857-42183b22f02a-var-run" (OuterVolumeSpecName: "var-run") pod "8c859244-5504-4860-b857-42183b22f02a" (UID: "8c859244-5504-4860-b857-42183b22f02a"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 08:34:08 crc kubenswrapper[5025]: I1007 08:34:08.307653 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8c859244-5504-4860-b857-42183b22f02a-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "8c859244-5504-4860-b857-42183b22f02a" (UID: "8c859244-5504-4860-b857-42183b22f02a"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 08:34:08 crc kubenswrapper[5025]: I1007 08:34:08.312752 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c859244-5504-4860-b857-42183b22f02a-kube-api-access-mm45h" (OuterVolumeSpecName: "kube-api-access-mm45h") pod "8c859244-5504-4860-b857-42183b22f02a" (UID: "8c859244-5504-4860-b857-42183b22f02a"). InnerVolumeSpecName "kube-api-access-mm45h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:34:08 crc kubenswrapper[5025]: I1007 08:34:08.408546 5025 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8c859244-5504-4860-b857-42183b22f02a-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:08 crc kubenswrapper[5025]: I1007 08:34:08.408605 5025 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8c859244-5504-4860-b857-42183b22f02a-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:08 crc kubenswrapper[5025]: I1007 08:34:08.408618 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mm45h\" (UniqueName: \"kubernetes.io/projected/8c859244-5504-4860-b857-42183b22f02a-kube-api-access-mm45h\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:08 crc kubenswrapper[5025]: I1007 08:34:08.408630 5025 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8c859244-5504-4860-b857-42183b22f02a-var-run\") on 
node \"crc\" DevicePath \"\"" Oct 07 08:34:08 crc kubenswrapper[5025]: I1007 08:34:08.408644 5025 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8c859244-5504-4860-b857-42183b22f02a-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:08 crc kubenswrapper[5025]: I1007 08:34:08.408653 5025 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c859244-5504-4860-b857-42183b22f02a-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:08 crc kubenswrapper[5025]: I1007 08:34:08.412937 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-1b46-account-create-lgjrd"] Oct 07 08:34:08 crc kubenswrapper[5025]: E1007 08:34:08.414013 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c859244-5504-4860-b857-42183b22f02a" containerName="ovn-config" Oct 07 08:34:08 crc kubenswrapper[5025]: I1007 08:34:08.414037 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c859244-5504-4860-b857-42183b22f02a" containerName="ovn-config" Oct 07 08:34:08 crc kubenswrapper[5025]: I1007 08:34:08.414230 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c859244-5504-4860-b857-42183b22f02a" containerName="ovn-config" Oct 07 08:34:08 crc kubenswrapper[5025]: I1007 08:34:08.415104 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-1b46-account-create-lgjrd" Oct 07 08:34:08 crc kubenswrapper[5025]: I1007 08:34:08.420339 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-1b46-account-create-lgjrd"] Oct 07 08:34:08 crc kubenswrapper[5025]: I1007 08:34:08.421921 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 07 08:34:08 crc kubenswrapper[5025]: I1007 08:34:08.509948 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb67s\" (UniqueName: \"kubernetes.io/projected/e38deef2-87ec-483e-84dd-e1ef8822202b-kube-api-access-cb67s\") pod \"keystone-1b46-account-create-lgjrd\" (UID: \"e38deef2-87ec-483e-84dd-e1ef8822202b\") " pod="openstack/keystone-1b46-account-create-lgjrd" Oct 07 08:34:08 crc kubenswrapper[5025]: I1007 08:34:08.611967 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cb67s\" (UniqueName: \"kubernetes.io/projected/e38deef2-87ec-483e-84dd-e1ef8822202b-kube-api-access-cb67s\") pod \"keystone-1b46-account-create-lgjrd\" (UID: \"e38deef2-87ec-483e-84dd-e1ef8822202b\") " pod="openstack/keystone-1b46-account-create-lgjrd" Oct 07 08:34:08 crc kubenswrapper[5025]: I1007 08:34:08.627655 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb67s\" (UniqueName: \"kubernetes.io/projected/e38deef2-87ec-483e-84dd-e1ef8822202b-kube-api-access-cb67s\") pod \"keystone-1b46-account-create-lgjrd\" (UID: \"e38deef2-87ec-483e-84dd-e1ef8822202b\") " pod="openstack/keystone-1b46-account-create-lgjrd" Oct 07 08:34:08 crc kubenswrapper[5025]: I1007 08:34:08.712197 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5528-account-create-nswld"] Oct 07 08:34:08 crc kubenswrapper[5025]: I1007 08:34:08.713721 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5528-account-create-nswld" Oct 07 08:34:08 crc kubenswrapper[5025]: I1007 08:34:08.720709 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 07 08:34:08 crc kubenswrapper[5025]: I1007 08:34:08.724770 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5528-account-create-nswld"] Oct 07 08:34:08 crc kubenswrapper[5025]: I1007 08:34:08.770680 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-1b46-account-create-lgjrd" Oct 07 08:34:08 crc kubenswrapper[5025]: I1007 08:34:08.814043 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sf4f\" (UniqueName: \"kubernetes.io/projected/f3d10c21-f6a2-43b1-98da-e882d1320a5f-kube-api-access-2sf4f\") pod \"placement-5528-account-create-nswld\" (UID: \"f3d10c21-f6a2-43b1-98da-e882d1320a5f\") " pod="openstack/placement-5528-account-create-nswld" Oct 07 08:34:08 crc kubenswrapper[5025]: I1007 08:34:08.889345 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-bxmbm-config-rshcn" event={"ID":"8c859244-5504-4860-b857-42183b22f02a","Type":"ContainerDied","Data":"bd6670b9885c57ec219e025f3b8198720d7c0c4674abd4bd8bd5131cbb72f755"} Oct 07 08:34:08 crc kubenswrapper[5025]: I1007 08:34:08.889733 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd6670b9885c57ec219e025f3b8198720d7c0c4674abd4bd8bd5131cbb72f755" Oct 07 08:34:08 crc kubenswrapper[5025]: I1007 08:34:08.889763 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-bxmbm-config-rshcn" Oct 07 08:34:08 crc kubenswrapper[5025]: I1007 08:34:08.913140 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9d7aa421-d9db-4ba6-882a-432d2e8b840f","Type":"ContainerStarted","Data":"2af861f7f90c2cae8a8ce332baaee8d44d3ff01b9e44974ad1dfe79434f2684b"} Oct 07 08:34:08 crc kubenswrapper[5025]: I1007 08:34:08.915292 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sf4f\" (UniqueName: \"kubernetes.io/projected/f3d10c21-f6a2-43b1-98da-e882d1320a5f-kube-api-access-2sf4f\") pod \"placement-5528-account-create-nswld\" (UID: \"f3d10c21-f6a2-43b1-98da-e882d1320a5f\") " pod="openstack/placement-5528-account-create-nswld" Oct 07 08:34:08 crc kubenswrapper[5025]: I1007 08:34:08.938879 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sf4f\" (UniqueName: \"kubernetes.io/projected/f3d10c21-f6a2-43b1-98da-e882d1320a5f-kube-api-access-2sf4f\") pod \"placement-5528-account-create-nswld\" (UID: \"f3d10c21-f6a2-43b1-98da-e882d1320a5f\") " pod="openstack/placement-5528-account-create-nswld" Oct 07 08:34:08 crc kubenswrapper[5025]: I1007 08:34:08.963750 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.800359307 podStartE2EDuration="28.963727078s" podCreationTimestamp="2025-10-07 08:33:40 +0000 UTC" firstStartedPulling="2025-10-07 08:33:57.974205196 +0000 UTC m=+1044.783519340" lastFinishedPulling="2025-10-07 08:34:06.137572967 +0000 UTC m=+1052.946887111" observedRunningTime="2025-10-07 08:34:08.958661689 +0000 UTC m=+1055.767975853" watchObservedRunningTime="2025-10-07 08:34:08.963727078 +0000 UTC m=+1055.773041222" Oct 07 08:34:08 crc kubenswrapper[5025]: I1007 08:34:08.980958 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-bxmbm-config-rshcn"] Oct 07 
08:34:08 crc kubenswrapper[5025]: I1007 08:34:08.996326 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-bxmbm-config-rshcn"] Oct 07 08:34:09 crc kubenswrapper[5025]: E1007 08:34:09.025073 5025 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c859244_5504_4860_b857_42183b22f02a.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c859244_5504_4860_b857_42183b22f02a.slice/crio-bd6670b9885c57ec219e025f3b8198720d7c0c4674abd4bd8bd5131cbb72f755\": RecentStats: unable to find data in memory cache]" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.063441 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-bxmbm-config-mwgl9"] Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.066085 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-bxmbm-config-mwgl9" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.072523 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.075286 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-bxmbm-config-mwgl9"] Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.087956 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5528-account-create-nswld" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.121343 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f-var-log-ovn\") pod \"ovn-controller-bxmbm-config-mwgl9\" (UID: \"bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f\") " pod="openstack/ovn-controller-bxmbm-config-mwgl9" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.121405 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f-additional-scripts\") pod \"ovn-controller-bxmbm-config-mwgl9\" (UID: \"bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f\") " pod="openstack/ovn-controller-bxmbm-config-mwgl9" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.121455 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f-scripts\") pod \"ovn-controller-bxmbm-config-mwgl9\" (UID: \"bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f\") " pod="openstack/ovn-controller-bxmbm-config-mwgl9" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.121495 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f-var-run-ovn\") pod \"ovn-controller-bxmbm-config-mwgl9\" (UID: \"bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f\") " pod="openstack/ovn-controller-bxmbm-config-mwgl9" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.121523 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f-var-run\") pod 
\"ovn-controller-bxmbm-config-mwgl9\" (UID: \"bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f\") " pod="openstack/ovn-controller-bxmbm-config-mwgl9" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.121572 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65n9x\" (UniqueName: \"kubernetes.io/projected/bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f-kube-api-access-65n9x\") pod \"ovn-controller-bxmbm-config-mwgl9\" (UID: \"bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f\") " pod="openstack/ovn-controller-bxmbm-config-mwgl9" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.223756 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f-scripts\") pod \"ovn-controller-bxmbm-config-mwgl9\" (UID: \"bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f\") " pod="openstack/ovn-controller-bxmbm-config-mwgl9" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.224000 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f-var-run-ovn\") pod \"ovn-controller-bxmbm-config-mwgl9\" (UID: \"bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f\") " pod="openstack/ovn-controller-bxmbm-config-mwgl9" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.224039 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f-var-run\") pod \"ovn-controller-bxmbm-config-mwgl9\" (UID: \"bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f\") " pod="openstack/ovn-controller-bxmbm-config-mwgl9" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.224072 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65n9x\" (UniqueName: 
\"kubernetes.io/projected/bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f-kube-api-access-65n9x\") pod \"ovn-controller-bxmbm-config-mwgl9\" (UID: \"bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f\") " pod="openstack/ovn-controller-bxmbm-config-mwgl9" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.224187 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f-var-log-ovn\") pod \"ovn-controller-bxmbm-config-mwgl9\" (UID: \"bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f\") " pod="openstack/ovn-controller-bxmbm-config-mwgl9" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.224211 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f-additional-scripts\") pod \"ovn-controller-bxmbm-config-mwgl9\" (UID: \"bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f\") " pod="openstack/ovn-controller-bxmbm-config-mwgl9" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.225404 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f-additional-scripts\") pod \"ovn-controller-bxmbm-config-mwgl9\" (UID: \"bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f\") " pod="openstack/ovn-controller-bxmbm-config-mwgl9" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.229766 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f-var-run\") pod \"ovn-controller-bxmbm-config-mwgl9\" (UID: \"bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f\") " pod="openstack/ovn-controller-bxmbm-config-mwgl9" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.229864 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f-var-run-ovn\") pod \"ovn-controller-bxmbm-config-mwgl9\" (UID: \"bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f\") " pod="openstack/ovn-controller-bxmbm-config-mwgl9" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.229922 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f-var-log-ovn\") pod \"ovn-controller-bxmbm-config-mwgl9\" (UID: \"bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f\") " pod="openstack/ovn-controller-bxmbm-config-mwgl9" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.231036 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f-scripts\") pod \"ovn-controller-bxmbm-config-mwgl9\" (UID: \"bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f\") " pod="openstack/ovn-controller-bxmbm-config-mwgl9" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.245634 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-cr52x"] Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.252467 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-cr52x" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.257312 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.266818 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-cr52x"] Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.278651 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65n9x\" (UniqueName: \"kubernetes.io/projected/bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f-kube-api-access-65n9x\") pod \"ovn-controller-bxmbm-config-mwgl9\" (UID: \"bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f\") " pod="openstack/ovn-controller-bxmbm-config-mwgl9" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.302045 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-1b46-account-create-lgjrd"] Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.344382 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d3ddc8c0-beba-4616-b90e-0c8384cfd4f9-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-cr52x\" (UID: \"d3ddc8c0-beba-4616-b90e-0c8384cfd4f9\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-cr52x" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.344444 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfhnm\" (UniqueName: \"kubernetes.io/projected/d3ddc8c0-beba-4616-b90e-0c8384cfd4f9-kube-api-access-bfhnm\") pod \"dnsmasq-dns-6d5b6d6b67-cr52x\" (UID: \"d3ddc8c0-beba-4616-b90e-0c8384cfd4f9\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-cr52x" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.344487 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/d3ddc8c0-beba-4616-b90e-0c8384cfd4f9-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-cr52x\" (UID: \"d3ddc8c0-beba-4616-b90e-0c8384cfd4f9\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-cr52x" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.344559 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d3ddc8c0-beba-4616-b90e-0c8384cfd4f9-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-cr52x\" (UID: \"d3ddc8c0-beba-4616-b90e-0c8384cfd4f9\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-cr52x" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.344695 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3ddc8c0-beba-4616-b90e-0c8384cfd4f9-config\") pod \"dnsmasq-dns-6d5b6d6b67-cr52x\" (UID: \"d3ddc8c0-beba-4616-b90e-0c8384cfd4f9\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-cr52x" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.344736 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d3ddc8c0-beba-4616-b90e-0c8384cfd4f9-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-cr52x\" (UID: \"d3ddc8c0-beba-4616-b90e-0c8384cfd4f9\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-cr52x" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.388895 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-bxmbm-config-mwgl9" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.436906 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-gdh6c"] Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.437855 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-gdh6c" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.443077 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-qxgpv" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.443341 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.448616 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3ddc8c0-beba-4616-b90e-0c8384cfd4f9-config\") pod \"dnsmasq-dns-6d5b6d6b67-cr52x\" (UID: \"d3ddc8c0-beba-4616-b90e-0c8384cfd4f9\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-cr52x" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.448661 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d3ddc8c0-beba-4616-b90e-0c8384cfd4f9-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-cr52x\" (UID: \"d3ddc8c0-beba-4616-b90e-0c8384cfd4f9\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-cr52x" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.448693 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d3ddc8c0-beba-4616-b90e-0c8384cfd4f9-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-cr52x\" (UID: \"d3ddc8c0-beba-4616-b90e-0c8384cfd4f9\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-cr52x" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.448715 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfhnm\" (UniqueName: \"kubernetes.io/projected/d3ddc8c0-beba-4616-b90e-0c8384cfd4f9-kube-api-access-bfhnm\") pod \"dnsmasq-dns-6d5b6d6b67-cr52x\" (UID: \"d3ddc8c0-beba-4616-b90e-0c8384cfd4f9\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-cr52x" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 
08:34:09.448748 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d3ddc8c0-beba-4616-b90e-0c8384cfd4f9-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-cr52x\" (UID: \"d3ddc8c0-beba-4616-b90e-0c8384cfd4f9\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-cr52x" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.448801 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d3ddc8c0-beba-4616-b90e-0c8384cfd4f9-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-cr52x\" (UID: \"d3ddc8c0-beba-4616-b90e-0c8384cfd4f9\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-cr52x" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.449839 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d3ddc8c0-beba-4616-b90e-0c8384cfd4f9-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-cr52x\" (UID: \"d3ddc8c0-beba-4616-b90e-0c8384cfd4f9\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-cr52x" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.458295 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d3ddc8c0-beba-4616-b90e-0c8384cfd4f9-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-cr52x\" (UID: \"d3ddc8c0-beba-4616-b90e-0c8384cfd4f9\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-cr52x" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.466857 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5528-account-create-nswld"] Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.478846 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3ddc8c0-beba-4616-b90e-0c8384cfd4f9-config\") pod \"dnsmasq-dns-6d5b6d6b67-cr52x\" (UID: \"d3ddc8c0-beba-4616-b90e-0c8384cfd4f9\") " 
pod="openstack/dnsmasq-dns-6d5b6d6b67-cr52x" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.480615 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d3ddc8c0-beba-4616-b90e-0c8384cfd4f9-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-cr52x\" (UID: \"d3ddc8c0-beba-4616-b90e-0c8384cfd4f9\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-cr52x" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.487138 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d3ddc8c0-beba-4616-b90e-0c8384cfd4f9-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-cr52x\" (UID: \"d3ddc8c0-beba-4616-b90e-0c8384cfd4f9\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-cr52x" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.520669 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-gdh6c"] Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.542633 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfhnm\" (UniqueName: \"kubernetes.io/projected/d3ddc8c0-beba-4616-b90e-0c8384cfd4f9-kube-api-access-bfhnm\") pod \"dnsmasq-dns-6d5b6d6b67-cr52x\" (UID: \"d3ddc8c0-beba-4616-b90e-0c8384cfd4f9\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-cr52x" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.551016 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2c43be6-62b6-4546-af1a-c3e1835da494-config-data\") pod \"glance-db-sync-gdh6c\" (UID: \"f2c43be6-62b6-4546-af1a-c3e1835da494\") " pod="openstack/glance-db-sync-gdh6c" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.551079 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f2c43be6-62b6-4546-af1a-c3e1835da494-combined-ca-bundle\") pod \"glance-db-sync-gdh6c\" (UID: \"f2c43be6-62b6-4546-af1a-c3e1835da494\") " pod="openstack/glance-db-sync-gdh6c" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.551115 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f2c43be6-62b6-4546-af1a-c3e1835da494-db-sync-config-data\") pod \"glance-db-sync-gdh6c\" (UID: \"f2c43be6-62b6-4546-af1a-c3e1835da494\") " pod="openstack/glance-db-sync-gdh6c" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.551154 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c255h\" (UniqueName: \"kubernetes.io/projected/f2c43be6-62b6-4546-af1a-c3e1835da494-kube-api-access-c255h\") pod \"glance-db-sync-gdh6c\" (UID: \"f2c43be6-62b6-4546-af1a-c3e1835da494\") " pod="openstack/glance-db-sync-gdh6c" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.582179 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-bxmbm" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.643777 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-cr52x" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.654482 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2c43be6-62b6-4546-af1a-c3e1835da494-config-data\") pod \"glance-db-sync-gdh6c\" (UID: \"f2c43be6-62b6-4546-af1a-c3e1835da494\") " pod="openstack/glance-db-sync-gdh6c" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.654542 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2c43be6-62b6-4546-af1a-c3e1835da494-combined-ca-bundle\") pod \"glance-db-sync-gdh6c\" (UID: \"f2c43be6-62b6-4546-af1a-c3e1835da494\") " pod="openstack/glance-db-sync-gdh6c" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.654613 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f2c43be6-62b6-4546-af1a-c3e1835da494-db-sync-config-data\") pod \"glance-db-sync-gdh6c\" (UID: \"f2c43be6-62b6-4546-af1a-c3e1835da494\") " pod="openstack/glance-db-sync-gdh6c" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.655330 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c255h\" (UniqueName: \"kubernetes.io/projected/f2c43be6-62b6-4546-af1a-c3e1835da494-kube-api-access-c255h\") pod \"glance-db-sync-gdh6c\" (UID: \"f2c43be6-62b6-4546-af1a-c3e1835da494\") " pod="openstack/glance-db-sync-gdh6c" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.665399 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2c43be6-62b6-4546-af1a-c3e1835da494-config-data\") pod \"glance-db-sync-gdh6c\" (UID: \"f2c43be6-62b6-4546-af1a-c3e1835da494\") " pod="openstack/glance-db-sync-gdh6c" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.666005 
5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2c43be6-62b6-4546-af1a-c3e1835da494-combined-ca-bundle\") pod \"glance-db-sync-gdh6c\" (UID: \"f2c43be6-62b6-4546-af1a-c3e1835da494\") " pod="openstack/glance-db-sync-gdh6c" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.666073 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f2c43be6-62b6-4546-af1a-c3e1835da494-db-sync-config-data\") pod \"glance-db-sync-gdh6c\" (UID: \"f2c43be6-62b6-4546-af1a-c3e1835da494\") " pod="openstack/glance-db-sync-gdh6c" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.682071 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c255h\" (UniqueName: \"kubernetes.io/projected/f2c43be6-62b6-4546-af1a-c3e1835da494-kube-api-access-c255h\") pod \"glance-db-sync-gdh6c\" (UID: \"f2c43be6-62b6-4546-af1a-c3e1835da494\") " pod="openstack/glance-db-sync-gdh6c" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.796646 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-gdh6c" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.926717 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c859244-5504-4860-b857-42183b22f02a" path="/var/lib/kubelet/pods/8c859244-5504-4860-b857-42183b22f02a/volumes" Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.927088 5025 generic.go:334] "Generic (PLEG): container finished" podID="e38deef2-87ec-483e-84dd-e1ef8822202b" containerID="35487aef47de0def75e3382a7a0b3d3512d8dc866a0cacf75aef7c41baabaca9" exitCode=0 Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.927364 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-1b46-account-create-lgjrd" event={"ID":"e38deef2-87ec-483e-84dd-e1ef8822202b","Type":"ContainerDied","Data":"35487aef47de0def75e3382a7a0b3d3512d8dc866a0cacf75aef7c41baabaca9"} Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.929927 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-1b46-account-create-lgjrd" event={"ID":"e38deef2-87ec-483e-84dd-e1ef8822202b","Type":"ContainerStarted","Data":"d5acefadd1811dea3ab5f1a8edc24ff3bcd983bdecbf1806e12ad0e48a3b96fd"} Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.936264 5025 generic.go:334] "Generic (PLEG): container finished" podID="f3d10c21-f6a2-43b1-98da-e882d1320a5f" containerID="abeeead2c75322aa86d4cafa6fe54a54de8867bb4c28dfa4b2ad7f6eba21afe7" exitCode=0 Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.936324 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5528-account-create-nswld" event={"ID":"f3d10c21-f6a2-43b1-98da-e882d1320a5f","Type":"ContainerDied","Data":"abeeead2c75322aa86d4cafa6fe54a54de8867bb4c28dfa4b2ad7f6eba21afe7"} Oct 07 08:34:09 crc kubenswrapper[5025]: I1007 08:34:09.936350 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5528-account-create-nswld" 
event={"ID":"f3d10c21-f6a2-43b1-98da-e882d1320a5f","Type":"ContainerStarted","Data":"789b916b1b307b9213a13971c04bc3f166e38105e531540eafdfe1a0d61dd4da"} Oct 07 08:34:10 crc kubenswrapper[5025]: I1007 08:34:10.045316 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-bxmbm-config-mwgl9"] Oct 07 08:34:10 crc kubenswrapper[5025]: I1007 08:34:10.142364 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-cr52x"] Oct 07 08:34:10 crc kubenswrapper[5025]: W1007 08:34:10.167762 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3ddc8c0_beba_4616_b90e_0c8384cfd4f9.slice/crio-59c4121756e99bb8de5641df07c11411ad87651993857002a1cc14ad9e75efc3 WatchSource:0}: Error finding container 59c4121756e99bb8de5641df07c11411ad87651993857002a1cc14ad9e75efc3: Status 404 returned error can't find the container with id 59c4121756e99bb8de5641df07c11411ad87651993857002a1cc14ad9e75efc3 Oct 07 08:34:10 crc kubenswrapper[5025]: I1007 08:34:10.375483 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-gdh6c"] Oct 07 08:34:10 crc kubenswrapper[5025]: I1007 08:34:10.946100 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-bxmbm-config-mwgl9" event={"ID":"bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f","Type":"ContainerStarted","Data":"19747fc3a062a51335fa4145a9a98d5ced6ebfa63f05ed7f31be5e02d828f525"} Oct 07 08:34:10 crc kubenswrapper[5025]: I1007 08:34:10.946147 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-bxmbm-config-mwgl9" event={"ID":"bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f","Type":"ContainerStarted","Data":"d3c56c527ec520c7edafea946c883dcaadedc00c1287fcb047fa08c6bfc469ae"} Oct 07 08:34:10 crc kubenswrapper[5025]: I1007 08:34:10.948966 5025 generic.go:334] "Generic (PLEG): container finished" podID="d3ddc8c0-beba-4616-b90e-0c8384cfd4f9" 
containerID="2fd0b0da34b6ea8804c8550868da4cc1c22de8217c7d20286a63e7f70d44ddd8" exitCode=0 Oct 07 08:34:10 crc kubenswrapper[5025]: I1007 08:34:10.949051 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-cr52x" event={"ID":"d3ddc8c0-beba-4616-b90e-0c8384cfd4f9","Type":"ContainerDied","Data":"2fd0b0da34b6ea8804c8550868da4cc1c22de8217c7d20286a63e7f70d44ddd8"} Oct 07 08:34:10 crc kubenswrapper[5025]: I1007 08:34:10.949095 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-cr52x" event={"ID":"d3ddc8c0-beba-4616-b90e-0c8384cfd4f9","Type":"ContainerStarted","Data":"59c4121756e99bb8de5641df07c11411ad87651993857002a1cc14ad9e75efc3"} Oct 07 08:34:10 crc kubenswrapper[5025]: I1007 08:34:10.952999 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-gdh6c" event={"ID":"f2c43be6-62b6-4546-af1a-c3e1835da494","Type":"ContainerStarted","Data":"3a44845fa7f97ff075f94459b1d36200449eb785583169a725ddd1c7600b4d79"} Oct 07 08:34:10 crc kubenswrapper[5025]: I1007 08:34:10.981067 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-bxmbm-config-mwgl9" podStartSLOduration=1.981048846 podStartE2EDuration="1.981048846s" podCreationTimestamp="2025-10-07 08:34:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:34:10.977153965 +0000 UTC m=+1057.786468119" watchObservedRunningTime="2025-10-07 08:34:10.981048846 +0000 UTC m=+1057.790362990" Oct 07 08:34:11 crc kubenswrapper[5025]: I1007 08:34:11.336019 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-1b46-account-create-lgjrd" Oct 07 08:34:11 crc kubenswrapper[5025]: I1007 08:34:11.352530 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5528-account-create-nswld" Oct 07 08:34:11 crc kubenswrapper[5025]: I1007 08:34:11.389705 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2sf4f\" (UniqueName: \"kubernetes.io/projected/f3d10c21-f6a2-43b1-98da-e882d1320a5f-kube-api-access-2sf4f\") pod \"f3d10c21-f6a2-43b1-98da-e882d1320a5f\" (UID: \"f3d10c21-f6a2-43b1-98da-e882d1320a5f\") " Oct 07 08:34:11 crc kubenswrapper[5025]: I1007 08:34:11.389923 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cb67s\" (UniqueName: \"kubernetes.io/projected/e38deef2-87ec-483e-84dd-e1ef8822202b-kube-api-access-cb67s\") pod \"e38deef2-87ec-483e-84dd-e1ef8822202b\" (UID: \"e38deef2-87ec-483e-84dd-e1ef8822202b\") " Oct 07 08:34:11 crc kubenswrapper[5025]: I1007 08:34:11.395094 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e38deef2-87ec-483e-84dd-e1ef8822202b-kube-api-access-cb67s" (OuterVolumeSpecName: "kube-api-access-cb67s") pod "e38deef2-87ec-483e-84dd-e1ef8822202b" (UID: "e38deef2-87ec-483e-84dd-e1ef8822202b"). InnerVolumeSpecName "kube-api-access-cb67s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:34:11 crc kubenswrapper[5025]: I1007 08:34:11.412868 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3d10c21-f6a2-43b1-98da-e882d1320a5f-kube-api-access-2sf4f" (OuterVolumeSpecName: "kube-api-access-2sf4f") pod "f3d10c21-f6a2-43b1-98da-e882d1320a5f" (UID: "f3d10c21-f6a2-43b1-98da-e882d1320a5f"). InnerVolumeSpecName "kube-api-access-2sf4f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:34:11 crc kubenswrapper[5025]: I1007 08:34:11.491750 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cb67s\" (UniqueName: \"kubernetes.io/projected/e38deef2-87ec-483e-84dd-e1ef8822202b-kube-api-access-cb67s\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:11 crc kubenswrapper[5025]: I1007 08:34:11.491782 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2sf4f\" (UniqueName: \"kubernetes.io/projected/f3d10c21-f6a2-43b1-98da-e882d1320a5f-kube-api-access-2sf4f\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:11 crc kubenswrapper[5025]: I1007 08:34:11.963272 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-cr52x" event={"ID":"d3ddc8c0-beba-4616-b90e-0c8384cfd4f9","Type":"ContainerStarted","Data":"7b007c19f3d91f176887dd68c56f6c0fbabbeffd7e055c01c3e3d505f64afb6d"} Oct 07 08:34:11 crc kubenswrapper[5025]: I1007 08:34:11.963643 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d5b6d6b67-cr52x" Oct 07 08:34:11 crc kubenswrapper[5025]: I1007 08:34:11.965086 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-1b46-account-create-lgjrd" Oct 07 08:34:11 crc kubenswrapper[5025]: I1007 08:34:11.965123 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-1b46-account-create-lgjrd" event={"ID":"e38deef2-87ec-483e-84dd-e1ef8822202b","Type":"ContainerDied","Data":"d5acefadd1811dea3ab5f1a8edc24ff3bcd983bdecbf1806e12ad0e48a3b96fd"} Oct 07 08:34:11 crc kubenswrapper[5025]: I1007 08:34:11.965150 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5acefadd1811dea3ab5f1a8edc24ff3bcd983bdecbf1806e12ad0e48a3b96fd" Oct 07 08:34:11 crc kubenswrapper[5025]: I1007 08:34:11.966691 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5528-account-create-nswld" event={"ID":"f3d10c21-f6a2-43b1-98da-e882d1320a5f","Type":"ContainerDied","Data":"789b916b1b307b9213a13971c04bc3f166e38105e531540eafdfe1a0d61dd4da"} Oct 07 08:34:11 crc kubenswrapper[5025]: I1007 08:34:11.966715 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="789b916b1b307b9213a13971c04bc3f166e38105e531540eafdfe1a0d61dd4da" Oct 07 08:34:11 crc kubenswrapper[5025]: I1007 08:34:11.966745 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5528-account-create-nswld"
Oct 07 08:34:11 crc kubenswrapper[5025]: I1007 08:34:11.971675 5025 generic.go:334] "Generic (PLEG): container finished" podID="bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f" containerID="19747fc3a062a51335fa4145a9a98d5ced6ebfa63f05ed7f31be5e02d828f525" exitCode=0
Oct 07 08:34:11 crc kubenswrapper[5025]: I1007 08:34:11.971721 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-bxmbm-config-mwgl9" event={"ID":"bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f","Type":"ContainerDied","Data":"19747fc3a062a51335fa4145a9a98d5ced6ebfa63f05ed7f31be5e02d828f525"}
Oct 07 08:34:11 crc kubenswrapper[5025]: I1007 08:34:11.986231 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d5b6d6b67-cr52x" podStartSLOduration=2.986208082 podStartE2EDuration="2.986208082s" podCreationTimestamp="2025-10-07 08:34:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:34:11.980716341 +0000 UTC m=+1058.790030515" watchObservedRunningTime="2025-10-07 08:34:11.986208082 +0000 UTC m=+1058.795522216"
Oct 07 08:34:13 crc kubenswrapper[5025]: I1007 08:34:13.255229 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-bxmbm-config-mwgl9"
Oct 07 08:34:13 crc kubenswrapper[5025]: I1007 08:34:13.346976 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f-var-run\") pod \"bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f\" (UID: \"bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f\") "
Oct 07 08:34:13 crc kubenswrapper[5025]: I1007 08:34:13.347037 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f-var-log-ovn\") pod \"bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f\" (UID: \"bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f\") "
Oct 07 08:34:13 crc kubenswrapper[5025]: I1007 08:34:13.347118 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f-var-run" (OuterVolumeSpecName: "var-run") pod "bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f" (UID: "bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 07 08:34:13 crc kubenswrapper[5025]: I1007 08:34:13.347161 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f" (UID: "bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 07 08:34:13 crc kubenswrapper[5025]: I1007 08:34:13.347157 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f" (UID: "bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 07 08:34:13 crc kubenswrapper[5025]: I1007 08:34:13.347131 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f-var-run-ovn\") pod \"bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f\" (UID: \"bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f\") "
Oct 07 08:34:13 crc kubenswrapper[5025]: I1007 08:34:13.347285 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f-scripts\") pod \"bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f\" (UID: \"bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f\") "
Oct 07 08:34:13 crc kubenswrapper[5025]: I1007 08:34:13.347364 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65n9x\" (UniqueName: \"kubernetes.io/projected/bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f-kube-api-access-65n9x\") pod \"bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f\" (UID: \"bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f\") "
Oct 07 08:34:13 crc kubenswrapper[5025]: I1007 08:34:13.347420 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f-additional-scripts\") pod \"bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f\" (UID: \"bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f\") "
Oct 07 08:34:13 crc kubenswrapper[5025]: I1007 08:34:13.347874 5025 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f-var-run-ovn\") on node \"crc\" DevicePath \"\""
Oct 07 08:34:13 crc kubenswrapper[5025]: I1007 08:34:13.347917 5025 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f-var-run\") on node \"crc\" DevicePath \"\""
Oct 07 08:34:13 crc kubenswrapper[5025]: I1007 08:34:13.347929 5025 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f-var-log-ovn\") on node \"crc\" DevicePath \"\""
Oct 07 08:34:13 crc kubenswrapper[5025]: I1007 08:34:13.348111 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f" (UID: "bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 08:34:13 crc kubenswrapper[5025]: I1007 08:34:13.348403 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f-scripts" (OuterVolumeSpecName: "scripts") pod "bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f" (UID: "bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 08:34:13 crc kubenswrapper[5025]: I1007 08:34:13.351148 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f-kube-api-access-65n9x" (OuterVolumeSpecName: "kube-api-access-65n9x") pod "bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f" (UID: "bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f"). InnerVolumeSpecName "kube-api-access-65n9x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 08:34:13 crc kubenswrapper[5025]: I1007 08:34:13.449427 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65n9x\" (UniqueName: \"kubernetes.io/projected/bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f-kube-api-access-65n9x\") on node \"crc\" DevicePath \"\""
Oct 07 08:34:13 crc kubenswrapper[5025]: I1007 08:34:13.449455 5025 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f-additional-scripts\") on node \"crc\" DevicePath \"\""
Oct 07 08:34:13 crc kubenswrapper[5025]: I1007 08:34:13.449465 5025 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f-scripts\") on node \"crc\" DevicePath \"\""
Oct 07 08:34:13 crc kubenswrapper[5025]: I1007 08:34:13.989196 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-bxmbm-config-mwgl9" event={"ID":"bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f","Type":"ContainerDied","Data":"d3c56c527ec520c7edafea946c883dcaadedc00c1287fcb047fa08c6bfc469ae"}
Oct 07 08:34:13 crc kubenswrapper[5025]: I1007 08:34:13.989241 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3c56c527ec520c7edafea946c883dcaadedc00c1287fcb047fa08c6bfc469ae"
Oct 07 08:34:13 crc kubenswrapper[5025]: I1007 08:34:13.989299 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-bxmbm-config-mwgl9"
Oct 07 08:34:14 crc kubenswrapper[5025]: I1007 08:34:14.326474 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-bxmbm-config-mwgl9"]
Oct 07 08:34:14 crc kubenswrapper[5025]: I1007 08:34:14.333451 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-bxmbm-config-mwgl9"]
Oct 07 08:34:15 crc kubenswrapper[5025]: I1007 08:34:15.106875 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Oct 07 08:34:15 crc kubenswrapper[5025]: I1007 08:34:15.437661 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-lcsnf"]
Oct 07 08:34:15 crc kubenswrapper[5025]: E1007 08:34:15.438247 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f" containerName="ovn-config"
Oct 07 08:34:15 crc kubenswrapper[5025]: I1007 08:34:15.438262 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f" containerName="ovn-config"
Oct 07 08:34:15 crc kubenswrapper[5025]: E1007 08:34:15.438273 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e38deef2-87ec-483e-84dd-e1ef8822202b" containerName="mariadb-account-create"
Oct 07 08:34:15 crc kubenswrapper[5025]: I1007 08:34:15.438279 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="e38deef2-87ec-483e-84dd-e1ef8822202b" containerName="mariadb-account-create"
Oct 07 08:34:15 crc kubenswrapper[5025]: E1007 08:34:15.438293 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3d10c21-f6a2-43b1-98da-e882d1320a5f" containerName="mariadb-account-create"
Oct 07 08:34:15 crc kubenswrapper[5025]: I1007 08:34:15.438299 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3d10c21-f6a2-43b1-98da-e882d1320a5f" containerName="mariadb-account-create"
Oct 07 08:34:15 crc kubenswrapper[5025]: I1007 08:34:15.438462 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="e38deef2-87ec-483e-84dd-e1ef8822202b" containerName="mariadb-account-create"
Oct 07 08:34:15 crc kubenswrapper[5025]: I1007 08:34:15.438472 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f" containerName="ovn-config"
Oct 07 08:34:15 crc kubenswrapper[5025]: I1007 08:34:15.438483 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3d10c21-f6a2-43b1-98da-e882d1320a5f" containerName="mariadb-account-create"
Oct 07 08:34:15 crc kubenswrapper[5025]: I1007 08:34:15.439012 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-lcsnf"
Oct 07 08:34:15 crc kubenswrapper[5025]: I1007 08:34:15.451933 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-lcsnf"]
Oct 07 08:34:15 crc kubenswrapper[5025]: I1007 08:34:15.482291 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzgj7\" (UniqueName: \"kubernetes.io/projected/c065f6f6-5301-4933-8794-606f5b61464e-kube-api-access-qzgj7\") pod \"cinder-db-create-lcsnf\" (UID: \"c065f6f6-5301-4933-8794-606f5b61464e\") " pod="openstack/cinder-db-create-lcsnf"
Oct 07 08:34:15 crc kubenswrapper[5025]: I1007 08:34:15.484675 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Oct 07 08:34:15 crc kubenswrapper[5025]: I1007 08:34:15.549982 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-cbqnb"]
Oct 07 08:34:15 crc kubenswrapper[5025]: I1007 08:34:15.551261 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-cbqnb"
Oct 07 08:34:15 crc kubenswrapper[5025]: I1007 08:34:15.570957 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-cbqnb"]
Oct 07 08:34:15 crc kubenswrapper[5025]: I1007 08:34:15.584656 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzgj7\" (UniqueName: \"kubernetes.io/projected/c065f6f6-5301-4933-8794-606f5b61464e-kube-api-access-qzgj7\") pod \"cinder-db-create-lcsnf\" (UID: \"c065f6f6-5301-4933-8794-606f5b61464e\") " pod="openstack/cinder-db-create-lcsnf"
Oct 07 08:34:15 crc kubenswrapper[5025]: I1007 08:34:15.617085 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzgj7\" (UniqueName: \"kubernetes.io/projected/c065f6f6-5301-4933-8794-606f5b61464e-kube-api-access-qzgj7\") pod \"cinder-db-create-lcsnf\" (UID: \"c065f6f6-5301-4933-8794-606f5b61464e\") " pod="openstack/cinder-db-create-lcsnf"
Oct 07 08:34:15 crc kubenswrapper[5025]: I1007 08:34:15.640377 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-pmhbz"]
Oct 07 08:34:15 crc kubenswrapper[5025]: I1007 08:34:15.641725 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-pmhbz"
Oct 07 08:34:15 crc kubenswrapper[5025]: I1007 08:34:15.648352 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-pmhbz"]
Oct 07 08:34:15 crc kubenswrapper[5025]: I1007 08:34:15.685775 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfvhs\" (UniqueName: \"kubernetes.io/projected/108122fc-a610-4d25-8945-5e30f770da7f-kube-api-access-rfvhs\") pod \"barbican-db-create-cbqnb\" (UID: \"108122fc-a610-4d25-8945-5e30f770da7f\") " pod="openstack/barbican-db-create-cbqnb"
Oct 07 08:34:15 crc kubenswrapper[5025]: I1007 08:34:15.685920 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqjlp\" (UniqueName: \"kubernetes.io/projected/aa176e96-74ca-47df-9388-08146166a449-kube-api-access-wqjlp\") pod \"neutron-db-create-pmhbz\" (UID: \"aa176e96-74ca-47df-9388-08146166a449\") " pod="openstack/neutron-db-create-pmhbz"
Oct 07 08:34:15 crc kubenswrapper[5025]: I1007 08:34:15.733944 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-vz2wg"]
Oct 07 08:34:15 crc kubenswrapper[5025]: I1007 08:34:15.735336 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-vz2wg"
Oct 07 08:34:15 crc kubenswrapper[5025]: I1007 08:34:15.743703 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Oct 07 08:34:15 crc kubenswrapper[5025]: I1007 08:34:15.743925 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Oct 07 08:34:15 crc kubenswrapper[5025]: I1007 08:34:15.744035 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Oct 07 08:34:15 crc kubenswrapper[5025]: I1007 08:34:15.744112 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-nfvg7"
Oct 07 08:34:15 crc kubenswrapper[5025]: I1007 08:34:15.745570 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-vz2wg"]
Oct 07 08:34:15 crc kubenswrapper[5025]: I1007 08:34:15.759679 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-lcsnf"
Oct 07 08:34:15 crc kubenswrapper[5025]: I1007 08:34:15.787397 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13598166-30eb-43b2-8a13-2e2ca72f58e9-config-data\") pod \"keystone-db-sync-vz2wg\" (UID: \"13598166-30eb-43b2-8a13-2e2ca72f58e9\") " pod="openstack/keystone-db-sync-vz2wg"
Oct 07 08:34:15 crc kubenswrapper[5025]: I1007 08:34:15.787482 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x588p\" (UniqueName: \"kubernetes.io/projected/13598166-30eb-43b2-8a13-2e2ca72f58e9-kube-api-access-x588p\") pod \"keystone-db-sync-vz2wg\" (UID: \"13598166-30eb-43b2-8a13-2e2ca72f58e9\") " pod="openstack/keystone-db-sync-vz2wg"
Oct 07 08:34:15 crc kubenswrapper[5025]: I1007 08:34:15.787556 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqjlp\" (UniqueName: \"kubernetes.io/projected/aa176e96-74ca-47df-9388-08146166a449-kube-api-access-wqjlp\") pod \"neutron-db-create-pmhbz\" (UID: \"aa176e96-74ca-47df-9388-08146166a449\") " pod="openstack/neutron-db-create-pmhbz"
Oct 07 08:34:15 crc kubenswrapper[5025]: I1007 08:34:15.787627 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13598166-30eb-43b2-8a13-2e2ca72f58e9-combined-ca-bundle\") pod \"keystone-db-sync-vz2wg\" (UID: \"13598166-30eb-43b2-8a13-2e2ca72f58e9\") " pod="openstack/keystone-db-sync-vz2wg"
Oct 07 08:34:15 crc kubenswrapper[5025]: I1007 08:34:15.787695 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfvhs\" (UniqueName: \"kubernetes.io/projected/108122fc-a610-4d25-8945-5e30f770da7f-kube-api-access-rfvhs\") pod \"barbican-db-create-cbqnb\" (UID: \"108122fc-a610-4d25-8945-5e30f770da7f\") " pod="openstack/barbican-db-create-cbqnb"
Oct 07 08:34:15 crc kubenswrapper[5025]: I1007 08:34:15.805347 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfvhs\" (UniqueName: \"kubernetes.io/projected/108122fc-a610-4d25-8945-5e30f770da7f-kube-api-access-rfvhs\") pod \"barbican-db-create-cbqnb\" (UID: \"108122fc-a610-4d25-8945-5e30f770da7f\") " pod="openstack/barbican-db-create-cbqnb"
Oct 07 08:34:15 crc kubenswrapper[5025]: I1007 08:34:15.806419 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqjlp\" (UniqueName: \"kubernetes.io/projected/aa176e96-74ca-47df-9388-08146166a449-kube-api-access-wqjlp\") pod \"neutron-db-create-pmhbz\" (UID: \"aa176e96-74ca-47df-9388-08146166a449\") " pod="openstack/neutron-db-create-pmhbz"
Oct 07 08:34:15 crc kubenswrapper[5025]: I1007 08:34:15.871007 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-cbqnb"
Oct 07 08:34:15 crc kubenswrapper[5025]: I1007 08:34:15.889614 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13598166-30eb-43b2-8a13-2e2ca72f58e9-config-data\") pod \"keystone-db-sync-vz2wg\" (UID: \"13598166-30eb-43b2-8a13-2e2ca72f58e9\") " pod="openstack/keystone-db-sync-vz2wg"
Oct 07 08:34:15 crc kubenswrapper[5025]: I1007 08:34:15.889659 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x588p\" (UniqueName: \"kubernetes.io/projected/13598166-30eb-43b2-8a13-2e2ca72f58e9-kube-api-access-x588p\") pod \"keystone-db-sync-vz2wg\" (UID: \"13598166-30eb-43b2-8a13-2e2ca72f58e9\") " pod="openstack/keystone-db-sync-vz2wg"
Oct 07 08:34:15 crc kubenswrapper[5025]: I1007 08:34:15.889727 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13598166-30eb-43b2-8a13-2e2ca72f58e9-combined-ca-bundle\") pod \"keystone-db-sync-vz2wg\" (UID: \"13598166-30eb-43b2-8a13-2e2ca72f58e9\") " pod="openstack/keystone-db-sync-vz2wg"
Oct 07 08:34:15 crc kubenswrapper[5025]: I1007 08:34:15.893706 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13598166-30eb-43b2-8a13-2e2ca72f58e9-combined-ca-bundle\") pod \"keystone-db-sync-vz2wg\" (UID: \"13598166-30eb-43b2-8a13-2e2ca72f58e9\") " pod="openstack/keystone-db-sync-vz2wg"
Oct 07 08:34:15 crc kubenswrapper[5025]: I1007 08:34:15.893792 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13598166-30eb-43b2-8a13-2e2ca72f58e9-config-data\") pod \"keystone-db-sync-vz2wg\" (UID: \"13598166-30eb-43b2-8a13-2e2ca72f58e9\") " pod="openstack/keystone-db-sync-vz2wg"
Oct 07 08:34:15 crc kubenswrapper[5025]: I1007 08:34:15.909372 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x588p\" (UniqueName: \"kubernetes.io/projected/13598166-30eb-43b2-8a13-2e2ca72f58e9-kube-api-access-x588p\") pod \"keystone-db-sync-vz2wg\" (UID: \"13598166-30eb-43b2-8a13-2e2ca72f58e9\") " pod="openstack/keystone-db-sync-vz2wg"
Oct 07 08:34:15 crc kubenswrapper[5025]: I1007 08:34:15.925027 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f" path="/var/lib/kubelet/pods/bfa1db03-f70b-4bd1-ad63-bb2fe8b79b3f/volumes"
Oct 07 08:34:15 crc kubenswrapper[5025]: I1007 08:34:15.973758 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-pmhbz"
Oct 07 08:34:16 crc kubenswrapper[5025]: I1007 08:34:16.052827 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-vz2wg"
Oct 07 08:34:19 crc kubenswrapper[5025]: I1007 08:34:19.645790 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d5b6d6b67-cr52x"
Oct 07 08:34:19 crc kubenswrapper[5025]: I1007 08:34:19.696985 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-rtggt"]
Oct 07 08:34:19 crc kubenswrapper[5025]: I1007 08:34:19.697222 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-rtggt" podUID="9ec748e5-296b-4c72-979f-bf49a6dceb02" containerName="dnsmasq-dns" containerID="cri-o://74f7937aae91a27bf6b4f33470428498e55ace843ddec617a1ada5e8d5040a1e" gracePeriod=10
Oct 07 08:34:20 crc kubenswrapper[5025]: I1007 08:34:20.038101 5025 generic.go:334] "Generic (PLEG): container finished" podID="9ec748e5-296b-4c72-979f-bf49a6dceb02" containerID="74f7937aae91a27bf6b4f33470428498e55ace843ddec617a1ada5e8d5040a1e" exitCode=0
Oct 07 08:34:20 crc kubenswrapper[5025]: I1007 08:34:20.038471 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-rtggt" event={"ID":"9ec748e5-296b-4c72-979f-bf49a6dceb02","Type":"ContainerDied","Data":"74f7937aae91a27bf6b4f33470428498e55ace843ddec617a1ada5e8d5040a1e"}
Oct 07 08:34:20 crc kubenswrapper[5025]: I1007 08:34:20.792042 5025 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-rtggt" podUID="9ec748e5-296b-4c72-979f-bf49a6dceb02" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.116:5353: connect: connection refused"
Oct 07 08:34:22 crc kubenswrapper[5025]: I1007 08:34:22.934012 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-rtggt"
Oct 07 08:34:23 crc kubenswrapper[5025]: I1007 08:34:23.018260 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ec748e5-296b-4c72-979f-bf49a6dceb02-dns-svc\") pod \"9ec748e5-296b-4c72-979f-bf49a6dceb02\" (UID: \"9ec748e5-296b-4c72-979f-bf49a6dceb02\") "
Oct 07 08:34:23 crc kubenswrapper[5025]: I1007 08:34:23.018305 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ec748e5-296b-4c72-979f-bf49a6dceb02-config\") pod \"9ec748e5-296b-4c72-979f-bf49a6dceb02\" (UID: \"9ec748e5-296b-4c72-979f-bf49a6dceb02\") "
Oct 07 08:34:23 crc kubenswrapper[5025]: I1007 08:34:23.018343 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ec748e5-296b-4c72-979f-bf49a6dceb02-ovsdbserver-sb\") pod \"9ec748e5-296b-4c72-979f-bf49a6dceb02\" (UID: \"9ec748e5-296b-4c72-979f-bf49a6dceb02\") "
Oct 07 08:34:23 crc kubenswrapper[5025]: I1007 08:34:23.018408 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ec748e5-296b-4c72-979f-bf49a6dceb02-ovsdbserver-nb\") pod \"9ec748e5-296b-4c72-979f-bf49a6dceb02\" (UID: \"9ec748e5-296b-4c72-979f-bf49a6dceb02\") "
Oct 07 08:34:23 crc kubenswrapper[5025]: I1007 08:34:23.018443 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcmxv\" (UniqueName: \"kubernetes.io/projected/9ec748e5-296b-4c72-979f-bf49a6dceb02-kube-api-access-gcmxv\") pod \"9ec748e5-296b-4c72-979f-bf49a6dceb02\" (UID: \"9ec748e5-296b-4c72-979f-bf49a6dceb02\") "
Oct 07 08:34:23 crc kubenswrapper[5025]: I1007 08:34:23.022827 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ec748e5-296b-4c72-979f-bf49a6dceb02-kube-api-access-gcmxv" (OuterVolumeSpecName: "kube-api-access-gcmxv") pod "9ec748e5-296b-4c72-979f-bf49a6dceb02" (UID: "9ec748e5-296b-4c72-979f-bf49a6dceb02"). InnerVolumeSpecName "kube-api-access-gcmxv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 08:34:23 crc kubenswrapper[5025]: I1007 08:34:23.057014 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ec748e5-296b-4c72-979f-bf49a6dceb02-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9ec748e5-296b-4c72-979f-bf49a6dceb02" (UID: "9ec748e5-296b-4c72-979f-bf49a6dceb02"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 08:34:23 crc kubenswrapper[5025]: I1007 08:34:23.058168 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ec748e5-296b-4c72-979f-bf49a6dceb02-config" (OuterVolumeSpecName: "config") pod "9ec748e5-296b-4c72-979f-bf49a6dceb02" (UID: "9ec748e5-296b-4c72-979f-bf49a6dceb02"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 08:34:23 crc kubenswrapper[5025]: I1007 08:34:23.066942 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ec748e5-296b-4c72-979f-bf49a6dceb02-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9ec748e5-296b-4c72-979f-bf49a6dceb02" (UID: "9ec748e5-296b-4c72-979f-bf49a6dceb02"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 08:34:23 crc kubenswrapper[5025]: I1007 08:34:23.074017 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ec748e5-296b-4c72-979f-bf49a6dceb02-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9ec748e5-296b-4c72-979f-bf49a6dceb02" (UID: "9ec748e5-296b-4c72-979f-bf49a6dceb02"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 08:34:23 crc kubenswrapper[5025]: I1007 08:34:23.079502 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-rtggt" event={"ID":"9ec748e5-296b-4c72-979f-bf49a6dceb02","Type":"ContainerDied","Data":"5f370a52cfb12c373a8ae20d389d6de6f6bb04b43413b6e1ff0492a186ed3b05"}
Oct 07 08:34:23 crc kubenswrapper[5025]: I1007 08:34:23.079573 5025 scope.go:117] "RemoveContainer" containerID="74f7937aae91a27bf6b4f33470428498e55ace843ddec617a1ada5e8d5040a1e"
Oct 07 08:34:23 crc kubenswrapper[5025]: I1007 08:34:23.079814 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-rtggt"
Oct 07 08:34:23 crc kubenswrapper[5025]: I1007 08:34:23.116114 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-rtggt"]
Oct 07 08:34:23 crc kubenswrapper[5025]: I1007 08:34:23.120535 5025 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ec748e5-296b-4c72-979f-bf49a6dceb02-config\") on node \"crc\" DevicePath \"\""
Oct 07 08:34:23 crc kubenswrapper[5025]: I1007 08:34:23.120576 5025 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ec748e5-296b-4c72-979f-bf49a6dceb02-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 07 08:34:23 crc kubenswrapper[5025]: I1007 08:34:23.120587 5025 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ec748e5-296b-4c72-979f-bf49a6dceb02-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 07 08:34:23 crc kubenswrapper[5025]: I1007 08:34:23.120599 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcmxv\" (UniqueName: \"kubernetes.io/projected/9ec748e5-296b-4c72-979f-bf49a6dceb02-kube-api-access-gcmxv\") on node \"crc\" DevicePath \"\""
Oct 07 08:34:23 crc kubenswrapper[5025]: I1007 08:34:23.120608 5025 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ec748e5-296b-4c72-979f-bf49a6dceb02-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 07 08:34:23 crc kubenswrapper[5025]: I1007 08:34:23.123091 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-rtggt"]
Oct 07 08:34:23 crc kubenswrapper[5025]: I1007 08:34:23.123964 5025 scope.go:117] "RemoveContainer" containerID="f294791cfaa038ec16871085a092ef3c33828bea6c8105be9a7cad946bafd253"
Oct 07 08:34:23 crc kubenswrapper[5025]: I1007 08:34:23.247634 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-lcsnf"]
Oct 07 08:34:23 crc kubenswrapper[5025]: W1007 08:34:23.252439 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc065f6f6_5301_4933_8794_606f5b61464e.slice/crio-d37321c18d2057518f934bbbcecdf8d27a791681a4235de9ed55ede15d31f438 WatchSource:0}: Error finding container d37321c18d2057518f934bbbcecdf8d27a791681a4235de9ed55ede15d31f438: Status 404 returned error can't find the container with id d37321c18d2057518f934bbbcecdf8d27a791681a4235de9ed55ede15d31f438
Oct 07 08:34:23 crc kubenswrapper[5025]: I1007 08:34:23.263308 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-vz2wg"]
Oct 07 08:34:23 crc kubenswrapper[5025]: W1007 08:34:23.269130 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13598166_30eb_43b2_8a13_2e2ca72f58e9.slice/crio-39b186f4ec501405817ec6606b0cc4a4f994d35ba597563d6b239a4f9b2277b5 WatchSource:0}: Error finding container 39b186f4ec501405817ec6606b0cc4a4f994d35ba597563d6b239a4f9b2277b5: Status 404 returned error can't find the container with id 39b186f4ec501405817ec6606b0cc4a4f994d35ba597563d6b239a4f9b2277b5
Oct 07 08:34:23 crc kubenswrapper[5025]: I1007 08:34:23.273510 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-pmhbz"]
Oct 07 08:34:23 crc kubenswrapper[5025]: I1007 08:34:23.338381 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-cbqnb"]
Oct 07 08:34:23 crc kubenswrapper[5025]: I1007 08:34:23.934072 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ec748e5-296b-4c72-979f-bf49a6dceb02" path="/var/lib/kubelet/pods/9ec748e5-296b-4c72-979f-bf49a6dceb02/volumes"
Oct 07 08:34:24 crc kubenswrapper[5025]: I1007 08:34:24.089631 5025 generic.go:334] "Generic (PLEG): container finished" podID="aa176e96-74ca-47df-9388-08146166a449" containerID="503c88cbfff97597d46cee5bfce8f453317fb9225c7e0c1875ee907541a00991" exitCode=0
Oct 07 08:34:24 crc kubenswrapper[5025]: I1007 08:34:24.090110 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-pmhbz" event={"ID":"aa176e96-74ca-47df-9388-08146166a449","Type":"ContainerDied","Data":"503c88cbfff97597d46cee5bfce8f453317fb9225c7e0c1875ee907541a00991"}
Oct 07 08:34:24 crc kubenswrapper[5025]: I1007 08:34:24.090136 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-pmhbz" event={"ID":"aa176e96-74ca-47df-9388-08146166a449","Type":"ContainerStarted","Data":"7920ff752e9c78de88020e4e4f93498cc98945e1e7ea1d69e7445e250d56e6a7"}
Oct 07 08:34:24 crc kubenswrapper[5025]: I1007 08:34:24.091994 5025 generic.go:334] "Generic (PLEG): container finished" podID="c065f6f6-5301-4933-8794-606f5b61464e" containerID="843d4136d3328eeac05dabc41a4cb8468846a3512ab6b8c6f9ddc40fb150090d" exitCode=0
Oct 07 08:34:24 crc kubenswrapper[5025]: I1007 08:34:24.092093 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-lcsnf" event={"ID":"c065f6f6-5301-4933-8794-606f5b61464e","Type":"ContainerDied","Data":"843d4136d3328eeac05dabc41a4cb8468846a3512ab6b8c6f9ddc40fb150090d"}
Oct 07 08:34:24 crc kubenswrapper[5025]: I1007 08:34:24.092164 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-lcsnf" event={"ID":"c065f6f6-5301-4933-8794-606f5b61464e","Type":"ContainerStarted","Data":"d37321c18d2057518f934bbbcecdf8d27a791681a4235de9ed55ede15d31f438"}
Oct 07 08:34:24 crc kubenswrapper[5025]: I1007 08:34:24.093714 5025 generic.go:334] "Generic (PLEG): container finished" podID="108122fc-a610-4d25-8945-5e30f770da7f" containerID="8cc6b65c04574b0ccb31f07b887838e143249f7a2e3d197999fac731f4f5cdf3" exitCode=0
Oct 07 08:34:24 crc kubenswrapper[5025]: I1007 08:34:24.093814 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-cbqnb" event={"ID":"108122fc-a610-4d25-8945-5e30f770da7f","Type":"ContainerDied","Data":"8cc6b65c04574b0ccb31f07b887838e143249f7a2e3d197999fac731f4f5cdf3"}
Oct 07 08:34:24 crc kubenswrapper[5025]: I1007 08:34:24.093839 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-cbqnb" event={"ID":"108122fc-a610-4d25-8945-5e30f770da7f","Type":"ContainerStarted","Data":"6efb217fe8c662c50073a79a1ed115ea7125f46dc1dbce0f8e5a471862a81f4c"}
Oct 07 08:34:24 crc kubenswrapper[5025]: I1007 08:34:24.099254 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vz2wg" event={"ID":"13598166-30eb-43b2-8a13-2e2ca72f58e9","Type":"ContainerStarted","Data":"39b186f4ec501405817ec6606b0cc4a4f994d35ba597563d6b239a4f9b2277b5"}
Oct 07 08:34:24 crc kubenswrapper[5025]: I1007 08:34:24.100853 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-gdh6c" event={"ID":"f2c43be6-62b6-4546-af1a-c3e1835da494","Type":"ContainerStarted","Data":"f5f91a3a15b66c0647f9504a10bc37bb459966180b4d1e14f9fb21cb10d63d80"}
Oct 07 08:34:24 crc kubenswrapper[5025]: I1007 08:34:24.162053 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-gdh6c" podStartSLOduration=2.732511738 podStartE2EDuration="15.162029019s" podCreationTimestamp="2025-10-07 08:34:09 +0000 UTC" firstStartedPulling="2025-10-07 08:34:10.384216009 +0000 UTC m=+1057.193530153" lastFinishedPulling="2025-10-07 08:34:22.81373328 +0000 UTC m=+1069.623047434" observedRunningTime="2025-10-07 08:34:24.131631811 +0000 UTC m=+1070.940945975" watchObservedRunningTime="2025-10-07 08:34:24.162029019 +0000 UTC m=+1070.971343163"
Oct 07 08:34:28 crc kubenswrapper[5025]: I1007 08:34:28.138342 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-lcsnf" event={"ID":"c065f6f6-5301-4933-8794-606f5b61464e","Type":"ContainerDied","Data":"d37321c18d2057518f934bbbcecdf8d27a791681a4235de9ed55ede15d31f438"}
Oct 07 08:34:28 crc kubenswrapper[5025]: I1007 08:34:28.138885 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d37321c18d2057518f934bbbcecdf8d27a791681a4235de9ed55ede15d31f438"
Oct 07 08:34:28 crc kubenswrapper[5025]: I1007 08:34:28.140991 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-cbqnb" event={"ID":"108122fc-a610-4d25-8945-5e30f770da7f","Type":"ContainerDied","Data":"6efb217fe8c662c50073a79a1ed115ea7125f46dc1dbce0f8e5a471862a81f4c"}
Oct 07 08:34:28 crc kubenswrapper[5025]: I1007 08:34:28.141013 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6efb217fe8c662c50073a79a1ed115ea7125f46dc1dbce0f8e5a471862a81f4c"
Oct 07 08:34:28 crc kubenswrapper[5025]: I1007 08:34:28.143393 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-pmhbz" event={"ID":"aa176e96-74ca-47df-9388-08146166a449","Type":"ContainerDied","Data":"7920ff752e9c78de88020e4e4f93498cc98945e1e7ea1d69e7445e250d56e6a7"}
Oct 07 08:34:28 crc kubenswrapper[5025]: I1007 08:34:28.143586 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7920ff752e9c78de88020e4e4f93498cc98945e1e7ea1d69e7445e250d56e6a7"
Oct 07 08:34:28 crc kubenswrapper[5025]: I1007 08:34:28.255519 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-lcsnf"
Oct 07 08:34:28 crc kubenswrapper[5025]: I1007 08:34:28.261506 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-pmhbz"
Oct 07 08:34:28 crc kubenswrapper[5025]: I1007 08:34:28.268390 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-cbqnb"
Oct 07 08:34:28 crc kubenswrapper[5025]: I1007 08:34:28.318695 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzgj7\" (UniqueName: \"kubernetes.io/projected/c065f6f6-5301-4933-8794-606f5b61464e-kube-api-access-qzgj7\") pod \"c065f6f6-5301-4933-8794-606f5b61464e\" (UID: \"c065f6f6-5301-4933-8794-606f5b61464e\") "
Oct 07 08:34:28 crc kubenswrapper[5025]: I1007 08:34:28.319478 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfvhs\" (UniqueName: \"kubernetes.io/projected/108122fc-a610-4d25-8945-5e30f770da7f-kube-api-access-rfvhs\") pod \"108122fc-a610-4d25-8945-5e30f770da7f\" (UID: \"108122fc-a610-4d25-8945-5e30f770da7f\") "
Oct 07 08:34:28 crc kubenswrapper[5025]: I1007 08:34:28.319601 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqjlp\" (UniqueName: \"kubernetes.io/projected/aa176e96-74ca-47df-9388-08146166a449-kube-api-access-wqjlp\") pod \"aa176e96-74ca-47df-9388-08146166a449\" (UID: \"aa176e96-74ca-47df-9388-08146166a449\") "
Oct 07 08:34:28 crc kubenswrapper[5025]: I1007 08:34:28.331431 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/108122fc-a610-4d25-8945-5e30f770da7f-kube-api-access-rfvhs" (OuterVolumeSpecName: "kube-api-access-rfvhs") pod "108122fc-a610-4d25-8945-5e30f770da7f" (UID: "108122fc-a610-4d25-8945-5e30f770da7f"). InnerVolumeSpecName "kube-api-access-rfvhs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 08:34:28 crc kubenswrapper[5025]: I1007 08:34:28.340461 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa176e96-74ca-47df-9388-08146166a449-kube-api-access-wqjlp" (OuterVolumeSpecName: "kube-api-access-wqjlp") pod "aa176e96-74ca-47df-9388-08146166a449" (UID: "aa176e96-74ca-47df-9388-08146166a449").
InnerVolumeSpecName "kube-api-access-wqjlp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:34:28 crc kubenswrapper[5025]: I1007 08:34:28.340672 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c065f6f6-5301-4933-8794-606f5b61464e-kube-api-access-qzgj7" (OuterVolumeSpecName: "kube-api-access-qzgj7") pod "c065f6f6-5301-4933-8794-606f5b61464e" (UID: "c065f6f6-5301-4933-8794-606f5b61464e"). InnerVolumeSpecName "kube-api-access-qzgj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:34:28 crc kubenswrapper[5025]: I1007 08:34:28.422096 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfvhs\" (UniqueName: \"kubernetes.io/projected/108122fc-a610-4d25-8945-5e30f770da7f-kube-api-access-rfvhs\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:28 crc kubenswrapper[5025]: I1007 08:34:28.422132 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqjlp\" (UniqueName: \"kubernetes.io/projected/aa176e96-74ca-47df-9388-08146166a449-kube-api-access-wqjlp\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:28 crc kubenswrapper[5025]: I1007 08:34:28.422144 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzgj7\" (UniqueName: \"kubernetes.io/projected/c065f6f6-5301-4933-8794-606f5b61464e-kube-api-access-qzgj7\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:29 crc kubenswrapper[5025]: I1007 08:34:29.161652 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-lcsnf" Oct 07 08:34:29 crc kubenswrapper[5025]: I1007 08:34:29.161696 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vz2wg" event={"ID":"13598166-30eb-43b2-8a13-2e2ca72f58e9","Type":"ContainerStarted","Data":"7ab426fa5e4d1132168e5853d0fb94d09b71ae76e12935bbb94a56b4cfdc5f2b"} Oct 07 08:34:29 crc kubenswrapper[5025]: I1007 08:34:29.161779 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-cbqnb" Oct 07 08:34:29 crc kubenswrapper[5025]: I1007 08:34:29.161833 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-pmhbz" Oct 07 08:34:29 crc kubenswrapper[5025]: I1007 08:34:29.182988 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-vz2wg" podStartSLOduration=9.349191667 podStartE2EDuration="14.182972944s" podCreationTimestamp="2025-10-07 08:34:15 +0000 UTC" firstStartedPulling="2025-10-07 08:34:23.279327434 +0000 UTC m=+1070.088641578" lastFinishedPulling="2025-10-07 08:34:28.113108661 +0000 UTC m=+1074.922422855" observedRunningTime="2025-10-07 08:34:29.182132328 +0000 UTC m=+1075.991446482" watchObservedRunningTime="2025-10-07 08:34:29.182972944 +0000 UTC m=+1075.992287088" Oct 07 08:34:30 crc kubenswrapper[5025]: I1007 08:34:30.172361 5025 generic.go:334] "Generic (PLEG): container finished" podID="f2c43be6-62b6-4546-af1a-c3e1835da494" containerID="f5f91a3a15b66c0647f9504a10bc37bb459966180b4d1e14f9fb21cb10d63d80" exitCode=0 Oct 07 08:34:30 crc kubenswrapper[5025]: I1007 08:34:30.172498 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-gdh6c" event={"ID":"f2c43be6-62b6-4546-af1a-c3e1835da494","Type":"ContainerDied","Data":"f5f91a3a15b66c0647f9504a10bc37bb459966180b4d1e14f9fb21cb10d63d80"} Oct 07 08:34:31 crc kubenswrapper[5025]: I1007 08:34:31.647837 5025 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-gdh6c" Oct 07 08:34:31 crc kubenswrapper[5025]: I1007 08:34:31.680437 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c255h\" (UniqueName: \"kubernetes.io/projected/f2c43be6-62b6-4546-af1a-c3e1835da494-kube-api-access-c255h\") pod \"f2c43be6-62b6-4546-af1a-c3e1835da494\" (UID: \"f2c43be6-62b6-4546-af1a-c3e1835da494\") " Oct 07 08:34:31 crc kubenswrapper[5025]: I1007 08:34:31.680622 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2c43be6-62b6-4546-af1a-c3e1835da494-combined-ca-bundle\") pod \"f2c43be6-62b6-4546-af1a-c3e1835da494\" (UID: \"f2c43be6-62b6-4546-af1a-c3e1835da494\") " Oct 07 08:34:31 crc kubenswrapper[5025]: I1007 08:34:31.680713 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f2c43be6-62b6-4546-af1a-c3e1835da494-db-sync-config-data\") pod \"f2c43be6-62b6-4546-af1a-c3e1835da494\" (UID: \"f2c43be6-62b6-4546-af1a-c3e1835da494\") " Oct 07 08:34:31 crc kubenswrapper[5025]: I1007 08:34:31.680896 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2c43be6-62b6-4546-af1a-c3e1835da494-config-data\") pod \"f2c43be6-62b6-4546-af1a-c3e1835da494\" (UID: \"f2c43be6-62b6-4546-af1a-c3e1835da494\") " Oct 07 08:34:31 crc kubenswrapper[5025]: I1007 08:34:31.689147 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2c43be6-62b6-4546-af1a-c3e1835da494-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "f2c43be6-62b6-4546-af1a-c3e1835da494" (UID: "f2c43be6-62b6-4546-af1a-c3e1835da494"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:34:31 crc kubenswrapper[5025]: I1007 08:34:31.689450 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2c43be6-62b6-4546-af1a-c3e1835da494-kube-api-access-c255h" (OuterVolumeSpecName: "kube-api-access-c255h") pod "f2c43be6-62b6-4546-af1a-c3e1835da494" (UID: "f2c43be6-62b6-4546-af1a-c3e1835da494"). InnerVolumeSpecName "kube-api-access-c255h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:34:31 crc kubenswrapper[5025]: I1007 08:34:31.716524 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2c43be6-62b6-4546-af1a-c3e1835da494-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f2c43be6-62b6-4546-af1a-c3e1835da494" (UID: "f2c43be6-62b6-4546-af1a-c3e1835da494"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:34:31 crc kubenswrapper[5025]: I1007 08:34:31.762251 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2c43be6-62b6-4546-af1a-c3e1835da494-config-data" (OuterVolumeSpecName: "config-data") pod "f2c43be6-62b6-4546-af1a-c3e1835da494" (UID: "f2c43be6-62b6-4546-af1a-c3e1835da494"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:34:31 crc kubenswrapper[5025]: I1007 08:34:31.783467 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c255h\" (UniqueName: \"kubernetes.io/projected/f2c43be6-62b6-4546-af1a-c3e1835da494-kube-api-access-c255h\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:31 crc kubenswrapper[5025]: I1007 08:34:31.783532 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2c43be6-62b6-4546-af1a-c3e1835da494-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:31 crc kubenswrapper[5025]: I1007 08:34:31.783599 5025 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f2c43be6-62b6-4546-af1a-c3e1835da494-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:31 crc kubenswrapper[5025]: I1007 08:34:31.783611 5025 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2c43be6-62b6-4546-af1a-c3e1835da494-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:32 crc kubenswrapper[5025]: I1007 08:34:32.194020 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-gdh6c" Oct 07 08:34:32 crc kubenswrapper[5025]: I1007 08:34:32.194013 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-gdh6c" event={"ID":"f2c43be6-62b6-4546-af1a-c3e1835da494","Type":"ContainerDied","Data":"3a44845fa7f97ff075f94459b1d36200449eb785583169a725ddd1c7600b4d79"} Oct 07 08:34:32 crc kubenswrapper[5025]: I1007 08:34:32.194203 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a44845fa7f97ff075f94459b1d36200449eb785583169a725ddd1c7600b4d79" Oct 07 08:34:32 crc kubenswrapper[5025]: I1007 08:34:32.196287 5025 generic.go:334] "Generic (PLEG): container finished" podID="13598166-30eb-43b2-8a13-2e2ca72f58e9" containerID="7ab426fa5e4d1132168e5853d0fb94d09b71ae76e12935bbb94a56b4cfdc5f2b" exitCode=0 Oct 07 08:34:32 crc kubenswrapper[5025]: I1007 08:34:32.196350 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vz2wg" event={"ID":"13598166-30eb-43b2-8a13-2e2ca72f58e9","Type":"ContainerDied","Data":"7ab426fa5e4d1132168e5853d0fb94d09b71ae76e12935bbb94a56b4cfdc5f2b"} Oct 07 08:34:32 crc kubenswrapper[5025]: I1007 08:34:32.582463 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-znfc5"] Oct 07 08:34:32 crc kubenswrapper[5025]: E1007 08:34:32.587747 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="108122fc-a610-4d25-8945-5e30f770da7f" containerName="mariadb-database-create" Oct 07 08:34:32 crc kubenswrapper[5025]: I1007 08:34:32.587800 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="108122fc-a610-4d25-8945-5e30f770da7f" containerName="mariadb-database-create" Oct 07 08:34:32 crc kubenswrapper[5025]: E1007 08:34:32.587845 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ec748e5-296b-4c72-979f-bf49a6dceb02" containerName="dnsmasq-dns" Oct 07 08:34:32 crc kubenswrapper[5025]: I1007 08:34:32.587853 5025 
state_mem.go:107] "Deleted CPUSet assignment" podUID="9ec748e5-296b-4c72-979f-bf49a6dceb02" containerName="dnsmasq-dns" Oct 07 08:34:32 crc kubenswrapper[5025]: E1007 08:34:32.587865 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa176e96-74ca-47df-9388-08146166a449" containerName="mariadb-database-create" Oct 07 08:34:32 crc kubenswrapper[5025]: I1007 08:34:32.587872 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa176e96-74ca-47df-9388-08146166a449" containerName="mariadb-database-create" Oct 07 08:34:32 crc kubenswrapper[5025]: E1007 08:34:32.587908 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ec748e5-296b-4c72-979f-bf49a6dceb02" containerName="init" Oct 07 08:34:32 crc kubenswrapper[5025]: I1007 08:34:32.587915 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ec748e5-296b-4c72-979f-bf49a6dceb02" containerName="init" Oct 07 08:34:32 crc kubenswrapper[5025]: E1007 08:34:32.587931 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2c43be6-62b6-4546-af1a-c3e1835da494" containerName="glance-db-sync" Oct 07 08:34:32 crc kubenswrapper[5025]: I1007 08:34:32.587938 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2c43be6-62b6-4546-af1a-c3e1835da494" containerName="glance-db-sync" Oct 07 08:34:32 crc kubenswrapper[5025]: E1007 08:34:32.587977 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c065f6f6-5301-4933-8794-606f5b61464e" containerName="mariadb-database-create" Oct 07 08:34:32 crc kubenswrapper[5025]: I1007 08:34:32.587983 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="c065f6f6-5301-4933-8794-606f5b61464e" containerName="mariadb-database-create" Oct 07 08:34:32 crc kubenswrapper[5025]: I1007 08:34:32.588675 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ec748e5-296b-4c72-979f-bf49a6dceb02" containerName="dnsmasq-dns" Oct 07 08:34:32 crc kubenswrapper[5025]: I1007 08:34:32.588692 5025 
memory_manager.go:354] "RemoveStaleState removing state" podUID="c065f6f6-5301-4933-8794-606f5b61464e" containerName="mariadb-database-create" Oct 07 08:34:32 crc kubenswrapper[5025]: I1007 08:34:32.588723 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2c43be6-62b6-4546-af1a-c3e1835da494" containerName="glance-db-sync" Oct 07 08:34:32 crc kubenswrapper[5025]: I1007 08:34:32.588737 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="108122fc-a610-4d25-8945-5e30f770da7f" containerName="mariadb-database-create" Oct 07 08:34:32 crc kubenswrapper[5025]: I1007 08:34:32.588749 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa176e96-74ca-47df-9388-08146166a449" containerName="mariadb-database-create" Oct 07 08:34:32 crc kubenswrapper[5025]: I1007 08:34:32.590324 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-znfc5" Oct 07 08:34:32 crc kubenswrapper[5025]: I1007 08:34:32.606683 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-znfc5"] Oct 07 08:34:32 crc kubenswrapper[5025]: I1007 08:34:32.700007 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e5caa67-49ae-4d86-bdeb-a51885061759-config\") pod \"dnsmasq-dns-895cf5cf-znfc5\" (UID: \"7e5caa67-49ae-4d86-bdeb-a51885061759\") " pod="openstack/dnsmasq-dns-895cf5cf-znfc5" Oct 07 08:34:32 crc kubenswrapper[5025]: I1007 08:34:32.700047 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e5caa67-49ae-4d86-bdeb-a51885061759-ovsdbserver-sb\") pod \"dnsmasq-dns-895cf5cf-znfc5\" (UID: \"7e5caa67-49ae-4d86-bdeb-a51885061759\") " pod="openstack/dnsmasq-dns-895cf5cf-znfc5" Oct 07 08:34:32 crc kubenswrapper[5025]: I1007 08:34:32.700077 5025 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e5caa67-49ae-4d86-bdeb-a51885061759-dns-swift-storage-0\") pod \"dnsmasq-dns-895cf5cf-znfc5\" (UID: \"7e5caa67-49ae-4d86-bdeb-a51885061759\") " pod="openstack/dnsmasq-dns-895cf5cf-znfc5" Oct 07 08:34:32 crc kubenswrapper[5025]: I1007 08:34:32.700192 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e5caa67-49ae-4d86-bdeb-a51885061759-ovsdbserver-nb\") pod \"dnsmasq-dns-895cf5cf-znfc5\" (UID: \"7e5caa67-49ae-4d86-bdeb-a51885061759\") " pod="openstack/dnsmasq-dns-895cf5cf-znfc5" Oct 07 08:34:32 crc kubenswrapper[5025]: I1007 08:34:32.700318 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw5cq\" (UniqueName: \"kubernetes.io/projected/7e5caa67-49ae-4d86-bdeb-a51885061759-kube-api-access-cw5cq\") pod \"dnsmasq-dns-895cf5cf-znfc5\" (UID: \"7e5caa67-49ae-4d86-bdeb-a51885061759\") " pod="openstack/dnsmasq-dns-895cf5cf-znfc5" Oct 07 08:34:32 crc kubenswrapper[5025]: I1007 08:34:32.700339 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e5caa67-49ae-4d86-bdeb-a51885061759-dns-svc\") pod \"dnsmasq-dns-895cf5cf-znfc5\" (UID: \"7e5caa67-49ae-4d86-bdeb-a51885061759\") " pod="openstack/dnsmasq-dns-895cf5cf-znfc5" Oct 07 08:34:32 crc kubenswrapper[5025]: I1007 08:34:32.802159 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e5caa67-49ae-4d86-bdeb-a51885061759-config\") pod \"dnsmasq-dns-895cf5cf-znfc5\" (UID: \"7e5caa67-49ae-4d86-bdeb-a51885061759\") " pod="openstack/dnsmasq-dns-895cf5cf-znfc5" Oct 07 08:34:32 crc kubenswrapper[5025]: I1007 08:34:32.802212 5025 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e5caa67-49ae-4d86-bdeb-a51885061759-ovsdbserver-sb\") pod \"dnsmasq-dns-895cf5cf-znfc5\" (UID: \"7e5caa67-49ae-4d86-bdeb-a51885061759\") " pod="openstack/dnsmasq-dns-895cf5cf-znfc5" Oct 07 08:34:32 crc kubenswrapper[5025]: I1007 08:34:32.802237 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e5caa67-49ae-4d86-bdeb-a51885061759-dns-swift-storage-0\") pod \"dnsmasq-dns-895cf5cf-znfc5\" (UID: \"7e5caa67-49ae-4d86-bdeb-a51885061759\") " pod="openstack/dnsmasq-dns-895cf5cf-znfc5" Oct 07 08:34:32 crc kubenswrapper[5025]: I1007 08:34:32.802266 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e5caa67-49ae-4d86-bdeb-a51885061759-ovsdbserver-nb\") pod \"dnsmasq-dns-895cf5cf-znfc5\" (UID: \"7e5caa67-49ae-4d86-bdeb-a51885061759\") " pod="openstack/dnsmasq-dns-895cf5cf-znfc5" Oct 07 08:34:32 crc kubenswrapper[5025]: I1007 08:34:32.803223 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e5caa67-49ae-4d86-bdeb-a51885061759-config\") pod \"dnsmasq-dns-895cf5cf-znfc5\" (UID: \"7e5caa67-49ae-4d86-bdeb-a51885061759\") " pod="openstack/dnsmasq-dns-895cf5cf-znfc5" Oct 07 08:34:32 crc kubenswrapper[5025]: I1007 08:34:32.803387 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e5caa67-49ae-4d86-bdeb-a51885061759-dns-swift-storage-0\") pod \"dnsmasq-dns-895cf5cf-znfc5\" (UID: \"7e5caa67-49ae-4d86-bdeb-a51885061759\") " pod="openstack/dnsmasq-dns-895cf5cf-znfc5" Oct 07 08:34:32 crc kubenswrapper[5025]: I1007 08:34:32.803406 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/7e5caa67-49ae-4d86-bdeb-a51885061759-ovsdbserver-nb\") pod \"dnsmasq-dns-895cf5cf-znfc5\" (UID: \"7e5caa67-49ae-4d86-bdeb-a51885061759\") " pod="openstack/dnsmasq-dns-895cf5cf-znfc5" Oct 07 08:34:32 crc kubenswrapper[5025]: I1007 08:34:32.803905 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cw5cq\" (UniqueName: \"kubernetes.io/projected/7e5caa67-49ae-4d86-bdeb-a51885061759-kube-api-access-cw5cq\") pod \"dnsmasq-dns-895cf5cf-znfc5\" (UID: \"7e5caa67-49ae-4d86-bdeb-a51885061759\") " pod="openstack/dnsmasq-dns-895cf5cf-znfc5" Oct 07 08:34:32 crc kubenswrapper[5025]: I1007 08:34:32.804381 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e5caa67-49ae-4d86-bdeb-a51885061759-ovsdbserver-sb\") pod \"dnsmasq-dns-895cf5cf-znfc5\" (UID: \"7e5caa67-49ae-4d86-bdeb-a51885061759\") " pod="openstack/dnsmasq-dns-895cf5cf-znfc5" Oct 07 08:34:32 crc kubenswrapper[5025]: I1007 08:34:32.804843 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e5caa67-49ae-4d86-bdeb-a51885061759-dns-svc\") pod \"dnsmasq-dns-895cf5cf-znfc5\" (UID: \"7e5caa67-49ae-4d86-bdeb-a51885061759\") " pod="openstack/dnsmasq-dns-895cf5cf-znfc5" Oct 07 08:34:32 crc kubenswrapper[5025]: I1007 08:34:32.803940 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e5caa67-49ae-4d86-bdeb-a51885061759-dns-svc\") pod \"dnsmasq-dns-895cf5cf-znfc5\" (UID: \"7e5caa67-49ae-4d86-bdeb-a51885061759\") " pod="openstack/dnsmasq-dns-895cf5cf-znfc5" Oct 07 08:34:32 crc kubenswrapper[5025]: I1007 08:34:32.821326 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw5cq\" (UniqueName: \"kubernetes.io/projected/7e5caa67-49ae-4d86-bdeb-a51885061759-kube-api-access-cw5cq\") pod 
\"dnsmasq-dns-895cf5cf-znfc5\" (UID: \"7e5caa67-49ae-4d86-bdeb-a51885061759\") " pod="openstack/dnsmasq-dns-895cf5cf-znfc5" Oct 07 08:34:32 crc kubenswrapper[5025]: I1007 08:34:32.926906 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-znfc5" Oct 07 08:34:33 crc kubenswrapper[5025]: I1007 08:34:33.376580 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-znfc5"] Oct 07 08:34:33 crc kubenswrapper[5025]: I1007 08:34:33.469583 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-vz2wg" Oct 07 08:34:33 crc kubenswrapper[5025]: I1007 08:34:33.530463 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13598166-30eb-43b2-8a13-2e2ca72f58e9-combined-ca-bundle\") pod \"13598166-30eb-43b2-8a13-2e2ca72f58e9\" (UID: \"13598166-30eb-43b2-8a13-2e2ca72f58e9\") " Oct 07 08:34:33 crc kubenswrapper[5025]: I1007 08:34:33.530550 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13598166-30eb-43b2-8a13-2e2ca72f58e9-config-data\") pod \"13598166-30eb-43b2-8a13-2e2ca72f58e9\" (UID: \"13598166-30eb-43b2-8a13-2e2ca72f58e9\") " Oct 07 08:34:33 crc kubenswrapper[5025]: I1007 08:34:33.530651 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x588p\" (UniqueName: \"kubernetes.io/projected/13598166-30eb-43b2-8a13-2e2ca72f58e9-kube-api-access-x588p\") pod \"13598166-30eb-43b2-8a13-2e2ca72f58e9\" (UID: \"13598166-30eb-43b2-8a13-2e2ca72f58e9\") " Oct 07 08:34:33 crc kubenswrapper[5025]: I1007 08:34:33.542079 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13598166-30eb-43b2-8a13-2e2ca72f58e9-kube-api-access-x588p" (OuterVolumeSpecName: "kube-api-access-x588p") pod 
"13598166-30eb-43b2-8a13-2e2ca72f58e9" (UID: "13598166-30eb-43b2-8a13-2e2ca72f58e9"). InnerVolumeSpecName "kube-api-access-x588p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:34:33 crc kubenswrapper[5025]: I1007 08:34:33.569639 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13598166-30eb-43b2-8a13-2e2ca72f58e9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "13598166-30eb-43b2-8a13-2e2ca72f58e9" (UID: "13598166-30eb-43b2-8a13-2e2ca72f58e9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:34:33 crc kubenswrapper[5025]: I1007 08:34:33.590685 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13598166-30eb-43b2-8a13-2e2ca72f58e9-config-data" (OuterVolumeSpecName: "config-data") pod "13598166-30eb-43b2-8a13-2e2ca72f58e9" (UID: "13598166-30eb-43b2-8a13-2e2ca72f58e9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:34:33 crc kubenswrapper[5025]: I1007 08:34:33.632273 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13598166-30eb-43b2-8a13-2e2ca72f58e9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:33 crc kubenswrapper[5025]: I1007 08:34:33.632306 5025 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13598166-30eb-43b2-8a13-2e2ca72f58e9-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:33 crc kubenswrapper[5025]: I1007 08:34:33.632315 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x588p\" (UniqueName: \"kubernetes.io/projected/13598166-30eb-43b2-8a13-2e2ca72f58e9-kube-api-access-x588p\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.211884 5025 generic.go:334] "Generic (PLEG): container finished" podID="7e5caa67-49ae-4d86-bdeb-a51885061759" containerID="077395bab325004e6de6a84db8862f22494031f5b78a32a0e6f5ea34d6f6ddcd" exitCode=0 Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.212102 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-znfc5" event={"ID":"7e5caa67-49ae-4d86-bdeb-a51885061759","Type":"ContainerDied","Data":"077395bab325004e6de6a84db8862f22494031f5b78a32a0e6f5ea34d6f6ddcd"} Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.212383 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-znfc5" event={"ID":"7e5caa67-49ae-4d86-bdeb-a51885061759","Type":"ContainerStarted","Data":"3d5660b5e4cee3a392a51c11152bf5d65fc2600cf993d25d39c66c4cbac73ce6"} Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.214447 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vz2wg" 
event={"ID":"13598166-30eb-43b2-8a13-2e2ca72f58e9","Type":"ContainerDied","Data":"39b186f4ec501405817ec6606b0cc4a4f994d35ba597563d6b239a4f9b2277b5"} Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.214482 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39b186f4ec501405817ec6606b0cc4a4f994d35ba597563d6b239a4f9b2277b5" Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.214527 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-vz2wg" Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.471207 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-znfc5"] Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.576878 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-csqwc"] Oct 07 08:34:34 crc kubenswrapper[5025]: E1007 08:34:34.577657 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13598166-30eb-43b2-8a13-2e2ca72f58e9" containerName="keystone-db-sync" Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.577679 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="13598166-30eb-43b2-8a13-2e2ca72f58e9" containerName="keystone-db-sync" Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.577906 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="13598166-30eb-43b2-8a13-2e2ca72f58e9" containerName="keystone-db-sync" Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.578441 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-csqwc" Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.591403 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.591711 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-nfvg7" Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.592040 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.595158 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-rr5wz"] Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.597074 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-rr5wz" Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.605101 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.632756 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-rr5wz"] Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.654042 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc-combined-ca-bundle\") pod \"keystone-bootstrap-csqwc\" (UID: \"3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc\") " pod="openstack/keystone-bootstrap-csqwc" Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.654146 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc-config-data\") pod \"keystone-bootstrap-csqwc\" (UID: \"3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc\") " 
pod="openstack/keystone-bootstrap-csqwc" Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.654202 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22dbbe5c-6887-419e-b067-7d400a3a99eb-dns-svc\") pod \"dnsmasq-dns-6c9c9f998c-rr5wz\" (UID: \"22dbbe5c-6887-419e-b067-7d400a3a99eb\") " pod="openstack/dnsmasq-dns-6c9c9f998c-rr5wz" Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.654231 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22dbbe5c-6887-419e-b067-7d400a3a99eb-config\") pod \"dnsmasq-dns-6c9c9f998c-rr5wz\" (UID: \"22dbbe5c-6887-419e-b067-7d400a3a99eb\") " pod="openstack/dnsmasq-dns-6c9c9f998c-rr5wz" Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.654320 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22dbbe5c-6887-419e-b067-7d400a3a99eb-ovsdbserver-nb\") pod \"dnsmasq-dns-6c9c9f998c-rr5wz\" (UID: \"22dbbe5c-6887-419e-b067-7d400a3a99eb\") " pod="openstack/dnsmasq-dns-6c9c9f998c-rr5wz" Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.654373 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkh5f\" (UniqueName: \"kubernetes.io/projected/3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc-kube-api-access-mkh5f\") pod \"keystone-bootstrap-csqwc\" (UID: \"3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc\") " pod="openstack/keystone-bootstrap-csqwc" Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.654405 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/22dbbe5c-6887-419e-b067-7d400a3a99eb-dns-swift-storage-0\") pod \"dnsmasq-dns-6c9c9f998c-rr5wz\" (UID: 
\"22dbbe5c-6887-419e-b067-7d400a3a99eb\") " pod="openstack/dnsmasq-dns-6c9c9f998c-rr5wz" Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.654461 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc-fernet-keys\") pod \"keystone-bootstrap-csqwc\" (UID: \"3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc\") " pod="openstack/keystone-bootstrap-csqwc" Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.654492 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22dbbe5c-6887-419e-b067-7d400a3a99eb-ovsdbserver-sb\") pod \"dnsmasq-dns-6c9c9f998c-rr5wz\" (UID: \"22dbbe5c-6887-419e-b067-7d400a3a99eb\") " pod="openstack/dnsmasq-dns-6c9c9f998c-rr5wz" Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.654572 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qs2l5\" (UniqueName: \"kubernetes.io/projected/22dbbe5c-6887-419e-b067-7d400a3a99eb-kube-api-access-qs2l5\") pod \"dnsmasq-dns-6c9c9f998c-rr5wz\" (UID: \"22dbbe5c-6887-419e-b067-7d400a3a99eb\") " pod="openstack/dnsmasq-dns-6c9c9f998c-rr5wz" Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.654618 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc-credential-keys\") pod \"keystone-bootstrap-csqwc\" (UID: \"3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc\") " pod="openstack/keystone-bootstrap-csqwc" Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.654748 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-csqwc"] Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.654820 5025 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc-scripts\") pod \"keystone-bootstrap-csqwc\" (UID: \"3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc\") " pod="openstack/keystone-bootstrap-csqwc" Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.769143 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc-combined-ca-bundle\") pod \"keystone-bootstrap-csqwc\" (UID: \"3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc\") " pod="openstack/keystone-bootstrap-csqwc" Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.769180 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc-config-data\") pod \"keystone-bootstrap-csqwc\" (UID: \"3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc\") " pod="openstack/keystone-bootstrap-csqwc" Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.769208 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22dbbe5c-6887-419e-b067-7d400a3a99eb-dns-svc\") pod \"dnsmasq-dns-6c9c9f998c-rr5wz\" (UID: \"22dbbe5c-6887-419e-b067-7d400a3a99eb\") " pod="openstack/dnsmasq-dns-6c9c9f998c-rr5wz" Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.769231 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22dbbe5c-6887-419e-b067-7d400a3a99eb-config\") pod \"dnsmasq-dns-6c9c9f998c-rr5wz\" (UID: \"22dbbe5c-6887-419e-b067-7d400a3a99eb\") " pod="openstack/dnsmasq-dns-6c9c9f998c-rr5wz" Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.769274 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/22dbbe5c-6887-419e-b067-7d400a3a99eb-ovsdbserver-nb\") pod \"dnsmasq-dns-6c9c9f998c-rr5wz\" (UID: \"22dbbe5c-6887-419e-b067-7d400a3a99eb\") " pod="openstack/dnsmasq-dns-6c9c9f998c-rr5wz" Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.769294 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkh5f\" (UniqueName: \"kubernetes.io/projected/3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc-kube-api-access-mkh5f\") pod \"keystone-bootstrap-csqwc\" (UID: \"3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc\") " pod="openstack/keystone-bootstrap-csqwc" Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.769321 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/22dbbe5c-6887-419e-b067-7d400a3a99eb-dns-swift-storage-0\") pod \"dnsmasq-dns-6c9c9f998c-rr5wz\" (UID: \"22dbbe5c-6887-419e-b067-7d400a3a99eb\") " pod="openstack/dnsmasq-dns-6c9c9f998c-rr5wz" Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.769351 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc-fernet-keys\") pod \"keystone-bootstrap-csqwc\" (UID: \"3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc\") " pod="openstack/keystone-bootstrap-csqwc" Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.769370 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22dbbe5c-6887-419e-b067-7d400a3a99eb-ovsdbserver-sb\") pod \"dnsmasq-dns-6c9c9f998c-rr5wz\" (UID: \"22dbbe5c-6887-419e-b067-7d400a3a99eb\") " pod="openstack/dnsmasq-dns-6c9c9f998c-rr5wz" Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.769392 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qs2l5\" (UniqueName: 
\"kubernetes.io/projected/22dbbe5c-6887-419e-b067-7d400a3a99eb-kube-api-access-qs2l5\") pod \"dnsmasq-dns-6c9c9f998c-rr5wz\" (UID: \"22dbbe5c-6887-419e-b067-7d400a3a99eb\") " pod="openstack/dnsmasq-dns-6c9c9f998c-rr5wz" Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.769421 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc-credential-keys\") pod \"keystone-bootstrap-csqwc\" (UID: \"3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc\") " pod="openstack/keystone-bootstrap-csqwc" Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.769469 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc-scripts\") pod \"keystone-bootstrap-csqwc\" (UID: \"3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc\") " pod="openstack/keystone-bootstrap-csqwc" Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.772168 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22dbbe5c-6887-419e-b067-7d400a3a99eb-dns-svc\") pod \"dnsmasq-dns-6c9c9f998c-rr5wz\" (UID: \"22dbbe5c-6887-419e-b067-7d400a3a99eb\") " pod="openstack/dnsmasq-dns-6c9c9f998c-rr5wz" Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.773353 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22dbbe5c-6887-419e-b067-7d400a3a99eb-config\") pod \"dnsmasq-dns-6c9c9f998c-rr5wz\" (UID: \"22dbbe5c-6887-419e-b067-7d400a3a99eb\") " pod="openstack/dnsmasq-dns-6c9c9f998c-rr5wz" Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.773908 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22dbbe5c-6887-419e-b067-7d400a3a99eb-ovsdbserver-nb\") pod \"dnsmasq-dns-6c9c9f998c-rr5wz\" (UID: 
\"22dbbe5c-6887-419e-b067-7d400a3a99eb\") " pod="openstack/dnsmasq-dns-6c9c9f998c-rr5wz" Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.774719 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc-scripts\") pod \"keystone-bootstrap-csqwc\" (UID: \"3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc\") " pod="openstack/keystone-bootstrap-csqwc" Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.775332 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/22dbbe5c-6887-419e-b067-7d400a3a99eb-dns-swift-storage-0\") pod \"dnsmasq-dns-6c9c9f998c-rr5wz\" (UID: \"22dbbe5c-6887-419e-b067-7d400a3a99eb\") " pod="openstack/dnsmasq-dns-6c9c9f998c-rr5wz" Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.776820 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22dbbe5c-6887-419e-b067-7d400a3a99eb-ovsdbserver-sb\") pod \"dnsmasq-dns-6c9c9f998c-rr5wz\" (UID: \"22dbbe5c-6887-419e-b067-7d400a3a99eb\") " pod="openstack/dnsmasq-dns-6c9c9f998c-rr5wz" Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.806659 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc-credential-keys\") pod \"keystone-bootstrap-csqwc\" (UID: \"3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc\") " pod="openstack/keystone-bootstrap-csqwc" Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.806991 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc-combined-ca-bundle\") pod \"keystone-bootstrap-csqwc\" (UID: \"3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc\") " pod="openstack/keystone-bootstrap-csqwc" Oct 07 08:34:34 crc 
kubenswrapper[5025]: I1007 08:34:34.807770 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc-config-data\") pod \"keystone-bootstrap-csqwc\" (UID: \"3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc\") " pod="openstack/keystone-bootstrap-csqwc" Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.808096 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc-fernet-keys\") pod \"keystone-bootstrap-csqwc\" (UID: \"3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc\") " pod="openstack/keystone-bootstrap-csqwc" Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.815572 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qs2l5\" (UniqueName: \"kubernetes.io/projected/22dbbe5c-6887-419e-b067-7d400a3a99eb-kube-api-access-qs2l5\") pod \"dnsmasq-dns-6c9c9f998c-rr5wz\" (UID: \"22dbbe5c-6887-419e-b067-7d400a3a99eb\") " pod="openstack/dnsmasq-dns-6c9c9f998c-rr5wz" Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.821159 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkh5f\" (UniqueName: \"kubernetes.io/projected/3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc-kube-api-access-mkh5f\") pod \"keystone-bootstrap-csqwc\" (UID: \"3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc\") " pod="openstack/keystone-bootstrap-csqwc" Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.919378 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-csqwc" Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.936161 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-rr5wz" Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.955778 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-d2nc2"] Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.957293 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-d2nc2" Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.961869 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-hfmd7" Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.962071 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.962185 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.971591 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-rr5wz"] Oct 07 08:34:34 crc kubenswrapper[5025]: I1007 08:34:34.981859 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-d2nc2"] Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.061512 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.063560 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.076761 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rw7rp\" (UniqueName: \"kubernetes.io/projected/5e980fa1-54dd-4f48-9a25-0b6090709927-kube-api-access-rw7rp\") pod \"placement-db-sync-d2nc2\" (UID: \"5e980fa1-54dd-4f48-9a25-0b6090709927\") " pod="openstack/placement-db-sync-d2nc2" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.076799 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e980fa1-54dd-4f48-9a25-0b6090709927-combined-ca-bundle\") pod \"placement-db-sync-d2nc2\" (UID: \"5e980fa1-54dd-4f48-9a25-0b6090709927\") " pod="openstack/placement-db-sync-d2nc2" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.076854 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e980fa1-54dd-4f48-9a25-0b6090709927-scripts\") pod \"placement-db-sync-d2nc2\" (UID: \"5e980fa1-54dd-4f48-9a25-0b6090709927\") " pod="openstack/placement-db-sync-d2nc2" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.076916 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e980fa1-54dd-4f48-9a25-0b6090709927-logs\") pod \"placement-db-sync-d2nc2\" (UID: \"5e980fa1-54dd-4f48-9a25-0b6090709927\") " pod="openstack/placement-db-sync-d2nc2" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.076961 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e980fa1-54dd-4f48-9a25-0b6090709927-config-data\") pod \"placement-db-sync-d2nc2\" (UID: \"5e980fa1-54dd-4f48-9a25-0b6090709927\") " 
pod="openstack/placement-db-sync-d2nc2" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.077338 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.077533 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.083368 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-fvkl9"] Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.084865 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-fvkl9" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.093802 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-fvkl9"] Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.110351 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.179028 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc067ffb-1fa7-45b3-8d1f-5eb0a7586414-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-fvkl9\" (UID: \"cc067ffb-1fa7-45b3-8d1f-5eb0a7586414\") " pod="openstack/dnsmasq-dns-57c957c4ff-fvkl9" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.179439 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c06c9321-437f-4b6a-b205-b56758517e75-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c06c9321-437f-4b6a-b205-b56758517e75\") " pod="openstack/ceilometer-0" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.179489 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-q67tf\" (UniqueName: \"kubernetes.io/projected/c06c9321-437f-4b6a-b205-b56758517e75-kube-api-access-q67tf\") pod \"ceilometer-0\" (UID: \"c06c9321-437f-4b6a-b205-b56758517e75\") " pod="openstack/ceilometer-0" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.179527 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc067ffb-1fa7-45b3-8d1f-5eb0a7586414-config\") pod \"dnsmasq-dns-57c957c4ff-fvkl9\" (UID: \"cc067ffb-1fa7-45b3-8d1f-5eb0a7586414\") " pod="openstack/dnsmasq-dns-57c957c4ff-fvkl9" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.179578 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e980fa1-54dd-4f48-9a25-0b6090709927-logs\") pod \"placement-db-sync-d2nc2\" (UID: \"5e980fa1-54dd-4f48-9a25-0b6090709927\") " pod="openstack/placement-db-sync-d2nc2" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.179607 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c06c9321-437f-4b6a-b205-b56758517e75-log-httpd\") pod \"ceilometer-0\" (UID: \"c06c9321-437f-4b6a-b205-b56758517e75\") " pod="openstack/ceilometer-0" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.179633 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cc067ffb-1fa7-45b3-8d1f-5eb0a7586414-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-fvkl9\" (UID: \"cc067ffb-1fa7-45b3-8d1f-5eb0a7586414\") " pod="openstack/dnsmasq-dns-57c957c4ff-fvkl9" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.180081 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e980fa1-54dd-4f48-9a25-0b6090709927-logs\") pod 
\"placement-db-sync-d2nc2\" (UID: \"5e980fa1-54dd-4f48-9a25-0b6090709927\") " pod="openstack/placement-db-sync-d2nc2" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.180127 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc067ffb-1fa7-45b3-8d1f-5eb0a7586414-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-fvkl9\" (UID: \"cc067ffb-1fa7-45b3-8d1f-5eb0a7586414\") " pod="openstack/dnsmasq-dns-57c957c4ff-fvkl9" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.180209 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c06c9321-437f-4b6a-b205-b56758517e75-run-httpd\") pod \"ceilometer-0\" (UID: \"c06c9321-437f-4b6a-b205-b56758517e75\") " pod="openstack/ceilometer-0" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.180226 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c06c9321-437f-4b6a-b205-b56758517e75-config-data\") pod \"ceilometer-0\" (UID: \"c06c9321-437f-4b6a-b205-b56758517e75\") " pod="openstack/ceilometer-0" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.180242 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc067ffb-1fa7-45b3-8d1f-5eb0a7586414-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-fvkl9\" (UID: \"cc067ffb-1fa7-45b3-8d1f-5eb0a7586414\") " pod="openstack/dnsmasq-dns-57c957c4ff-fvkl9" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.180263 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e980fa1-54dd-4f48-9a25-0b6090709927-config-data\") pod \"placement-db-sync-d2nc2\" (UID: \"5e980fa1-54dd-4f48-9a25-0b6090709927\") " 
pod="openstack/placement-db-sync-d2nc2" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.180279 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c06c9321-437f-4b6a-b205-b56758517e75-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c06c9321-437f-4b6a-b205-b56758517e75\") " pod="openstack/ceilometer-0" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.180330 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rw7rp\" (UniqueName: \"kubernetes.io/projected/5e980fa1-54dd-4f48-9a25-0b6090709927-kube-api-access-rw7rp\") pod \"placement-db-sync-d2nc2\" (UID: \"5e980fa1-54dd-4f48-9a25-0b6090709927\") " pod="openstack/placement-db-sync-d2nc2" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.180352 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e980fa1-54dd-4f48-9a25-0b6090709927-combined-ca-bundle\") pod \"placement-db-sync-d2nc2\" (UID: \"5e980fa1-54dd-4f48-9a25-0b6090709927\") " pod="openstack/placement-db-sync-d2nc2" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.180387 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c06c9321-437f-4b6a-b205-b56758517e75-scripts\") pod \"ceilometer-0\" (UID: \"c06c9321-437f-4b6a-b205-b56758517e75\") " pod="openstack/ceilometer-0" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.180414 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e980fa1-54dd-4f48-9a25-0b6090709927-scripts\") pod \"placement-db-sync-d2nc2\" (UID: \"5e980fa1-54dd-4f48-9a25-0b6090709927\") " pod="openstack/placement-db-sync-d2nc2" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.180439 5025 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7t5q\" (UniqueName: \"kubernetes.io/projected/cc067ffb-1fa7-45b3-8d1f-5eb0a7586414-kube-api-access-w7t5q\") pod \"dnsmasq-dns-57c957c4ff-fvkl9\" (UID: \"cc067ffb-1fa7-45b3-8d1f-5eb0a7586414\") " pod="openstack/dnsmasq-dns-57c957c4ff-fvkl9" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.195219 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e980fa1-54dd-4f48-9a25-0b6090709927-config-data\") pod \"placement-db-sync-d2nc2\" (UID: \"5e980fa1-54dd-4f48-9a25-0b6090709927\") " pod="openstack/placement-db-sync-d2nc2" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.195624 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e980fa1-54dd-4f48-9a25-0b6090709927-combined-ca-bundle\") pod \"placement-db-sync-d2nc2\" (UID: \"5e980fa1-54dd-4f48-9a25-0b6090709927\") " pod="openstack/placement-db-sync-d2nc2" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.210282 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rw7rp\" (UniqueName: \"kubernetes.io/projected/5e980fa1-54dd-4f48-9a25-0b6090709927-kube-api-access-rw7rp\") pod \"placement-db-sync-d2nc2\" (UID: \"5e980fa1-54dd-4f48-9a25-0b6090709927\") " pod="openstack/placement-db-sync-d2nc2" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.218094 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e980fa1-54dd-4f48-9a25-0b6090709927-scripts\") pod \"placement-db-sync-d2nc2\" (UID: \"5e980fa1-54dd-4f48-9a25-0b6090709927\") " pod="openstack/placement-db-sync-d2nc2" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.230607 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-znfc5" 
event={"ID":"7e5caa67-49ae-4d86-bdeb-a51885061759","Type":"ContainerStarted","Data":"a1dbe4661c4752b0d01aacf06a43c6b0dcc8d8d2fd3ccdac4d4e4d0cfdfed836"} Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.230768 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-895cf5cf-znfc5" podUID="7e5caa67-49ae-4d86-bdeb-a51885061759" containerName="dnsmasq-dns" containerID="cri-o://a1dbe4661c4752b0d01aacf06a43c6b0dcc8d8d2fd3ccdac4d4e4d0cfdfed836" gracePeriod=10 Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.230855 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-895cf5cf-znfc5" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.254257 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-895cf5cf-znfc5" podStartSLOduration=3.254223803 podStartE2EDuration="3.254223803s" podCreationTimestamp="2025-10-07 08:34:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:34:35.253174631 +0000 UTC m=+1082.062488775" watchObservedRunningTime="2025-10-07 08:34:35.254223803 +0000 UTC m=+1082.063537947" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.281700 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c06c9321-437f-4b6a-b205-b56758517e75-scripts\") pod \"ceilometer-0\" (UID: \"c06c9321-437f-4b6a-b205-b56758517e75\") " pod="openstack/ceilometer-0" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.281753 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7t5q\" (UniqueName: \"kubernetes.io/projected/cc067ffb-1fa7-45b3-8d1f-5eb0a7586414-kube-api-access-w7t5q\") pod \"dnsmasq-dns-57c957c4ff-fvkl9\" (UID: \"cc067ffb-1fa7-45b3-8d1f-5eb0a7586414\") " pod="openstack/dnsmasq-dns-57c957c4ff-fvkl9" Oct 
07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.281787 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc067ffb-1fa7-45b3-8d1f-5eb0a7586414-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-fvkl9\" (UID: \"cc067ffb-1fa7-45b3-8d1f-5eb0a7586414\") " pod="openstack/dnsmasq-dns-57c957c4ff-fvkl9" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.281809 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c06c9321-437f-4b6a-b205-b56758517e75-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c06c9321-437f-4b6a-b205-b56758517e75\") " pod="openstack/ceilometer-0" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.281831 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q67tf\" (UniqueName: \"kubernetes.io/projected/c06c9321-437f-4b6a-b205-b56758517e75-kube-api-access-q67tf\") pod \"ceilometer-0\" (UID: \"c06c9321-437f-4b6a-b205-b56758517e75\") " pod="openstack/ceilometer-0" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.281850 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc067ffb-1fa7-45b3-8d1f-5eb0a7586414-config\") pod \"dnsmasq-dns-57c957c4ff-fvkl9\" (UID: \"cc067ffb-1fa7-45b3-8d1f-5eb0a7586414\") " pod="openstack/dnsmasq-dns-57c957c4ff-fvkl9" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.281868 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c06c9321-437f-4b6a-b205-b56758517e75-log-httpd\") pod \"ceilometer-0\" (UID: \"c06c9321-437f-4b6a-b205-b56758517e75\") " pod="openstack/ceilometer-0" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.281887 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cc067ffb-1fa7-45b3-8d1f-5eb0a7586414-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-fvkl9\" (UID: \"cc067ffb-1fa7-45b3-8d1f-5eb0a7586414\") " pod="openstack/dnsmasq-dns-57c957c4ff-fvkl9" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.281909 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc067ffb-1fa7-45b3-8d1f-5eb0a7586414-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-fvkl9\" (UID: \"cc067ffb-1fa7-45b3-8d1f-5eb0a7586414\") " pod="openstack/dnsmasq-dns-57c957c4ff-fvkl9" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.281943 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c06c9321-437f-4b6a-b205-b56758517e75-run-httpd\") pod \"ceilometer-0\" (UID: \"c06c9321-437f-4b6a-b205-b56758517e75\") " pod="openstack/ceilometer-0" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.281959 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c06c9321-437f-4b6a-b205-b56758517e75-config-data\") pod \"ceilometer-0\" (UID: \"c06c9321-437f-4b6a-b205-b56758517e75\") " pod="openstack/ceilometer-0" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.281973 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc067ffb-1fa7-45b3-8d1f-5eb0a7586414-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-fvkl9\" (UID: \"cc067ffb-1fa7-45b3-8d1f-5eb0a7586414\") " pod="openstack/dnsmasq-dns-57c957c4ff-fvkl9" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.281990 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c06c9321-437f-4b6a-b205-b56758517e75-sg-core-conf-yaml\") pod \"ceilometer-0\" 
(UID: \"c06c9321-437f-4b6a-b205-b56758517e75\") " pod="openstack/ceilometer-0" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.282677 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c06c9321-437f-4b6a-b205-b56758517e75-log-httpd\") pod \"ceilometer-0\" (UID: \"c06c9321-437f-4b6a-b205-b56758517e75\") " pod="openstack/ceilometer-0" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.284100 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc067ffb-1fa7-45b3-8d1f-5eb0a7586414-config\") pod \"dnsmasq-dns-57c957c4ff-fvkl9\" (UID: \"cc067ffb-1fa7-45b3-8d1f-5eb0a7586414\") " pod="openstack/dnsmasq-dns-57c957c4ff-fvkl9" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.284295 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc067ffb-1fa7-45b3-8d1f-5eb0a7586414-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-fvkl9\" (UID: \"cc067ffb-1fa7-45b3-8d1f-5eb0a7586414\") " pod="openstack/dnsmasq-dns-57c957c4ff-fvkl9" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.284592 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c06c9321-437f-4b6a-b205-b56758517e75-run-httpd\") pod \"ceilometer-0\" (UID: \"c06c9321-437f-4b6a-b205-b56758517e75\") " pod="openstack/ceilometer-0" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.285452 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc067ffb-1fa7-45b3-8d1f-5eb0a7586414-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-fvkl9\" (UID: \"cc067ffb-1fa7-45b3-8d1f-5eb0a7586414\") " pod="openstack/dnsmasq-dns-57c957c4ff-fvkl9" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.286085 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc067ffb-1fa7-45b3-8d1f-5eb0a7586414-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-fvkl9\" (UID: \"cc067ffb-1fa7-45b3-8d1f-5eb0a7586414\") " pod="openstack/dnsmasq-dns-57c957c4ff-fvkl9" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.286488 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cc067ffb-1fa7-45b3-8d1f-5eb0a7586414-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-fvkl9\" (UID: \"cc067ffb-1fa7-45b3-8d1f-5eb0a7586414\") " pod="openstack/dnsmasq-dns-57c957c4ff-fvkl9" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.287096 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c06c9321-437f-4b6a-b205-b56758517e75-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c06c9321-437f-4b6a-b205-b56758517e75\") " pod="openstack/ceilometer-0" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.288252 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c06c9321-437f-4b6a-b205-b56758517e75-scripts\") pod \"ceilometer-0\" (UID: \"c06c9321-437f-4b6a-b205-b56758517e75\") " pod="openstack/ceilometer-0" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.293331 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c06c9321-437f-4b6a-b205-b56758517e75-config-data\") pod \"ceilometer-0\" (UID: \"c06c9321-437f-4b6a-b205-b56758517e75\") " pod="openstack/ceilometer-0" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.301835 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c06c9321-437f-4b6a-b205-b56758517e75-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c06c9321-437f-4b6a-b205-b56758517e75\") " 
pod="openstack/ceilometer-0" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.305231 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q67tf\" (UniqueName: \"kubernetes.io/projected/c06c9321-437f-4b6a-b205-b56758517e75-kube-api-access-q67tf\") pod \"ceilometer-0\" (UID: \"c06c9321-437f-4b6a-b205-b56758517e75\") " pod="openstack/ceilometer-0" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.318649 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7t5q\" (UniqueName: \"kubernetes.io/projected/cc067ffb-1fa7-45b3-8d1f-5eb0a7586414-kube-api-access-w7t5q\") pod \"dnsmasq-dns-57c957c4ff-fvkl9\" (UID: \"cc067ffb-1fa7-45b3-8d1f-5eb0a7586414\") " pod="openstack/dnsmasq-dns-57c957c4ff-fvkl9" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.369504 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-d2nc2" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.409063 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.413057 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-b478-account-create-pg87g"] Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.414139 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b478-account-create-pg87g" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.416766 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.424117 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b478-account-create-pg87g"] Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.459745 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-fvkl9" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.484980 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64kq7\" (UniqueName: \"kubernetes.io/projected/40c6fe82-d552-450a-845b-8c5da8bf5423-kube-api-access-64kq7\") pod \"cinder-b478-account-create-pg87g\" (UID: \"40c6fe82-d552-450a-845b-8c5da8bf5423\") " pod="openstack/cinder-b478-account-create-pg87g" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.517252 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-9cc1-account-create-kh69v"] Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.518923 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-9cc1-account-create-kh69v" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.526976 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.541501 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-9cc1-account-create-kh69v"] Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.586387 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75xb2\" (UniqueName: \"kubernetes.io/projected/d202ac0c-e468-4038-8232-a4e1812d3698-kube-api-access-75xb2\") pod \"barbican-9cc1-account-create-kh69v\" (UID: \"d202ac0c-e468-4038-8232-a4e1812d3698\") " pod="openstack/barbican-9cc1-account-create-kh69v" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.586525 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64kq7\" (UniqueName: \"kubernetes.io/projected/40c6fe82-d552-450a-845b-8c5da8bf5423-kube-api-access-64kq7\") pod \"cinder-b478-account-create-pg87g\" (UID: \"40c6fe82-d552-450a-845b-8c5da8bf5423\") " 
pod="openstack/cinder-b478-account-create-pg87g" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.587994 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-rr5wz"] Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.611480 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64kq7\" (UniqueName: \"kubernetes.io/projected/40c6fe82-d552-450a-845b-8c5da8bf5423-kube-api-access-64kq7\") pod \"cinder-b478-account-create-pg87g\" (UID: \"40c6fe82-d552-450a-845b-8c5da8bf5423\") " pod="openstack/cinder-b478-account-create-pg87g" Oct 07 08:34:35 crc kubenswrapper[5025]: W1007 08:34:35.623260 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22dbbe5c_6887_419e_b067_7d400a3a99eb.slice/crio-4b4035989992b322236b70ffaceed708fefdc8a6116598bb3c2053c5805902d2 WatchSource:0}: Error finding container 4b4035989992b322236b70ffaceed708fefdc8a6116598bb3c2053c5805902d2: Status 404 returned error can't find the container with id 4b4035989992b322236b70ffaceed708fefdc8a6116598bb3c2053c5805902d2 Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.662256 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-csqwc"] Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.672108 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-znfc5" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.688811 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75xb2\" (UniqueName: \"kubernetes.io/projected/d202ac0c-e468-4038-8232-a4e1812d3698-kube-api-access-75xb2\") pod \"barbican-9cc1-account-create-kh69v\" (UID: \"d202ac0c-e468-4038-8232-a4e1812d3698\") " pod="openstack/barbican-9cc1-account-create-kh69v" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.720295 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75xb2\" (UniqueName: \"kubernetes.io/projected/d202ac0c-e468-4038-8232-a4e1812d3698-kube-api-access-75xb2\") pod \"barbican-9cc1-account-create-kh69v\" (UID: \"d202ac0c-e468-4038-8232-a4e1812d3698\") " pod="openstack/barbican-9cc1-account-create-kh69v" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.721898 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-0ee4-account-create-wtml5"] Oct 07 08:34:35 crc kubenswrapper[5025]: E1007 08:34:35.724443 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e5caa67-49ae-4d86-bdeb-a51885061759" containerName="init" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.724590 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e5caa67-49ae-4d86-bdeb-a51885061759" containerName="init" Oct 07 08:34:35 crc kubenswrapper[5025]: E1007 08:34:35.724619 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e5caa67-49ae-4d86-bdeb-a51885061759" containerName="dnsmasq-dns" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.724740 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e5caa67-49ae-4d86-bdeb-a51885061759" containerName="dnsmasq-dns" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.725287 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e5caa67-49ae-4d86-bdeb-a51885061759" 
containerName="dnsmasq-dns" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.726356 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-0ee4-account-create-wtml5" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.729512 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-0ee4-account-create-wtml5"] Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.729746 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.738876 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b478-account-create-pg87g" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.769504 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.772194 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.774479 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-qxgpv" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.774674 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.774799 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.789794 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e5caa67-49ae-4d86-bdeb-a51885061759-dns-svc\") pod \"7e5caa67-49ae-4d86-bdeb-a51885061759\" (UID: \"7e5caa67-49ae-4d86-bdeb-a51885061759\") " Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.789869 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e5caa67-49ae-4d86-bdeb-a51885061759-config\") pod \"7e5caa67-49ae-4d86-bdeb-a51885061759\" (UID: \"7e5caa67-49ae-4d86-bdeb-a51885061759\") " Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.790285 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cw5cq\" (UniqueName: \"kubernetes.io/projected/7e5caa67-49ae-4d86-bdeb-a51885061759-kube-api-access-cw5cq\") pod \"7e5caa67-49ae-4d86-bdeb-a51885061759\" (UID: \"7e5caa67-49ae-4d86-bdeb-a51885061759\") " Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.790371 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e5caa67-49ae-4d86-bdeb-a51885061759-ovsdbserver-sb\") pod \"7e5caa67-49ae-4d86-bdeb-a51885061759\" (UID: \"7e5caa67-49ae-4d86-bdeb-a51885061759\") " Oct 07 08:34:35 
crc kubenswrapper[5025]: I1007 08:34:35.790436 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e5caa67-49ae-4d86-bdeb-a51885061759-ovsdbserver-nb\") pod \"7e5caa67-49ae-4d86-bdeb-a51885061759\" (UID: \"7e5caa67-49ae-4d86-bdeb-a51885061759\") " Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.790494 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e5caa67-49ae-4d86-bdeb-a51885061759-dns-swift-storage-0\") pod \"7e5caa67-49ae-4d86-bdeb-a51885061759\" (UID: \"7e5caa67-49ae-4d86-bdeb-a51885061759\") " Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.790688 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt4rk\" (UniqueName: \"kubernetes.io/projected/6d9ff352-f9b1-4458-841f-08b02c493ab1-kube-api-access-qt4rk\") pod \"neutron-0ee4-account-create-wtml5\" (UID: \"6d9ff352-f9b1-4458-841f-08b02c493ab1\") " pod="openstack/neutron-0ee4-account-create-wtml5" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.800040 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e5caa67-49ae-4d86-bdeb-a51885061759-kube-api-access-cw5cq" (OuterVolumeSpecName: "kube-api-access-cw5cq") pod "7e5caa67-49ae-4d86-bdeb-a51885061759" (UID: "7e5caa67-49ae-4d86-bdeb-a51885061759"). InnerVolumeSpecName "kube-api-access-cw5cq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.827484 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.839213 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-9cc1-account-create-kh69v" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.853467 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.855031 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.860520 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.869144 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.900410 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qt4rk\" (UniqueName: \"kubernetes.io/projected/6d9ff352-f9b1-4458-841f-08b02c493ab1-kube-api-access-qt4rk\") pod \"neutron-0ee4-account-create-wtml5\" (UID: \"6d9ff352-f9b1-4458-841f-08b02c493ab1\") " pod="openstack/neutron-0ee4-account-create-wtml5" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.900733 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85f45f95-1a5a-4463-86ca-3887295eb923-config-data\") pod \"glance-default-internal-api-0\" (UID: \"85f45f95-1a5a-4463-86ca-3887295eb923\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.900821 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/85f45f95-1a5a-4463-86ca-3887295eb923-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"85f45f95-1a5a-4463-86ca-3887295eb923\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:34:35 
crc kubenswrapper[5025]: I1007 08:34:35.900897 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85f45f95-1a5a-4463-86ca-3887295eb923-logs\") pod \"glance-default-internal-api-0\" (UID: \"85f45f95-1a5a-4463-86ca-3887295eb923\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.901026 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"85f45f95-1a5a-4463-86ca-3887295eb923\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.901139 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgz72\" (UniqueName: \"kubernetes.io/projected/85f45f95-1a5a-4463-86ca-3887295eb923-kube-api-access-rgz72\") pod \"glance-default-internal-api-0\" (UID: \"85f45f95-1a5a-4463-86ca-3887295eb923\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.901235 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85f45f95-1a5a-4463-86ca-3887295eb923-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"85f45f95-1a5a-4463-86ca-3887295eb923\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.901344 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85f45f95-1a5a-4463-86ca-3887295eb923-scripts\") pod \"glance-default-internal-api-0\" (UID: \"85f45f95-1a5a-4463-86ca-3887295eb923\") " pod="openstack/glance-default-internal-api-0" Oct 07 
08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.901534 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cw5cq\" (UniqueName: \"kubernetes.io/projected/7e5caa67-49ae-4d86-bdeb-a51885061759-kube-api-access-cw5cq\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:35 crc kubenswrapper[5025]: I1007 08:34:35.919364 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt4rk\" (UniqueName: \"kubernetes.io/projected/6d9ff352-f9b1-4458-841f-08b02c493ab1-kube-api-access-qt4rk\") pod \"neutron-0ee4-account-create-wtml5\" (UID: \"6d9ff352-f9b1-4458-841f-08b02c493ab1\") " pod="openstack/neutron-0ee4-account-create-wtml5" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:35.991527 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e5caa67-49ae-4d86-bdeb-a51885061759-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7e5caa67-49ae-4d86-bdeb-a51885061759" (UID: "7e5caa67-49ae-4d86-bdeb-a51885061759"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.002856 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e5caa67-49ae-4d86-bdeb-a51885061759-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7e5caa67-49ae-4d86-bdeb-a51885061759" (UID: "7e5caa67-49ae-4d86-bdeb-a51885061759"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.003757 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"e65e0b08-da77-408a-b6bd-7f6a4c177463\") " pod="openstack/glance-default-external-api-0" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.003801 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e65e0b08-da77-408a-b6bd-7f6a4c177463-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e65e0b08-da77-408a-b6bd-7f6a4c177463\") " pod="openstack/glance-default-external-api-0" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.003821 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e65e0b08-da77-408a-b6bd-7f6a4c177463-logs\") pod \"glance-default-external-api-0\" (UID: \"e65e0b08-da77-408a-b6bd-7f6a4c177463\") " pod="openstack/glance-default-external-api-0" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.003875 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85f45f95-1a5a-4463-86ca-3887295eb923-config-data\") pod \"glance-default-internal-api-0\" (UID: \"85f45f95-1a5a-4463-86ca-3887295eb923\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.003891 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/85f45f95-1a5a-4463-86ca-3887295eb923-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"85f45f95-1a5a-4463-86ca-3887295eb923\") " 
pod="openstack/glance-default-internal-api-0" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.003908 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85f45f95-1a5a-4463-86ca-3887295eb923-logs\") pod \"glance-default-internal-api-0\" (UID: \"85f45f95-1a5a-4463-86ca-3887295eb923\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.003958 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e65e0b08-da77-408a-b6bd-7f6a4c177463-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e65e0b08-da77-408a-b6bd-7f6a4c177463\") " pod="openstack/glance-default-external-api-0" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.003975 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgnjj\" (UniqueName: \"kubernetes.io/projected/e65e0b08-da77-408a-b6bd-7f6a4c177463-kube-api-access-cgnjj\") pod \"glance-default-external-api-0\" (UID: \"e65e0b08-da77-408a-b6bd-7f6a4c177463\") " pod="openstack/glance-default-external-api-0" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.004005 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e65e0b08-da77-408a-b6bd-7f6a4c177463-scripts\") pod \"glance-default-external-api-0\" (UID: \"e65e0b08-da77-408a-b6bd-7f6a4c177463\") " pod="openstack/glance-default-external-api-0" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.004029 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"85f45f95-1a5a-4463-86ca-3887295eb923\") " 
pod="openstack/glance-default-internal-api-0" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.004049 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e65e0b08-da77-408a-b6bd-7f6a4c177463-config-data\") pod \"glance-default-external-api-0\" (UID: \"e65e0b08-da77-408a-b6bd-7f6a4c177463\") " pod="openstack/glance-default-external-api-0" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.004070 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgz72\" (UniqueName: \"kubernetes.io/projected/85f45f95-1a5a-4463-86ca-3887295eb923-kube-api-access-rgz72\") pod \"glance-default-internal-api-0\" (UID: \"85f45f95-1a5a-4463-86ca-3887295eb923\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.004096 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85f45f95-1a5a-4463-86ca-3887295eb923-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"85f45f95-1a5a-4463-86ca-3887295eb923\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.004116 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85f45f95-1a5a-4463-86ca-3887295eb923-scripts\") pod \"glance-default-internal-api-0\" (UID: \"85f45f95-1a5a-4463-86ca-3887295eb923\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.004161 5025 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e5caa67-49ae-4d86-bdeb-a51885061759-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.004171 5025 reconciler_common.go:293] "Volume 
detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e5caa67-49ae-4d86-bdeb-a51885061759-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.004277 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/85f45f95-1a5a-4463-86ca-3887295eb923-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"85f45f95-1a5a-4463-86ca-3887295eb923\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.004511 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85f45f95-1a5a-4463-86ca-3887295eb923-logs\") pod \"glance-default-internal-api-0\" (UID: \"85f45f95-1a5a-4463-86ca-3887295eb923\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.004773 5025 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"85f45f95-1a5a-4463-86ca-3887295eb923\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.012379 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85f45f95-1a5a-4463-86ca-3887295eb923-config-data\") pod \"glance-default-internal-api-0\" (UID: \"85f45f95-1a5a-4463-86ca-3887295eb923\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.013442 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85f45f95-1a5a-4463-86ca-3887295eb923-scripts\") pod \"glance-default-internal-api-0\" (UID: \"85f45f95-1a5a-4463-86ca-3887295eb923\") " 
pod="openstack/glance-default-internal-api-0" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.020247 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85f45f95-1a5a-4463-86ca-3887295eb923-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"85f45f95-1a5a-4463-86ca-3887295eb923\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.050868 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-0ee4-account-create-wtml5" Oct 07 08:34:36 crc kubenswrapper[5025]: W1007 08:34:36.057112 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc06c9321_437f_4b6a_b205_b56758517e75.slice/crio-f1142fd83b1ecfcf9e7b7332fc1fd031487b8b56449a7df111db8d9351d140d0 WatchSource:0}: Error finding container f1142fd83b1ecfcf9e7b7332fc1fd031487b8b56449a7df111db8d9351d140d0: Status 404 returned error can't find the container with id f1142fd83b1ecfcf9e7b7332fc1fd031487b8b56449a7df111db8d9351d140d0 Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.064024 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgz72\" (UniqueName: \"kubernetes.io/projected/85f45f95-1a5a-4463-86ca-3887295eb923-kube-api-access-rgz72\") pod \"glance-default-internal-api-0\" (UID: \"85f45f95-1a5a-4463-86ca-3887295eb923\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.084128 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e5caa67-49ae-4d86-bdeb-a51885061759-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7e5caa67-49ae-4d86-bdeb-a51885061759" (UID: "7e5caa67-49ae-4d86-bdeb-a51885061759"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.109355 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"e65e0b08-da77-408a-b6bd-7f6a4c177463\") " pod="openstack/glance-default-external-api-0" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.109695 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e65e0b08-da77-408a-b6bd-7f6a4c177463-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e65e0b08-da77-408a-b6bd-7f6a4c177463\") " pod="openstack/glance-default-external-api-0" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.109739 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e65e0b08-da77-408a-b6bd-7f6a4c177463-logs\") pod \"glance-default-external-api-0\" (UID: \"e65e0b08-da77-408a-b6bd-7f6a4c177463\") " pod="openstack/glance-default-external-api-0" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.109771 5025 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"e65e0b08-da77-408a-b6bd-7f6a4c177463\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.109798 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e65e0b08-da77-408a-b6bd-7f6a4c177463-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e65e0b08-da77-408a-b6bd-7f6a4c177463\") " pod="openstack/glance-default-external-api-0" Oct 07 08:34:36 crc 
kubenswrapper[5025]: I1007 08:34:36.109817 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgnjj\" (UniqueName: \"kubernetes.io/projected/e65e0b08-da77-408a-b6bd-7f6a4c177463-kube-api-access-cgnjj\") pod \"glance-default-external-api-0\" (UID: \"e65e0b08-da77-408a-b6bd-7f6a4c177463\") " pod="openstack/glance-default-external-api-0" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.109844 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e65e0b08-da77-408a-b6bd-7f6a4c177463-scripts\") pod \"glance-default-external-api-0\" (UID: \"e65e0b08-da77-408a-b6bd-7f6a4c177463\") " pod="openstack/glance-default-external-api-0" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.109876 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e65e0b08-da77-408a-b6bd-7f6a4c177463-config-data\") pod \"glance-default-external-api-0\" (UID: \"e65e0b08-da77-408a-b6bd-7f6a4c177463\") " pod="openstack/glance-default-external-api-0" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.109933 5025 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e5caa67-49ae-4d86-bdeb-a51885061759-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.110843 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e65e0b08-da77-408a-b6bd-7f6a4c177463-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e65e0b08-da77-408a-b6bd-7f6a4c177463\") " pod="openstack/glance-default-external-api-0" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.114469 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/e65e0b08-da77-408a-b6bd-7f6a4c177463-logs\") pod \"glance-default-external-api-0\" (UID: \"e65e0b08-da77-408a-b6bd-7f6a4c177463\") " pod="openstack/glance-default-external-api-0" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.122107 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e65e0b08-da77-408a-b6bd-7f6a4c177463-config-data\") pod \"glance-default-external-api-0\" (UID: \"e65e0b08-da77-408a-b6bd-7f6a4c177463\") " pod="openstack/glance-default-external-api-0" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.126791 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e65e0b08-da77-408a-b6bd-7f6a4c177463-scripts\") pod \"glance-default-external-api-0\" (UID: \"e65e0b08-da77-408a-b6bd-7f6a4c177463\") " pod="openstack/glance-default-external-api-0" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.126835 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e65e0b08-da77-408a-b6bd-7f6a4c177463-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e65e0b08-da77-408a-b6bd-7f6a4c177463\") " pod="openstack/glance-default-external-api-0" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.132036 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgnjj\" (UniqueName: \"kubernetes.io/projected/e65e0b08-da77-408a-b6bd-7f6a4c177463-kube-api-access-cgnjj\") pod \"glance-default-external-api-0\" (UID: \"e65e0b08-da77-408a-b6bd-7f6a4c177463\") " pod="openstack/glance-default-external-api-0" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.153613 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e5caa67-49ae-4d86-bdeb-a51885061759-config" (OuterVolumeSpecName: "config") pod 
"7e5caa67-49ae-4d86-bdeb-a51885061759" (UID: "7e5caa67-49ae-4d86-bdeb-a51885061759"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.157708 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e5caa67-49ae-4d86-bdeb-a51885061759-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7e5caa67-49ae-4d86-bdeb-a51885061759" (UID: "7e5caa67-49ae-4d86-bdeb-a51885061759"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.167159 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"85f45f95-1a5a-4463-86ca-3887295eb923\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.193265 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"e65e0b08-da77-408a-b6bd-7f6a4c177463\") " pod="openstack/glance-default-external-api-0" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.213517 5025 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e5caa67-49ae-4d86-bdeb-a51885061759-config\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.213560 5025 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e5caa67-49ae-4d86-bdeb-a51885061759-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.285785 5025 generic.go:334] "Generic (PLEG): container finished" 
podID="22dbbe5c-6887-419e-b067-7d400a3a99eb" containerID="ea26cf6644b37e87ef0a9b0dd5788224a973eafac80660442e5003294ff6af8c" exitCode=0 Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.293415 5025 generic.go:334] "Generic (PLEG): container finished" podID="7e5caa67-49ae-4d86-bdeb-a51885061759" containerID="a1dbe4661c4752b0d01aacf06a43c6b0dcc8d8d2fd3ccdac4d4e4d0cfdfed836" exitCode=0 Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.293600 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-znfc5" Oct 07 08:34:36 crc kubenswrapper[5025]: W1007 08:34:36.301845 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40c6fe82_d552_450a_845b_8c5da8bf5423.slice/crio-c775bcd3b452391dd3ebe6592c845452bd4b33f2fbba14d3696ed217bead9913 WatchSource:0}: Error finding container c775bcd3b452391dd3ebe6592c845452bd4b33f2fbba14d3696ed217bead9913: Status 404 returned error can't find the container with id c775bcd3b452391dd3ebe6592c845452bd4b33f2fbba14d3696ed217bead9913 Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.360501 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9c9f998c-rr5wz" event={"ID":"22dbbe5c-6887-419e-b067-7d400a3a99eb","Type":"ContainerDied","Data":"ea26cf6644b37e87ef0a9b0dd5788224a973eafac80660442e5003294ff6af8c"} Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.360559 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9c9f998c-rr5wz" event={"ID":"22dbbe5c-6887-419e-b067-7d400a3a99eb","Type":"ContainerStarted","Data":"4b4035989992b322236b70ffaceed708fefdc8a6116598bb3c2053c5805902d2"} Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.360575 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-znfc5" 
event={"ID":"7e5caa67-49ae-4d86-bdeb-a51885061759","Type":"ContainerDied","Data":"a1dbe4661c4752b0d01aacf06a43c6b0dcc8d8d2fd3ccdac4d4e4d0cfdfed836"} Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.360599 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-znfc5" event={"ID":"7e5caa67-49ae-4d86-bdeb-a51885061759","Type":"ContainerDied","Data":"3d5660b5e4cee3a392a51c11152bf5d65fc2600cf993d25d39c66c4cbac73ce6"} Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.360611 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-d2nc2"] Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.360626 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.360636 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-fvkl9"] Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.360647 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b478-account-create-pg87g"] Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.360662 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-d2nc2" event={"ID":"5e980fa1-54dd-4f48-9a25-0b6090709927","Type":"ContainerStarted","Data":"384fcf60bc3104fb78106106c7d0a7de5d3bb364bb1798f6817264afb3e4877a"} Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.360673 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-fvkl9" event={"ID":"cc067ffb-1fa7-45b3-8d1f-5eb0a7586414","Type":"ContainerStarted","Data":"8b67e72873f529b5099483bb26c162594f13020ffee9698534709cdabd4815c1"} Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.360682 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"c06c9321-437f-4b6a-b205-b56758517e75","Type":"ContainerStarted","Data":"f1142fd83b1ecfcf9e7b7332fc1fd031487b8b56449a7df111db8d9351d140d0"} Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.360710 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-csqwc" event={"ID":"3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc","Type":"ContainerStarted","Data":"8abaa31c888456c2b915c44c4a8ecb0aabf91f427fc7281273aa2aabc247409d"} Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.360709 5025 scope.go:117] "RemoveContainer" containerID="a1dbe4661c4752b0d01aacf06a43c6b0dcc8d8d2fd3ccdac4d4e4d0cfdfed836" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.396936 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.407152 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.413441 5025 scope.go:117] "RemoveContainer" containerID="077395bab325004e6de6a84db8862f22494031f5b78a32a0e6f5ea34d6f6ddcd" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.421247 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-znfc5"] Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.428752 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-znfc5"] Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.444053 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-9cc1-account-create-kh69v"] Oct 07 08:34:36 crc kubenswrapper[5025]: W1007 08:34:36.496026 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd202ac0c_e468_4038_8232_a4e1812d3698.slice/crio-7243e77fccb16648882ce56990af7eba31b7066bbf24ba8a7dc349a06d8bb2b9 
WatchSource:0}: Error finding container 7243e77fccb16648882ce56990af7eba31b7066bbf24ba8a7dc349a06d8bb2b9: Status 404 returned error can't find the container with id 7243e77fccb16648882ce56990af7eba31b7066bbf24ba8a7dc349a06d8bb2b9 Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.556773 5025 scope.go:117] "RemoveContainer" containerID="a1dbe4661c4752b0d01aacf06a43c6b0dcc8d8d2fd3ccdac4d4e4d0cfdfed836" Oct 07 08:34:36 crc kubenswrapper[5025]: E1007 08:34:36.562962 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1dbe4661c4752b0d01aacf06a43c6b0dcc8d8d2fd3ccdac4d4e4d0cfdfed836\": container with ID starting with a1dbe4661c4752b0d01aacf06a43c6b0dcc8d8d2fd3ccdac4d4e4d0cfdfed836 not found: ID does not exist" containerID="a1dbe4661c4752b0d01aacf06a43c6b0dcc8d8d2fd3ccdac4d4e4d0cfdfed836" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.562999 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1dbe4661c4752b0d01aacf06a43c6b0dcc8d8d2fd3ccdac4d4e4d0cfdfed836"} err="failed to get container status \"a1dbe4661c4752b0d01aacf06a43c6b0dcc8d8d2fd3ccdac4d4e4d0cfdfed836\": rpc error: code = NotFound desc = could not find container \"a1dbe4661c4752b0d01aacf06a43c6b0dcc8d8d2fd3ccdac4d4e4d0cfdfed836\": container with ID starting with a1dbe4661c4752b0d01aacf06a43c6b0dcc8d8d2fd3ccdac4d4e4d0cfdfed836 not found: ID does not exist" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.563024 5025 scope.go:117] "RemoveContainer" containerID="077395bab325004e6de6a84db8862f22494031f5b78a32a0e6f5ea34d6f6ddcd" Oct 07 08:34:36 crc kubenswrapper[5025]: E1007 08:34:36.565788 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"077395bab325004e6de6a84db8862f22494031f5b78a32a0e6f5ea34d6f6ddcd\": container with ID starting with 
077395bab325004e6de6a84db8862f22494031f5b78a32a0e6f5ea34d6f6ddcd not found: ID does not exist" containerID="077395bab325004e6de6a84db8862f22494031f5b78a32a0e6f5ea34d6f6ddcd" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.565849 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"077395bab325004e6de6a84db8862f22494031f5b78a32a0e6f5ea34d6f6ddcd"} err="failed to get container status \"077395bab325004e6de6a84db8862f22494031f5b78a32a0e6f5ea34d6f6ddcd\": rpc error: code = NotFound desc = could not find container \"077395bab325004e6de6a84db8862f22494031f5b78a32a0e6f5ea34d6f6ddcd\": container with ID starting with 077395bab325004e6de6a84db8862f22494031f5b78a32a0e6f5ea34d6f6ddcd not found: ID does not exist" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.653836 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-0ee4-account-create-wtml5"] Oct 07 08:34:36 crc kubenswrapper[5025]: W1007 08:34:36.701391 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d9ff352_f9b1_4458_841f_08b02c493ab1.slice/crio-7f677e3f33e8c0f9aa8fec09d6dab17be74502487f47307e689b8b73105f08da WatchSource:0}: Error finding container 7f677e3f33e8c0f9aa8fec09d6dab17be74502487f47307e689b8b73105f08da: Status 404 returned error can't find the container with id 7f677e3f33e8c0f9aa8fec09d6dab17be74502487f47307e689b8b73105f08da Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.703446 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-rr5wz" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.719165 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs2l5\" (UniqueName: \"kubernetes.io/projected/22dbbe5c-6887-419e-b067-7d400a3a99eb-kube-api-access-qs2l5\") pod \"22dbbe5c-6887-419e-b067-7d400a3a99eb\" (UID: \"22dbbe5c-6887-419e-b067-7d400a3a99eb\") " Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.719226 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22dbbe5c-6887-419e-b067-7d400a3a99eb-dns-svc\") pod \"22dbbe5c-6887-419e-b067-7d400a3a99eb\" (UID: \"22dbbe5c-6887-419e-b067-7d400a3a99eb\") " Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.719268 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22dbbe5c-6887-419e-b067-7d400a3a99eb-config\") pod \"22dbbe5c-6887-419e-b067-7d400a3a99eb\" (UID: \"22dbbe5c-6887-419e-b067-7d400a3a99eb\") " Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.719391 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/22dbbe5c-6887-419e-b067-7d400a3a99eb-dns-swift-storage-0\") pod \"22dbbe5c-6887-419e-b067-7d400a3a99eb\" (UID: \"22dbbe5c-6887-419e-b067-7d400a3a99eb\") " Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.719416 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22dbbe5c-6887-419e-b067-7d400a3a99eb-ovsdbserver-nb\") pod \"22dbbe5c-6887-419e-b067-7d400a3a99eb\" (UID: \"22dbbe5c-6887-419e-b067-7d400a3a99eb\") " Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.719442 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/22dbbe5c-6887-419e-b067-7d400a3a99eb-ovsdbserver-sb\") pod \"22dbbe5c-6887-419e-b067-7d400a3a99eb\" (UID: \"22dbbe5c-6887-419e-b067-7d400a3a99eb\") " Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.728961 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22dbbe5c-6887-419e-b067-7d400a3a99eb-kube-api-access-qs2l5" (OuterVolumeSpecName: "kube-api-access-qs2l5") pod "22dbbe5c-6887-419e-b067-7d400a3a99eb" (UID: "22dbbe5c-6887-419e-b067-7d400a3a99eb"). InnerVolumeSpecName "kube-api-access-qs2l5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.764831 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22dbbe5c-6887-419e-b067-7d400a3a99eb-config" (OuterVolumeSpecName: "config") pod "22dbbe5c-6887-419e-b067-7d400a3a99eb" (UID: "22dbbe5c-6887-419e-b067-7d400a3a99eb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.766510 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22dbbe5c-6887-419e-b067-7d400a3a99eb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "22dbbe5c-6887-419e-b067-7d400a3a99eb" (UID: "22dbbe5c-6887-419e-b067-7d400a3a99eb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.773163 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22dbbe5c-6887-419e-b067-7d400a3a99eb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "22dbbe5c-6887-419e-b067-7d400a3a99eb" (UID: "22dbbe5c-6887-419e-b067-7d400a3a99eb"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.774059 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22dbbe5c-6887-419e-b067-7d400a3a99eb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "22dbbe5c-6887-419e-b067-7d400a3a99eb" (UID: "22dbbe5c-6887-419e-b067-7d400a3a99eb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.784617 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22dbbe5c-6887-419e-b067-7d400a3a99eb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "22dbbe5c-6887-419e-b067-7d400a3a99eb" (UID: "22dbbe5c-6887-419e-b067-7d400a3a99eb"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.826227 5025 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/22dbbe5c-6887-419e-b067-7d400a3a99eb-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.826256 5025 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22dbbe5c-6887-419e-b067-7d400a3a99eb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.826265 5025 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22dbbe5c-6887-419e-b067-7d400a3a99eb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.826273 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs2l5\" (UniqueName: \"kubernetes.io/projected/22dbbe5c-6887-419e-b067-7d400a3a99eb-kube-api-access-qs2l5\") on 
node \"crc\" DevicePath \"\"" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.826283 5025 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22dbbe5c-6887-419e-b067-7d400a3a99eb-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:36 crc kubenswrapper[5025]: I1007 08:34:36.826291 5025 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22dbbe5c-6887-419e-b067-7d400a3a99eb-config\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:37 crc kubenswrapper[5025]: I1007 08:34:37.085191 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 08:34:37 crc kubenswrapper[5025]: W1007 08:34:37.099733 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85f45f95_1a5a_4463_86ca_3887295eb923.slice/crio-4e0338f67726b85bff417e3f33597558f7c6c95f2e074a4a5a6711fe2db13853 WatchSource:0}: Error finding container 4e0338f67726b85bff417e3f33597558f7c6c95f2e074a4a5a6711fe2db13853: Status 404 returned error can't find the container with id 4e0338f67726b85bff417e3f33597558f7c6c95f2e074a4a5a6711fe2db13853 Oct 07 08:34:37 crc kubenswrapper[5025]: I1007 08:34:37.187704 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 08:34:37 crc kubenswrapper[5025]: W1007 08:34:37.203664 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode65e0b08_da77_408a_b6bd_7f6a4c177463.slice/crio-b01afe7ab768ca4880635326fdb03e9aec980c7d0c9a8254eb02510bba24b7a3 WatchSource:0}: Error finding container b01afe7ab768ca4880635326fdb03e9aec980c7d0c9a8254eb02510bba24b7a3: Status 404 returned error can't find the container with id b01afe7ab768ca4880635326fdb03e9aec980c7d0c9a8254eb02510bba24b7a3 Oct 07 08:34:37 crc kubenswrapper[5025]: I1007 
08:34:37.331621 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-csqwc" event={"ID":"3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc","Type":"ContainerStarted","Data":"cf4deced3355c59cc056a03b7fc99482fa8455897aeebbd5372e6c7fe67ff136"} Oct 07 08:34:37 crc kubenswrapper[5025]: I1007 08:34:37.338397 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e65e0b08-da77-408a-b6bd-7f6a4c177463","Type":"ContainerStarted","Data":"b01afe7ab768ca4880635326fdb03e9aec980c7d0c9a8254eb02510bba24b7a3"} Oct 07 08:34:37 crc kubenswrapper[5025]: I1007 08:34:37.340825 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"85f45f95-1a5a-4463-86ca-3887295eb923","Type":"ContainerStarted","Data":"4e0338f67726b85bff417e3f33597558f7c6c95f2e074a4a5a6711fe2db13853"} Oct 07 08:34:37 crc kubenswrapper[5025]: I1007 08:34:37.343992 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9c9f998c-rr5wz" event={"ID":"22dbbe5c-6887-419e-b067-7d400a3a99eb","Type":"ContainerDied","Data":"4b4035989992b322236b70ffaceed708fefdc8a6116598bb3c2053c5805902d2"} Oct 07 08:34:37 crc kubenswrapper[5025]: I1007 08:34:37.344025 5025 scope.go:117] "RemoveContainer" containerID="ea26cf6644b37e87ef0a9b0dd5788224a973eafac80660442e5003294ff6af8c" Oct 07 08:34:37 crc kubenswrapper[5025]: I1007 08:34:37.344109 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-rr5wz" Oct 07 08:34:37 crc kubenswrapper[5025]: I1007 08:34:37.349432 5025 generic.go:334] "Generic (PLEG): container finished" podID="cc067ffb-1fa7-45b3-8d1f-5eb0a7586414" containerID="e0f57fc43d1d3acf20ef1de51e86517593f6fd067ebff97c73b8febf6947e38e" exitCode=0 Oct 07 08:34:37 crc kubenswrapper[5025]: I1007 08:34:37.349518 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-fvkl9" event={"ID":"cc067ffb-1fa7-45b3-8d1f-5eb0a7586414","Type":"ContainerDied","Data":"e0f57fc43d1d3acf20ef1de51e86517593f6fd067ebff97c73b8febf6947e38e"} Oct 07 08:34:37 crc kubenswrapper[5025]: I1007 08:34:37.354286 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-csqwc" podStartSLOduration=3.354264653 podStartE2EDuration="3.354264653s" podCreationTimestamp="2025-10-07 08:34:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:34:37.351660372 +0000 UTC m=+1084.160974526" watchObservedRunningTime="2025-10-07 08:34:37.354264653 +0000 UTC m=+1084.163578807" Oct 07 08:34:37 crc kubenswrapper[5025]: I1007 08:34:37.362614 5025 generic.go:334] "Generic (PLEG): container finished" podID="d202ac0c-e468-4038-8232-a4e1812d3698" containerID="68d1e4766470c867bae53bb3d98998e23095f3e94bb406fab13104d2f4ce5885" exitCode=0 Oct 07 08:34:37 crc kubenswrapper[5025]: I1007 08:34:37.362761 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-9cc1-account-create-kh69v" event={"ID":"d202ac0c-e468-4038-8232-a4e1812d3698","Type":"ContainerDied","Data":"68d1e4766470c867bae53bb3d98998e23095f3e94bb406fab13104d2f4ce5885"} Oct 07 08:34:37 crc kubenswrapper[5025]: I1007 08:34:37.362796 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-9cc1-account-create-kh69v" 
event={"ID":"d202ac0c-e468-4038-8232-a4e1812d3698","Type":"ContainerStarted","Data":"7243e77fccb16648882ce56990af7eba31b7066bbf24ba8a7dc349a06d8bb2b9"} Oct 07 08:34:37 crc kubenswrapper[5025]: I1007 08:34:37.367916 5025 generic.go:334] "Generic (PLEG): container finished" podID="6d9ff352-f9b1-4458-841f-08b02c493ab1" containerID="1ace7a9d2660c1d0a09f90632c3c4b67d4bfd75ba7e66b931bc3aae8d78e1fac" exitCode=0 Oct 07 08:34:37 crc kubenswrapper[5025]: I1007 08:34:37.367979 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-0ee4-account-create-wtml5" event={"ID":"6d9ff352-f9b1-4458-841f-08b02c493ab1","Type":"ContainerDied","Data":"1ace7a9d2660c1d0a09f90632c3c4b67d4bfd75ba7e66b931bc3aae8d78e1fac"} Oct 07 08:34:37 crc kubenswrapper[5025]: I1007 08:34:37.368001 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-0ee4-account-create-wtml5" event={"ID":"6d9ff352-f9b1-4458-841f-08b02c493ab1","Type":"ContainerStarted","Data":"7f677e3f33e8c0f9aa8fec09d6dab17be74502487f47307e689b8b73105f08da"} Oct 07 08:34:37 crc kubenswrapper[5025]: I1007 08:34:37.373041 5025 generic.go:334] "Generic (PLEG): container finished" podID="40c6fe82-d552-450a-845b-8c5da8bf5423" containerID="e24b883eadd9e8460cb557c6932fec97c79242558a35c3b28613b86a5ee5e691" exitCode=0 Oct 07 08:34:37 crc kubenswrapper[5025]: I1007 08:34:37.373114 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b478-account-create-pg87g" event={"ID":"40c6fe82-d552-450a-845b-8c5da8bf5423","Type":"ContainerDied","Data":"e24b883eadd9e8460cb557c6932fec97c79242558a35c3b28613b86a5ee5e691"} Oct 07 08:34:37 crc kubenswrapper[5025]: I1007 08:34:37.373167 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b478-account-create-pg87g" event={"ID":"40c6fe82-d552-450a-845b-8c5da8bf5423","Type":"ContainerStarted","Data":"c775bcd3b452391dd3ebe6592c845452bd4b33f2fbba14d3696ed217bead9913"} Oct 07 08:34:37 crc kubenswrapper[5025]: I1007 08:34:37.482614 
5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-rr5wz"] Oct 07 08:34:37 crc kubenswrapper[5025]: I1007 08:34:37.514677 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-rr5wz"] Oct 07 08:34:37 crc kubenswrapper[5025]: I1007 08:34:37.930095 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22dbbe5c-6887-419e-b067-7d400a3a99eb" path="/var/lib/kubelet/pods/22dbbe5c-6887-419e-b067-7d400a3a99eb/volumes" Oct 07 08:34:37 crc kubenswrapper[5025]: I1007 08:34:37.930990 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e5caa67-49ae-4d86-bdeb-a51885061759" path="/var/lib/kubelet/pods/7e5caa67-49ae-4d86-bdeb-a51885061759/volumes" Oct 07 08:34:38 crc kubenswrapper[5025]: I1007 08:34:38.388500 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-fvkl9" event={"ID":"cc067ffb-1fa7-45b3-8d1f-5eb0a7586414","Type":"ContainerStarted","Data":"f15e0a9c1a80e0f9774c23b97eb6a8292f54b9a0ab0ee42c5cab1259d3fb6770"} Oct 07 08:34:38 crc kubenswrapper[5025]: I1007 08:34:38.389237 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57c957c4ff-fvkl9" Oct 07 08:34:38 crc kubenswrapper[5025]: I1007 08:34:38.394506 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e65e0b08-da77-408a-b6bd-7f6a4c177463","Type":"ContainerStarted","Data":"c6b4eb768cafab526a9cf485b555c941d24f46b1fa5553d353ac3bf4c2e9aaa4"} Oct 07 08:34:38 crc kubenswrapper[5025]: I1007 08:34:38.396965 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"85f45f95-1a5a-4463-86ca-3887295eb923","Type":"ContainerStarted","Data":"209fdcfb0eca6e3e34a309118aec4bcdac2e1add73d0bfa1577a5a3066c847da"} Oct 07 08:34:38 crc kubenswrapper[5025]: I1007 08:34:38.413484 5025 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/dnsmasq-dns-57c957c4ff-fvkl9" podStartSLOduration=3.4134630440000002 podStartE2EDuration="3.413463044s" podCreationTimestamp="2025-10-07 08:34:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:34:38.406810867 +0000 UTC m=+1085.216125011" watchObservedRunningTime="2025-10-07 08:34:38.413463044 +0000 UTC m=+1085.222777188" Oct 07 08:34:38 crc kubenswrapper[5025]: I1007 08:34:38.955458 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 08:34:38 crc kubenswrapper[5025]: I1007 08:34:38.994364 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 08:34:39 crc kubenswrapper[5025]: I1007 08:34:39.025313 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 08:34:39 crc kubenswrapper[5025]: I1007 08:34:39.419962 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e65e0b08-da77-408a-b6bd-7f6a4c177463","Type":"ContainerStarted","Data":"36e4e361ce3d8db18b52ca38e04c61c3924df347b68392f61299866ad4104d7f"} Oct 07 08:34:39 crc kubenswrapper[5025]: I1007 08:34:39.431819 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"85f45f95-1a5a-4463-86ca-3887295eb923","Type":"ContainerStarted","Data":"47bd3592f39912d4c133fb4fc61f5143f266b8ead44f5c722bf4eeb4ff1808ea"} Oct 07 08:34:39 crc kubenswrapper[5025]: I1007 08:34:39.447930 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.447913243 podStartE2EDuration="5.447913243s" podCreationTimestamp="2025-10-07 08:34:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-07 08:34:39.441321758 +0000 UTC m=+1086.250635902" watchObservedRunningTime="2025-10-07 08:34:39.447913243 +0000 UTC m=+1086.257227377" Oct 07 08:34:39 crc kubenswrapper[5025]: I1007 08:34:39.460814 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.460799865 podStartE2EDuration="5.460799865s" podCreationTimestamp="2025-10-07 08:34:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:34:39.459244727 +0000 UTC m=+1086.268558871" watchObservedRunningTime="2025-10-07 08:34:39.460799865 +0000 UTC m=+1086.270114009" Oct 07 08:34:39 crc kubenswrapper[5025]: E1007 08:34:39.731718 5025 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13598166_30eb_43b2_8a13_2e2ca72f58e9.slice\": RecentStats: unable to find data in memory cache]" Oct 07 08:34:40 crc kubenswrapper[5025]: I1007 08:34:40.464976 5025 generic.go:334] "Generic (PLEG): container finished" podID="3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc" containerID="cf4deced3355c59cc056a03b7fc99482fa8455897aeebbd5372e6c7fe67ff136" exitCode=0 Oct 07 08:34:40 crc kubenswrapper[5025]: I1007 08:34:40.465186 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="85f45f95-1a5a-4463-86ca-3887295eb923" containerName="glance-log" containerID="cri-o://209fdcfb0eca6e3e34a309118aec4bcdac2e1add73d0bfa1577a5a3066c847da" gracePeriod=30 Oct 07 08:34:40 crc kubenswrapper[5025]: I1007 08:34:40.465284 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-csqwc" event={"ID":"3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc","Type":"ContainerDied","Data":"cf4deced3355c59cc056a03b7fc99482fa8455897aeebbd5372e6c7fe67ff136"} 
Oct 07 08:34:40 crc kubenswrapper[5025]: I1007 08:34:40.465436 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e65e0b08-da77-408a-b6bd-7f6a4c177463" containerName="glance-log" containerID="cri-o://c6b4eb768cafab526a9cf485b555c941d24f46b1fa5553d353ac3bf4c2e9aaa4" gracePeriod=30 Oct 07 08:34:40 crc kubenswrapper[5025]: I1007 08:34:40.465765 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e65e0b08-da77-408a-b6bd-7f6a4c177463" containerName="glance-httpd" containerID="cri-o://36e4e361ce3d8db18b52ca38e04c61c3924df347b68392f61299866ad4104d7f" gracePeriod=30 Oct 07 08:34:40 crc kubenswrapper[5025]: I1007 08:34:40.465797 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="85f45f95-1a5a-4463-86ca-3887295eb923" containerName="glance-httpd" containerID="cri-o://47bd3592f39912d4c133fb4fc61f5143f266b8ead44f5c722bf4eeb4ff1808ea" gracePeriod=30 Oct 07 08:34:40 crc kubenswrapper[5025]: I1007 08:34:40.601164 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-9cc1-account-create-kh69v" Oct 07 08:34:40 crc kubenswrapper[5025]: I1007 08:34:40.702529 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75xb2\" (UniqueName: \"kubernetes.io/projected/d202ac0c-e468-4038-8232-a4e1812d3698-kube-api-access-75xb2\") pod \"d202ac0c-e468-4038-8232-a4e1812d3698\" (UID: \"d202ac0c-e468-4038-8232-a4e1812d3698\") " Oct 07 08:34:40 crc kubenswrapper[5025]: I1007 08:34:40.728215 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d202ac0c-e468-4038-8232-a4e1812d3698-kube-api-access-75xb2" (OuterVolumeSpecName: "kube-api-access-75xb2") pod "d202ac0c-e468-4038-8232-a4e1812d3698" (UID: "d202ac0c-e468-4038-8232-a4e1812d3698"). InnerVolumeSpecName "kube-api-access-75xb2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:34:40 crc kubenswrapper[5025]: I1007 08:34:40.805036 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75xb2\" (UniqueName: \"kubernetes.io/projected/d202ac0c-e468-4038-8232-a4e1812d3698-kube-api-access-75xb2\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:41 crc kubenswrapper[5025]: I1007 08:34:41.151060 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-0ee4-account-create-wtml5" Oct 07 08:34:41 crc kubenswrapper[5025]: I1007 08:34:41.158657 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-b478-account-create-pg87g" Oct 07 08:34:41 crc kubenswrapper[5025]: I1007 08:34:41.216617 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64kq7\" (UniqueName: \"kubernetes.io/projected/40c6fe82-d552-450a-845b-8c5da8bf5423-kube-api-access-64kq7\") pod \"40c6fe82-d552-450a-845b-8c5da8bf5423\" (UID: \"40c6fe82-d552-450a-845b-8c5da8bf5423\") " Oct 07 08:34:41 crc kubenswrapper[5025]: I1007 08:34:41.216702 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qt4rk\" (UniqueName: \"kubernetes.io/projected/6d9ff352-f9b1-4458-841f-08b02c493ab1-kube-api-access-qt4rk\") pod \"6d9ff352-f9b1-4458-841f-08b02c493ab1\" (UID: \"6d9ff352-f9b1-4458-841f-08b02c493ab1\") " Oct 07 08:34:41 crc kubenswrapper[5025]: I1007 08:34:41.224806 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40c6fe82-d552-450a-845b-8c5da8bf5423-kube-api-access-64kq7" (OuterVolumeSpecName: "kube-api-access-64kq7") pod "40c6fe82-d552-450a-845b-8c5da8bf5423" (UID: "40c6fe82-d552-450a-845b-8c5da8bf5423"). InnerVolumeSpecName "kube-api-access-64kq7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:34:41 crc kubenswrapper[5025]: I1007 08:34:41.225019 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d9ff352-f9b1-4458-841f-08b02c493ab1-kube-api-access-qt4rk" (OuterVolumeSpecName: "kube-api-access-qt4rk") pod "6d9ff352-f9b1-4458-841f-08b02c493ab1" (UID: "6d9ff352-f9b1-4458-841f-08b02c493ab1"). InnerVolumeSpecName "kube-api-access-qt4rk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:34:41 crc kubenswrapper[5025]: I1007 08:34:41.319065 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64kq7\" (UniqueName: \"kubernetes.io/projected/40c6fe82-d552-450a-845b-8c5da8bf5423-kube-api-access-64kq7\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:41 crc kubenswrapper[5025]: I1007 08:34:41.319105 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qt4rk\" (UniqueName: \"kubernetes.io/projected/6d9ff352-f9b1-4458-841f-08b02c493ab1-kube-api-access-qt4rk\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:41 crc kubenswrapper[5025]: I1007 08:34:41.475577 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b478-account-create-pg87g" event={"ID":"40c6fe82-d552-450a-845b-8c5da8bf5423","Type":"ContainerDied","Data":"c775bcd3b452391dd3ebe6592c845452bd4b33f2fbba14d3696ed217bead9913"} Oct 07 08:34:41 crc kubenswrapper[5025]: I1007 08:34:41.475613 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c775bcd3b452391dd3ebe6592c845452bd4b33f2fbba14d3696ed217bead9913" Oct 07 08:34:41 crc kubenswrapper[5025]: I1007 08:34:41.475669 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-b478-account-create-pg87g" Oct 07 08:34:41 crc kubenswrapper[5025]: I1007 08:34:41.484760 5025 generic.go:334] "Generic (PLEG): container finished" podID="85f45f95-1a5a-4463-86ca-3887295eb923" containerID="47bd3592f39912d4c133fb4fc61f5143f266b8ead44f5c722bf4eeb4ff1808ea" exitCode=0 Oct 07 08:34:41 crc kubenswrapper[5025]: I1007 08:34:41.484817 5025 generic.go:334] "Generic (PLEG): container finished" podID="85f45f95-1a5a-4463-86ca-3887295eb923" containerID="209fdcfb0eca6e3e34a309118aec4bcdac2e1add73d0bfa1577a5a3066c847da" exitCode=143 Oct 07 08:34:41 crc kubenswrapper[5025]: I1007 08:34:41.484847 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"85f45f95-1a5a-4463-86ca-3887295eb923","Type":"ContainerDied","Data":"47bd3592f39912d4c133fb4fc61f5143f266b8ead44f5c722bf4eeb4ff1808ea"} Oct 07 08:34:41 crc kubenswrapper[5025]: I1007 08:34:41.484911 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"85f45f95-1a5a-4463-86ca-3887295eb923","Type":"ContainerDied","Data":"209fdcfb0eca6e3e34a309118aec4bcdac2e1add73d0bfa1577a5a3066c847da"} Oct 07 08:34:41 crc kubenswrapper[5025]: I1007 08:34:41.486113 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-9cc1-account-create-kh69v" Oct 07 08:34:41 crc kubenswrapper[5025]: I1007 08:34:41.486168 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-9cc1-account-create-kh69v" event={"ID":"d202ac0c-e468-4038-8232-a4e1812d3698","Type":"ContainerDied","Data":"7243e77fccb16648882ce56990af7eba31b7066bbf24ba8a7dc349a06d8bb2b9"} Oct 07 08:34:41 crc kubenswrapper[5025]: I1007 08:34:41.486211 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7243e77fccb16648882ce56990af7eba31b7066bbf24ba8a7dc349a06d8bb2b9" Oct 07 08:34:41 crc kubenswrapper[5025]: I1007 08:34:41.489930 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-0ee4-account-create-wtml5" event={"ID":"6d9ff352-f9b1-4458-841f-08b02c493ab1","Type":"ContainerDied","Data":"7f677e3f33e8c0f9aa8fec09d6dab17be74502487f47307e689b8b73105f08da"} Oct 07 08:34:41 crc kubenswrapper[5025]: I1007 08:34:41.489961 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f677e3f33e8c0f9aa8fec09d6dab17be74502487f47307e689b8b73105f08da" Oct 07 08:34:41 crc kubenswrapper[5025]: I1007 08:34:41.489945 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-0ee4-account-create-wtml5" Oct 07 08:34:41 crc kubenswrapper[5025]: I1007 08:34:41.497711 5025 generic.go:334] "Generic (PLEG): container finished" podID="e65e0b08-da77-408a-b6bd-7f6a4c177463" containerID="36e4e361ce3d8db18b52ca38e04c61c3924df347b68392f61299866ad4104d7f" exitCode=0 Oct 07 08:34:41 crc kubenswrapper[5025]: I1007 08:34:41.497739 5025 generic.go:334] "Generic (PLEG): container finished" podID="e65e0b08-da77-408a-b6bd-7f6a4c177463" containerID="c6b4eb768cafab526a9cf485b555c941d24f46b1fa5553d353ac3bf4c2e9aaa4" exitCode=143 Oct 07 08:34:41 crc kubenswrapper[5025]: I1007 08:34:41.497784 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e65e0b08-da77-408a-b6bd-7f6a4c177463","Type":"ContainerDied","Data":"36e4e361ce3d8db18b52ca38e04c61c3924df347b68392f61299866ad4104d7f"} Oct 07 08:34:41 crc kubenswrapper[5025]: I1007 08:34:41.497853 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e65e0b08-da77-408a-b6bd-7f6a4c177463","Type":"ContainerDied","Data":"c6b4eb768cafab526a9cf485b555c941d24f46b1fa5553d353ac3bf4c2e9aaa4"} Oct 07 08:34:43 crc kubenswrapper[5025]: I1007 08:34:43.701795 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-csqwc" Oct 07 08:34:43 crc kubenswrapper[5025]: I1007 08:34:43.761750 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc-scripts\") pod \"3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc\" (UID: \"3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc\") " Oct 07 08:34:43 crc kubenswrapper[5025]: I1007 08:34:43.762059 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc-combined-ca-bundle\") pod \"3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc\" (UID: \"3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc\") " Oct 07 08:34:43 crc kubenswrapper[5025]: I1007 08:34:43.762247 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc-credential-keys\") pod \"3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc\" (UID: \"3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc\") " Oct 07 08:34:43 crc kubenswrapper[5025]: I1007 08:34:43.762300 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc-fernet-keys\") pod \"3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc\" (UID: \"3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc\") " Oct 07 08:34:43 crc kubenswrapper[5025]: I1007 08:34:43.762336 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkh5f\" (UniqueName: \"kubernetes.io/projected/3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc-kube-api-access-mkh5f\") pod \"3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc\" (UID: \"3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc\") " Oct 07 08:34:43 crc kubenswrapper[5025]: I1007 08:34:43.762397 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc-config-data\") pod \"3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc\" (UID: \"3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc\") " Oct 07 08:34:43 crc kubenswrapper[5025]: I1007 08:34:43.771788 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc" (UID: "3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:34:43 crc kubenswrapper[5025]: I1007 08:34:43.773015 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc-scripts" (OuterVolumeSpecName: "scripts") pod "3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc" (UID: "3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:34:43 crc kubenswrapper[5025]: I1007 08:34:43.776513 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc-kube-api-access-mkh5f" (OuterVolumeSpecName: "kube-api-access-mkh5f") pod "3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc" (UID: "3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc"). InnerVolumeSpecName "kube-api-access-mkh5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:34:43 crc kubenswrapper[5025]: I1007 08:34:43.776681 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc" (UID: "3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:34:43 crc kubenswrapper[5025]: I1007 08:34:43.823471 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc-config-data" (OuterVolumeSpecName: "config-data") pod "3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc" (UID: "3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:34:43 crc kubenswrapper[5025]: I1007 08:34:43.829804 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc" (UID: "3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:34:43 crc kubenswrapper[5025]: I1007 08:34:43.865706 5025 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:43 crc kubenswrapper[5025]: I1007 08:34:43.865747 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:43 crc kubenswrapper[5025]: I1007 08:34:43.865765 5025 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:43 crc kubenswrapper[5025]: I1007 08:34:43.865775 5025 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:43 crc 
kubenswrapper[5025]: I1007 08:34:43.865787 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkh5f\" (UniqueName: \"kubernetes.io/projected/3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc-kube-api-access-mkh5f\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:43 crc kubenswrapper[5025]: I1007 08:34:43.865797 5025 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:43 crc kubenswrapper[5025]: I1007 08:34:43.896985 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 08:34:43 crc kubenswrapper[5025]: I1007 08:34:43.967387 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e65e0b08-da77-408a-b6bd-7f6a4c177463-logs\") pod \"e65e0b08-da77-408a-b6bd-7f6a4c177463\" (UID: \"e65e0b08-da77-408a-b6bd-7f6a4c177463\") " Oct 07 08:34:43 crc kubenswrapper[5025]: I1007 08:34:43.967443 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"e65e0b08-da77-408a-b6bd-7f6a4c177463\" (UID: \"e65e0b08-da77-408a-b6bd-7f6a4c177463\") " Oct 07 08:34:43 crc kubenswrapper[5025]: I1007 08:34:43.967490 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e65e0b08-da77-408a-b6bd-7f6a4c177463-combined-ca-bundle\") pod \"e65e0b08-da77-408a-b6bd-7f6a4c177463\" (UID: \"e65e0b08-da77-408a-b6bd-7f6a4c177463\") " Oct 07 08:34:43 crc kubenswrapper[5025]: I1007 08:34:43.967535 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgnjj\" (UniqueName: \"kubernetes.io/projected/e65e0b08-da77-408a-b6bd-7f6a4c177463-kube-api-access-cgnjj\") pod 
\"e65e0b08-da77-408a-b6bd-7f6a4c177463\" (UID: \"e65e0b08-da77-408a-b6bd-7f6a4c177463\") " Oct 07 08:34:43 crc kubenswrapper[5025]: I1007 08:34:43.967589 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e65e0b08-da77-408a-b6bd-7f6a4c177463-config-data\") pod \"e65e0b08-da77-408a-b6bd-7f6a4c177463\" (UID: \"e65e0b08-da77-408a-b6bd-7f6a4c177463\") " Oct 07 08:34:43 crc kubenswrapper[5025]: I1007 08:34:43.967619 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e65e0b08-da77-408a-b6bd-7f6a4c177463-scripts\") pod \"e65e0b08-da77-408a-b6bd-7f6a4c177463\" (UID: \"e65e0b08-da77-408a-b6bd-7f6a4c177463\") " Oct 07 08:34:43 crc kubenswrapper[5025]: I1007 08:34:43.967733 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e65e0b08-da77-408a-b6bd-7f6a4c177463-httpd-run\") pod \"e65e0b08-da77-408a-b6bd-7f6a4c177463\" (UID: \"e65e0b08-da77-408a-b6bd-7f6a4c177463\") " Oct 07 08:34:43 crc kubenswrapper[5025]: I1007 08:34:43.967956 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e65e0b08-da77-408a-b6bd-7f6a4c177463-logs" (OuterVolumeSpecName: "logs") pod "e65e0b08-da77-408a-b6bd-7f6a4c177463" (UID: "e65e0b08-da77-408a-b6bd-7f6a4c177463"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:34:43 crc kubenswrapper[5025]: I1007 08:34:43.968225 5025 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e65e0b08-da77-408a-b6bd-7f6a4c177463-logs\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:43 crc kubenswrapper[5025]: I1007 08:34:43.970122 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e65e0b08-da77-408a-b6bd-7f6a4c177463-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e65e0b08-da77-408a-b6bd-7f6a4c177463" (UID: "e65e0b08-da77-408a-b6bd-7f6a4c177463"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:34:43 crc kubenswrapper[5025]: I1007 08:34:43.972610 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e65e0b08-da77-408a-b6bd-7f6a4c177463-scripts" (OuterVolumeSpecName: "scripts") pod "e65e0b08-da77-408a-b6bd-7f6a4c177463" (UID: "e65e0b08-da77-408a-b6bd-7f6a4c177463"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:34:43 crc kubenswrapper[5025]: I1007 08:34:43.973766 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "e65e0b08-da77-408a-b6bd-7f6a4c177463" (UID: "e65e0b08-da77-408a-b6bd-7f6a4c177463"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 08:34:43 crc kubenswrapper[5025]: I1007 08:34:43.979337 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e65e0b08-da77-408a-b6bd-7f6a4c177463-kube-api-access-cgnjj" (OuterVolumeSpecName: "kube-api-access-cgnjj") pod "e65e0b08-da77-408a-b6bd-7f6a4c177463" (UID: "e65e0b08-da77-408a-b6bd-7f6a4c177463"). InnerVolumeSpecName "kube-api-access-cgnjj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:34:43 crc kubenswrapper[5025]: I1007 08:34:43.995838 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.010626 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e65e0b08-da77-408a-b6bd-7f6a4c177463-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e65e0b08-da77-408a-b6bd-7f6a4c177463" (UID: "e65e0b08-da77-408a-b6bd-7f6a4c177463"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.041508 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e65e0b08-da77-408a-b6bd-7f6a4c177463-config-data" (OuterVolumeSpecName: "config-data") pod "e65e0b08-da77-408a-b6bd-7f6a4c177463" (UID: "e65e0b08-da77-408a-b6bd-7f6a4c177463"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.069228 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85f45f95-1a5a-4463-86ca-3887295eb923-scripts\") pod \"85f45f95-1a5a-4463-86ca-3887295eb923\" (UID: \"85f45f95-1a5a-4463-86ca-3887295eb923\") " Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.069288 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85f45f95-1a5a-4463-86ca-3887295eb923-combined-ca-bundle\") pod \"85f45f95-1a5a-4463-86ca-3887295eb923\" (UID: \"85f45f95-1a5a-4463-86ca-3887295eb923\") " Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.069334 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"85f45f95-1a5a-4463-86ca-3887295eb923\" (UID: \"85f45f95-1a5a-4463-86ca-3887295eb923\") " Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.069480 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85f45f95-1a5a-4463-86ca-3887295eb923-config-data\") pod \"85f45f95-1a5a-4463-86ca-3887295eb923\" (UID: \"85f45f95-1a5a-4463-86ca-3887295eb923\") " Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.069637 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgz72\" (UniqueName: \"kubernetes.io/projected/85f45f95-1a5a-4463-86ca-3887295eb923-kube-api-access-rgz72\") pod \"85f45f95-1a5a-4463-86ca-3887295eb923\" (UID: \"85f45f95-1a5a-4463-86ca-3887295eb923\") " Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.069704 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/85f45f95-1a5a-4463-86ca-3887295eb923-httpd-run\") pod \"85f45f95-1a5a-4463-86ca-3887295eb923\" (UID: \"85f45f95-1a5a-4463-86ca-3887295eb923\") " Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.069760 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85f45f95-1a5a-4463-86ca-3887295eb923-logs\") pod \"85f45f95-1a5a-4463-86ca-3887295eb923\" (UID: \"85f45f95-1a5a-4463-86ca-3887295eb923\") " Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.070192 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgnjj\" (UniqueName: \"kubernetes.io/projected/e65e0b08-da77-408a-b6bd-7f6a4c177463-kube-api-access-cgnjj\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.070219 5025 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e65e0b08-da77-408a-b6bd-7f6a4c177463-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.070234 5025 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e65e0b08-da77-408a-b6bd-7f6a4c177463-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.070246 5025 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e65e0b08-da77-408a-b6bd-7f6a4c177463-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.070271 5025 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.070284 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e65e0b08-da77-408a-b6bd-7f6a4c177463-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.072136 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85f45f95-1a5a-4463-86ca-3887295eb923-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "85f45f95-1a5a-4463-86ca-3887295eb923" (UID: "85f45f95-1a5a-4463-86ca-3887295eb923"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.072335 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85f45f95-1a5a-4463-86ca-3887295eb923-logs" (OuterVolumeSpecName: "logs") pod "85f45f95-1a5a-4463-86ca-3887295eb923" (UID: "85f45f95-1a5a-4463-86ca-3887295eb923"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.074020 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85f45f95-1a5a-4463-86ca-3887295eb923-kube-api-access-rgz72" (OuterVolumeSpecName: "kube-api-access-rgz72") pod "85f45f95-1a5a-4463-86ca-3887295eb923" (UID: "85f45f95-1a5a-4463-86ca-3887295eb923"). InnerVolumeSpecName "kube-api-access-rgz72". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.076961 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "85f45f95-1a5a-4463-86ca-3887295eb923" (UID: "85f45f95-1a5a-4463-86ca-3887295eb923"). InnerVolumeSpecName "local-storage01-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.087917 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85f45f95-1a5a-4463-86ca-3887295eb923-scripts" (OuterVolumeSpecName: "scripts") pod "85f45f95-1a5a-4463-86ca-3887295eb923" (UID: "85f45f95-1a5a-4463-86ca-3887295eb923"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.101027 5025 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.107681 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85f45f95-1a5a-4463-86ca-3887295eb923-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "85f45f95-1a5a-4463-86ca-3887295eb923" (UID: "85f45f95-1a5a-4463-86ca-3887295eb923"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.126484 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85f45f95-1a5a-4463-86ca-3887295eb923-config-data" (OuterVolumeSpecName: "config-data") pod "85f45f95-1a5a-4463-86ca-3887295eb923" (UID: "85f45f95-1a5a-4463-86ca-3887295eb923"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.171556 5025 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85f45f95-1a5a-4463-86ca-3887295eb923-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.171601 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgz72\" (UniqueName: \"kubernetes.io/projected/85f45f95-1a5a-4463-86ca-3887295eb923-kube-api-access-rgz72\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.171615 5025 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/85f45f95-1a5a-4463-86ca-3887295eb923-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.171625 5025 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85f45f95-1a5a-4463-86ca-3887295eb923-logs\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.171636 5025 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.171645 5025 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85f45f95-1a5a-4463-86ca-3887295eb923-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.171657 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85f45f95-1a5a-4463-86ca-3887295eb923-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.171698 5025 reconciler_common.go:286] "operationExecutor.UnmountDevice 
started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.190397 5025 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.273419 5025 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.532067 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c06c9321-437f-4b6a-b205-b56758517e75","Type":"ContainerStarted","Data":"737e7948bdf87a15a50420e74839cfd300bc5557fa6b1e61dc0e5a1cf2a12956"} Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.533934 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-csqwc" event={"ID":"3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc","Type":"ContainerDied","Data":"8abaa31c888456c2b915c44c4a8ecb0aabf91f427fc7281273aa2aabc247409d"} Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.533970 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8abaa31c888456c2b915c44c4a8ecb0aabf91f427fc7281273aa2aabc247409d" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.534058 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-csqwc" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.539920 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e65e0b08-da77-408a-b6bd-7f6a4c177463","Type":"ContainerDied","Data":"b01afe7ab768ca4880635326fdb03e9aec980c7d0c9a8254eb02510bba24b7a3"} Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.539946 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.539972 5025 scope.go:117] "RemoveContainer" containerID="36e4e361ce3d8db18b52ca38e04c61c3924df347b68392f61299866ad4104d7f" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.542556 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.542733 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"85f45f95-1a5a-4463-86ca-3887295eb923","Type":"ContainerDied","Data":"4e0338f67726b85bff417e3f33597558f7c6c95f2e074a4a5a6711fe2db13853"} Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.547482 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-d2nc2" event={"ID":"5e980fa1-54dd-4f48-9a25-0b6090709927","Type":"ContainerStarted","Data":"44882153253c9e88843124cf0fa7157412b7cdee9616d43b7d45e44435fde32a"} Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.564043 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-d2nc2" podStartSLOduration=3.040815266 podStartE2EDuration="10.564020127s" podCreationTimestamp="2025-10-07 08:34:34 +0000 UTC" firstStartedPulling="2025-10-07 08:34:36.039618954 +0000 UTC m=+1082.848933098" lastFinishedPulling="2025-10-07 08:34:43.562823815 +0000 
UTC m=+1090.372137959" observedRunningTime="2025-10-07 08:34:44.56378088 +0000 UTC m=+1091.373095034" watchObservedRunningTime="2025-10-07 08:34:44.564020127 +0000 UTC m=+1091.373334271" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.588234 5025 scope.go:117] "RemoveContainer" containerID="c6b4eb768cafab526a9cf485b555c941d24f46b1fa5553d353ac3bf4c2e9aaa4" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.610676 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.629814 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.637357 5025 scope.go:117] "RemoveContainer" containerID="47bd3592f39912d4c133fb4fc61f5143f266b8ead44f5c722bf4eeb4ff1808ea" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.639684 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.665104 5025 scope.go:117] "RemoveContainer" containerID="209fdcfb0eca6e3e34a309118aec4bcdac2e1add73d0bfa1577a5a3066c847da" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.668720 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.675639 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 08:34:44 crc kubenswrapper[5025]: E1007 08:34:44.676025 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85f45f95-1a5a-4463-86ca-3887295eb923" containerName="glance-log" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.676041 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="85f45f95-1a5a-4463-86ca-3887295eb923" containerName="glance-log" Oct 07 08:34:44 crc kubenswrapper[5025]: E1007 08:34:44.676059 
5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc" containerName="keystone-bootstrap" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.676065 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc" containerName="keystone-bootstrap" Oct 07 08:34:44 crc kubenswrapper[5025]: E1007 08:34:44.676079 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d9ff352-f9b1-4458-841f-08b02c493ab1" containerName="mariadb-account-create" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.676085 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d9ff352-f9b1-4458-841f-08b02c493ab1" containerName="mariadb-account-create" Oct 07 08:34:44 crc kubenswrapper[5025]: E1007 08:34:44.676094 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22dbbe5c-6887-419e-b067-7d400a3a99eb" containerName="init" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.676099 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="22dbbe5c-6887-419e-b067-7d400a3a99eb" containerName="init" Oct 07 08:34:44 crc kubenswrapper[5025]: E1007 08:34:44.676113 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85f45f95-1a5a-4463-86ca-3887295eb923" containerName="glance-httpd" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.676119 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="85f45f95-1a5a-4463-86ca-3887295eb923" containerName="glance-httpd" Oct 07 08:34:44 crc kubenswrapper[5025]: E1007 08:34:44.676128 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40c6fe82-d552-450a-845b-8c5da8bf5423" containerName="mariadb-account-create" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.676135 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="40c6fe82-d552-450a-845b-8c5da8bf5423" containerName="mariadb-account-create" Oct 07 08:34:44 crc kubenswrapper[5025]: E1007 08:34:44.676145 5025 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d202ac0c-e468-4038-8232-a4e1812d3698" containerName="mariadb-account-create" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.676151 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="d202ac0c-e468-4038-8232-a4e1812d3698" containerName="mariadb-account-create" Oct 07 08:34:44 crc kubenswrapper[5025]: E1007 08:34:44.676161 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e65e0b08-da77-408a-b6bd-7f6a4c177463" containerName="glance-httpd" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.676167 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="e65e0b08-da77-408a-b6bd-7f6a4c177463" containerName="glance-httpd" Oct 07 08:34:44 crc kubenswrapper[5025]: E1007 08:34:44.676178 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e65e0b08-da77-408a-b6bd-7f6a4c177463" containerName="glance-log" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.676184 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="e65e0b08-da77-408a-b6bd-7f6a4c177463" containerName="glance-log" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.676328 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="e65e0b08-da77-408a-b6bd-7f6a4c177463" containerName="glance-log" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.676348 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="85f45f95-1a5a-4463-86ca-3887295eb923" containerName="glance-httpd" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.676369 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="40c6fe82-d552-450a-845b-8c5da8bf5423" containerName="mariadb-account-create" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.676375 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="d202ac0c-e468-4038-8232-a4e1812d3698" containerName="mariadb-account-create" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 
08:34:44.676383 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d9ff352-f9b1-4458-841f-08b02c493ab1" containerName="mariadb-account-create" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.676394 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="85f45f95-1a5a-4463-86ca-3887295eb923" containerName="glance-log" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.676400 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="e65e0b08-da77-408a-b6bd-7f6a4c177463" containerName="glance-httpd" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.676408 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc" containerName="keystone-bootstrap" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.676415 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="22dbbe5c-6887-419e-b067-7d400a3a99eb" containerName="init" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.677281 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.679070 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.679391 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.680314 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.680450 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-qxgpv" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.683149 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.684559 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.686634 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.686695 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.692894 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.714038 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.781953 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/674fb52e-75c9-4ee6-962b-7eec21242d05-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"674fb52e-75c9-4ee6-962b-7eec21242d05\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.782004 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/674fb52e-75c9-4ee6-962b-7eec21242d05-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"674fb52e-75c9-4ee6-962b-7eec21242d05\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.782040 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sgzp\" (UniqueName: \"kubernetes.io/projected/3de05e1a-7b03-4e41-9c47-4a2d135b3879-kube-api-access-9sgzp\") pod \"glance-default-external-api-0\" (UID: \"3de05e1a-7b03-4e41-9c47-4a2d135b3879\") " pod="openstack/glance-default-external-api-0" Oct 
07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.782068 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3de05e1a-7b03-4e41-9c47-4a2d135b3879-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3de05e1a-7b03-4e41-9c47-4a2d135b3879\") " pod="openstack/glance-default-external-api-0" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.782112 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3de05e1a-7b03-4e41-9c47-4a2d135b3879-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3de05e1a-7b03-4e41-9c47-4a2d135b3879\") " pod="openstack/glance-default-external-api-0" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.782137 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/674fb52e-75c9-4ee6-962b-7eec21242d05-logs\") pod \"glance-default-internal-api-0\" (UID: \"674fb52e-75c9-4ee6-962b-7eec21242d05\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.782169 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/674fb52e-75c9-4ee6-962b-7eec21242d05-scripts\") pod \"glance-default-internal-api-0\" (UID: \"674fb52e-75c9-4ee6-962b-7eec21242d05\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.782201 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/674fb52e-75c9-4ee6-962b-7eec21242d05-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"674fb52e-75c9-4ee6-962b-7eec21242d05\") " 
pod="openstack/glance-default-internal-api-0" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.782249 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"674fb52e-75c9-4ee6-962b-7eec21242d05\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.782287 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3de05e1a-7b03-4e41-9c47-4a2d135b3879-config-data\") pod \"glance-default-external-api-0\" (UID: \"3de05e1a-7b03-4e41-9c47-4a2d135b3879\") " pod="openstack/glance-default-external-api-0" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.782311 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3de05e1a-7b03-4e41-9c47-4a2d135b3879-scripts\") pod \"glance-default-external-api-0\" (UID: \"3de05e1a-7b03-4e41-9c47-4a2d135b3879\") " pod="openstack/glance-default-external-api-0" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.782344 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/674fb52e-75c9-4ee6-962b-7eec21242d05-config-data\") pod \"glance-default-internal-api-0\" (UID: \"674fb52e-75c9-4ee6-962b-7eec21242d05\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.782369 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3de05e1a-7b03-4e41-9c47-4a2d135b3879-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3de05e1a-7b03-4e41-9c47-4a2d135b3879\") " 
pod="openstack/glance-default-external-api-0" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.782435 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"3de05e1a-7b03-4e41-9c47-4a2d135b3879\") " pod="openstack/glance-default-external-api-0" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.782479 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgk2g\" (UniqueName: \"kubernetes.io/projected/674fb52e-75c9-4ee6-962b-7eec21242d05-kube-api-access-rgk2g\") pod \"glance-default-internal-api-0\" (UID: \"674fb52e-75c9-4ee6-962b-7eec21242d05\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.782510 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3de05e1a-7b03-4e41-9c47-4a2d135b3879-logs\") pod \"glance-default-external-api-0\" (UID: \"3de05e1a-7b03-4e41-9c47-4a2d135b3879\") " pod="openstack/glance-default-external-api-0" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.867949 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-csqwc"] Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.878435 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-csqwc"] Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.884105 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/674fb52e-75c9-4ee6-962b-7eec21242d05-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"674fb52e-75c9-4ee6-962b-7eec21242d05\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:34:44 crc 
kubenswrapper[5025]: I1007 08:34:44.884180 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"674fb52e-75c9-4ee6-962b-7eec21242d05\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.884222 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3de05e1a-7b03-4e41-9c47-4a2d135b3879-config-data\") pod \"glance-default-external-api-0\" (UID: \"3de05e1a-7b03-4e41-9c47-4a2d135b3879\") " pod="openstack/glance-default-external-api-0" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.884247 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3de05e1a-7b03-4e41-9c47-4a2d135b3879-scripts\") pod \"glance-default-external-api-0\" (UID: \"3de05e1a-7b03-4e41-9c47-4a2d135b3879\") " pod="openstack/glance-default-external-api-0" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.884281 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/674fb52e-75c9-4ee6-962b-7eec21242d05-config-data\") pod \"glance-default-internal-api-0\" (UID: \"674fb52e-75c9-4ee6-962b-7eec21242d05\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.884307 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3de05e1a-7b03-4e41-9c47-4a2d135b3879-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3de05e1a-7b03-4e41-9c47-4a2d135b3879\") " pod="openstack/glance-default-external-api-0" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.884454 5025 operation_generator.go:580] "MountVolume.MountDevice 
succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"674fb52e-75c9-4ee6-962b-7eec21242d05\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.889717 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"3de05e1a-7b03-4e41-9c47-4a2d135b3879\") " pod="openstack/glance-default-external-api-0" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.890037 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgk2g\" (UniqueName: \"kubernetes.io/projected/674fb52e-75c9-4ee6-962b-7eec21242d05-kube-api-access-rgk2g\") pod \"glance-default-internal-api-0\" (UID: \"674fb52e-75c9-4ee6-962b-7eec21242d05\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.890090 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3de05e1a-7b03-4e41-9c47-4a2d135b3879-logs\") pod \"glance-default-external-api-0\" (UID: \"3de05e1a-7b03-4e41-9c47-4a2d135b3879\") " pod="openstack/glance-default-external-api-0" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.890145 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/674fb52e-75c9-4ee6-962b-7eec21242d05-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"674fb52e-75c9-4ee6-962b-7eec21242d05\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.890165 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/674fb52e-75c9-4ee6-962b-7eec21242d05-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"674fb52e-75c9-4ee6-962b-7eec21242d05\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.890190 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sgzp\" (UniqueName: \"kubernetes.io/projected/3de05e1a-7b03-4e41-9c47-4a2d135b3879-kube-api-access-9sgzp\") pod \"glance-default-external-api-0\" (UID: \"3de05e1a-7b03-4e41-9c47-4a2d135b3879\") " pod="openstack/glance-default-external-api-0" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.890218 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3de05e1a-7b03-4e41-9c47-4a2d135b3879-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3de05e1a-7b03-4e41-9c47-4a2d135b3879\") " pod="openstack/glance-default-external-api-0" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.890282 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3de05e1a-7b03-4e41-9c47-4a2d135b3879-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3de05e1a-7b03-4e41-9c47-4a2d135b3879\") " pod="openstack/glance-default-external-api-0" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.890303 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/674fb52e-75c9-4ee6-962b-7eec21242d05-logs\") pod \"glance-default-internal-api-0\" (UID: \"674fb52e-75c9-4ee6-962b-7eec21242d05\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.890342 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/674fb52e-75c9-4ee6-962b-7eec21242d05-scripts\") pod \"glance-default-internal-api-0\" (UID: \"674fb52e-75c9-4ee6-962b-7eec21242d05\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.889953 5025 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"3de05e1a-7b03-4e41-9c47-4a2d135b3879\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.891702 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3de05e1a-7b03-4e41-9c47-4a2d135b3879-scripts\") pod \"glance-default-external-api-0\" (UID: \"3de05e1a-7b03-4e41-9c47-4a2d135b3879\") " pod="openstack/glance-default-external-api-0" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.892041 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3de05e1a-7b03-4e41-9c47-4a2d135b3879-config-data\") pod \"glance-default-external-api-0\" (UID: \"3de05e1a-7b03-4e41-9c47-4a2d135b3879\") " pod="openstack/glance-default-external-api-0" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.892269 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3de05e1a-7b03-4e41-9c47-4a2d135b3879-logs\") pod \"glance-default-external-api-0\" (UID: \"3de05e1a-7b03-4e41-9c47-4a2d135b3879\") " pod="openstack/glance-default-external-api-0" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.892337 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/674fb52e-75c9-4ee6-962b-7eec21242d05-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"674fb52e-75c9-4ee6-962b-7eec21242d05\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.892553 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3de05e1a-7b03-4e41-9c47-4a2d135b3879-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3de05e1a-7b03-4e41-9c47-4a2d135b3879\") " pod="openstack/glance-default-external-api-0" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.893743 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/674fb52e-75c9-4ee6-962b-7eec21242d05-config-data\") pod \"glance-default-internal-api-0\" (UID: \"674fb52e-75c9-4ee6-962b-7eec21242d05\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.893838 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/674fb52e-75c9-4ee6-962b-7eec21242d05-logs\") pod \"glance-default-internal-api-0\" (UID: \"674fb52e-75c9-4ee6-962b-7eec21242d05\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.901188 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3de05e1a-7b03-4e41-9c47-4a2d135b3879-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3de05e1a-7b03-4e41-9c47-4a2d135b3879\") " pod="openstack/glance-default-external-api-0" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.901352 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/674fb52e-75c9-4ee6-962b-7eec21242d05-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"674fb52e-75c9-4ee6-962b-7eec21242d05\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:34:44 crc 
kubenswrapper[5025]: I1007 08:34:44.904058 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/674fb52e-75c9-4ee6-962b-7eec21242d05-scripts\") pod \"glance-default-internal-api-0\" (UID: \"674fb52e-75c9-4ee6-962b-7eec21242d05\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.904906 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/674fb52e-75c9-4ee6-962b-7eec21242d05-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"674fb52e-75c9-4ee6-962b-7eec21242d05\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.912514 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgk2g\" (UniqueName: \"kubernetes.io/projected/674fb52e-75c9-4ee6-962b-7eec21242d05-kube-api-access-rgk2g\") pod \"glance-default-internal-api-0\" (UID: \"674fb52e-75c9-4ee6-962b-7eec21242d05\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.912574 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sgzp\" (UniqueName: \"kubernetes.io/projected/3de05e1a-7b03-4e41-9c47-4a2d135b3879-kube-api-access-9sgzp\") pod \"glance-default-external-api-0\" (UID: \"3de05e1a-7b03-4e41-9c47-4a2d135b3879\") " pod="openstack/glance-default-external-api-0" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.913554 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3de05e1a-7b03-4e41-9c47-4a2d135b3879-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3de05e1a-7b03-4e41-9c47-4a2d135b3879\") " pod="openstack/glance-default-external-api-0" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.955787 5025 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"3de05e1a-7b03-4e41-9c47-4a2d135b3879\") " pod="openstack/glance-default-external-api-0" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.972754 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"674fb52e-75c9-4ee6-962b-7eec21242d05\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.977798 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-wxjnm"] Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.981388 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-wxjnm" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.983959 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.984366 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.985087 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 07 08:34:44 crc kubenswrapper[5025]: I1007 08:34:44.987063 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-nfvg7" Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.008693 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-wxjnm"] Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.022673 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.038737 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.112607 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9-fernet-keys\") pod \"keystone-bootstrap-wxjnm\" (UID: \"d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9\") " pod="openstack/keystone-bootstrap-wxjnm" Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.112663 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9-config-data\") pod \"keystone-bootstrap-wxjnm\" (UID: \"d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9\") " pod="openstack/keystone-bootstrap-wxjnm" Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.112693 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpsjf\" (UniqueName: \"kubernetes.io/projected/d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9-kube-api-access-fpsjf\") pod \"keystone-bootstrap-wxjnm\" (UID: \"d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9\") " pod="openstack/keystone-bootstrap-wxjnm" Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.112715 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9-scripts\") pod \"keystone-bootstrap-wxjnm\" (UID: \"d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9\") " pod="openstack/keystone-bootstrap-wxjnm" Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.112735 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9-combined-ca-bundle\") pod \"keystone-bootstrap-wxjnm\" (UID: \"d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9\") " pod="openstack/keystone-bootstrap-wxjnm" Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.112786 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9-credential-keys\") pod \"keystone-bootstrap-wxjnm\" (UID: \"d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9\") " pod="openstack/keystone-bootstrap-wxjnm" Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.213879 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9-credential-keys\") pod \"keystone-bootstrap-wxjnm\" (UID: \"d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9\") " pod="openstack/keystone-bootstrap-wxjnm" Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.213984 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9-fernet-keys\") pod \"keystone-bootstrap-wxjnm\" (UID: \"d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9\") " pod="openstack/keystone-bootstrap-wxjnm" Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.214021 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9-config-data\") pod \"keystone-bootstrap-wxjnm\" (UID: \"d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9\") " pod="openstack/keystone-bootstrap-wxjnm" Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.214049 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpsjf\" (UniqueName: 
\"kubernetes.io/projected/d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9-kube-api-access-fpsjf\") pod \"keystone-bootstrap-wxjnm\" (UID: \"d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9\") " pod="openstack/keystone-bootstrap-wxjnm" Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.214070 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9-scripts\") pod \"keystone-bootstrap-wxjnm\" (UID: \"d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9\") " pod="openstack/keystone-bootstrap-wxjnm" Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.214097 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9-combined-ca-bundle\") pod \"keystone-bootstrap-wxjnm\" (UID: \"d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9\") " pod="openstack/keystone-bootstrap-wxjnm" Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.218173 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9-fernet-keys\") pod \"keystone-bootstrap-wxjnm\" (UID: \"d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9\") " pod="openstack/keystone-bootstrap-wxjnm" Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.218203 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9-credential-keys\") pod \"keystone-bootstrap-wxjnm\" (UID: \"d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9\") " pod="openstack/keystone-bootstrap-wxjnm" Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.218452 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9-scripts\") pod \"keystone-bootstrap-wxjnm\" (UID: 
\"d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9\") " pod="openstack/keystone-bootstrap-wxjnm" Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.221420 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9-combined-ca-bundle\") pod \"keystone-bootstrap-wxjnm\" (UID: \"d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9\") " pod="openstack/keystone-bootstrap-wxjnm" Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.222894 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9-config-data\") pod \"keystone-bootstrap-wxjnm\" (UID: \"d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9\") " pod="openstack/keystone-bootstrap-wxjnm" Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.235718 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpsjf\" (UniqueName: \"kubernetes.io/projected/d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9-kube-api-access-fpsjf\") pod \"keystone-bootstrap-wxjnm\" (UID: \"d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9\") " pod="openstack/keystone-bootstrap-wxjnm" Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.344369 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-wxjnm" Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.461772 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57c957c4ff-fvkl9" Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.546395 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-cr52x"] Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.546721 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d5b6d6b67-cr52x" podUID="d3ddc8c0-beba-4616-b90e-0c8384cfd4f9" containerName="dnsmasq-dns" containerID="cri-o://7b007c19f3d91f176887dd68c56f6c0fbabbeffd7e055c01c3e3d505f64afb6d" gracePeriod=10 Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.678377 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-2xx6h"] Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.684863 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-2xx6h" Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.690195 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.690491 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-gbght" Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.690750 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.706970 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-2xx6h"] Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.727024 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06e1a5d0-ce28-4495-937a-1aaccbcbb644-combined-ca-bundle\") pod \"cinder-db-sync-2xx6h\" (UID: \"06e1a5d0-ce28-4495-937a-1aaccbcbb644\") " pod="openstack/cinder-db-sync-2xx6h" Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.727152 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06e1a5d0-ce28-4495-937a-1aaccbcbb644-config-data\") pod \"cinder-db-sync-2xx6h\" (UID: \"06e1a5d0-ce28-4495-937a-1aaccbcbb644\") " pod="openstack/cinder-db-sync-2xx6h" Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.727432 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/06e1a5d0-ce28-4495-937a-1aaccbcbb644-db-sync-config-data\") pod \"cinder-db-sync-2xx6h\" (UID: \"06e1a5d0-ce28-4495-937a-1aaccbcbb644\") " pod="openstack/cinder-db-sync-2xx6h" Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.727464 5025 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/06e1a5d0-ce28-4495-937a-1aaccbcbb644-etc-machine-id\") pod \"cinder-db-sync-2xx6h\" (UID: \"06e1a5d0-ce28-4495-937a-1aaccbcbb644\") " pod="openstack/cinder-db-sync-2xx6h" Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.727681 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v78lh\" (UniqueName: \"kubernetes.io/projected/06e1a5d0-ce28-4495-937a-1aaccbcbb644-kube-api-access-v78lh\") pod \"cinder-db-sync-2xx6h\" (UID: \"06e1a5d0-ce28-4495-937a-1aaccbcbb644\") " pod="openstack/cinder-db-sync-2xx6h" Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.727778 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06e1a5d0-ce28-4495-937a-1aaccbcbb644-scripts\") pod \"cinder-db-sync-2xx6h\" (UID: \"06e1a5d0-ce28-4495-937a-1aaccbcbb644\") " pod="openstack/cinder-db-sync-2xx6h" Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.829514 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06e1a5d0-ce28-4495-937a-1aaccbcbb644-scripts\") pod \"cinder-db-sync-2xx6h\" (UID: \"06e1a5d0-ce28-4495-937a-1aaccbcbb644\") " pod="openstack/cinder-db-sync-2xx6h" Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.829653 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06e1a5d0-ce28-4495-937a-1aaccbcbb644-combined-ca-bundle\") pod \"cinder-db-sync-2xx6h\" (UID: \"06e1a5d0-ce28-4495-937a-1aaccbcbb644\") " pod="openstack/cinder-db-sync-2xx6h" Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.829699 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/06e1a5d0-ce28-4495-937a-1aaccbcbb644-config-data\") pod \"cinder-db-sync-2xx6h\" (UID: \"06e1a5d0-ce28-4495-937a-1aaccbcbb644\") " pod="openstack/cinder-db-sync-2xx6h" Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.829736 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/06e1a5d0-ce28-4495-937a-1aaccbcbb644-db-sync-config-data\") pod \"cinder-db-sync-2xx6h\" (UID: \"06e1a5d0-ce28-4495-937a-1aaccbcbb644\") " pod="openstack/cinder-db-sync-2xx6h" Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.829757 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/06e1a5d0-ce28-4495-937a-1aaccbcbb644-etc-machine-id\") pod \"cinder-db-sync-2xx6h\" (UID: \"06e1a5d0-ce28-4495-937a-1aaccbcbb644\") " pod="openstack/cinder-db-sync-2xx6h" Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.829813 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v78lh\" (UniqueName: \"kubernetes.io/projected/06e1a5d0-ce28-4495-937a-1aaccbcbb644-kube-api-access-v78lh\") pod \"cinder-db-sync-2xx6h\" (UID: \"06e1a5d0-ce28-4495-937a-1aaccbcbb644\") " pod="openstack/cinder-db-sync-2xx6h" Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.831075 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/06e1a5d0-ce28-4495-937a-1aaccbcbb644-etc-machine-id\") pod \"cinder-db-sync-2xx6h\" (UID: \"06e1a5d0-ce28-4495-937a-1aaccbcbb644\") " pod="openstack/cinder-db-sync-2xx6h" Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.836185 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/06e1a5d0-ce28-4495-937a-1aaccbcbb644-db-sync-config-data\") pod \"cinder-db-sync-2xx6h\" (UID: 
\"06e1a5d0-ce28-4495-937a-1aaccbcbb644\") " pod="openstack/cinder-db-sync-2xx6h" Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.838902 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06e1a5d0-ce28-4495-937a-1aaccbcbb644-config-data\") pod \"cinder-db-sync-2xx6h\" (UID: \"06e1a5d0-ce28-4495-937a-1aaccbcbb644\") " pod="openstack/cinder-db-sync-2xx6h" Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.842316 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06e1a5d0-ce28-4495-937a-1aaccbcbb644-combined-ca-bundle\") pod \"cinder-db-sync-2xx6h\" (UID: \"06e1a5d0-ce28-4495-937a-1aaccbcbb644\") " pod="openstack/cinder-db-sync-2xx6h" Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.843101 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06e1a5d0-ce28-4495-937a-1aaccbcbb644-scripts\") pod \"cinder-db-sync-2xx6h\" (UID: \"06e1a5d0-ce28-4495-937a-1aaccbcbb644\") " pod="openstack/cinder-db-sync-2xx6h" Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.849873 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v78lh\" (UniqueName: \"kubernetes.io/projected/06e1a5d0-ce28-4495-937a-1aaccbcbb644-kube-api-access-v78lh\") pod \"cinder-db-sync-2xx6h\" (UID: \"06e1a5d0-ce28-4495-937a-1aaccbcbb644\") " pod="openstack/cinder-db-sync-2xx6h" Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.935410 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc" path="/var/lib/kubelet/pods/3ac8a5e2-e01c-4a7b-a98b-0a882a5620dc/volumes" Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.945565 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85f45f95-1a5a-4463-86ca-3887295eb923" 
path="/var/lib/kubelet/pods/85f45f95-1a5a-4463-86ca-3887295eb923/volumes" Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.946643 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e65e0b08-da77-408a-b6bd-7f6a4c177463" path="/var/lib/kubelet/pods/e65e0b08-da77-408a-b6bd-7f6a4c177463/volumes" Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.947592 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-jl8nn"] Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.949221 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-jl8nn" Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.951241 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.954322 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-2nwlt" Oct 07 08:34:45 crc kubenswrapper[5025]: I1007 08:34:45.960601 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-jl8nn"] Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.021440 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-2xx6h" Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.033991 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bb4e8f2b-70ba-49fe-b21e-9343ccaa1a11-db-sync-config-data\") pod \"barbican-db-sync-jl8nn\" (UID: \"bb4e8f2b-70ba-49fe-b21e-9343ccaa1a11\") " pod="openstack/barbican-db-sync-jl8nn" Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.034084 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njh6h\" (UniqueName: \"kubernetes.io/projected/bb4e8f2b-70ba-49fe-b21e-9343ccaa1a11-kube-api-access-njh6h\") pod \"barbican-db-sync-jl8nn\" (UID: \"bb4e8f2b-70ba-49fe-b21e-9343ccaa1a11\") " pod="openstack/barbican-db-sync-jl8nn" Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.034140 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb4e8f2b-70ba-49fe-b21e-9343ccaa1a11-combined-ca-bundle\") pod \"barbican-db-sync-jl8nn\" (UID: \"bb4e8f2b-70ba-49fe-b21e-9343ccaa1a11\") " pod="openstack/barbican-db-sync-jl8nn" Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.055622 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-z4bnz"] Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.056695 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-z4bnz" Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.059882 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.060022 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.059905 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-wzl6k" Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.085212 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-z4bnz"] Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.105268 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-wxjnm"] Oct 07 08:34:46 crc kubenswrapper[5025]: W1007 08:34:46.119355 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4b8747a_af4d_42a8_8caf_5ea8b7ff06d9.slice/crio-c6c9dd183f39199398f85472e202cbc34537c5b1890c3d1bf1dbfbbe78209ced WatchSource:0}: Error finding container c6c9dd183f39199398f85472e202cbc34537c5b1890c3d1bf1dbfbbe78209ced: Status 404 returned error can't find the container with id c6c9dd183f39199398f85472e202cbc34537c5b1890c3d1bf1dbfbbe78209ced Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.135965 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bbaaccc5-06cb-40e9-9d17-bb9f11b6f8a5-config\") pod \"neutron-db-sync-z4bnz\" (UID: \"bbaaccc5-06cb-40e9-9d17-bb9f11b6f8a5\") " pod="openstack/neutron-db-sync-z4bnz" Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.136045 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/bb4e8f2b-70ba-49fe-b21e-9343ccaa1a11-db-sync-config-data\") pod \"barbican-db-sync-jl8nn\" (UID: \"bb4e8f2b-70ba-49fe-b21e-9343ccaa1a11\") " pod="openstack/barbican-db-sync-jl8nn" Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.136112 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njh6h\" (UniqueName: \"kubernetes.io/projected/bb4e8f2b-70ba-49fe-b21e-9343ccaa1a11-kube-api-access-njh6h\") pod \"barbican-db-sync-jl8nn\" (UID: \"bb4e8f2b-70ba-49fe-b21e-9343ccaa1a11\") " pod="openstack/barbican-db-sync-jl8nn" Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.136152 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbaaccc5-06cb-40e9-9d17-bb9f11b6f8a5-combined-ca-bundle\") pod \"neutron-db-sync-z4bnz\" (UID: \"bbaaccc5-06cb-40e9-9d17-bb9f11b6f8a5\") " pod="openstack/neutron-db-sync-z4bnz" Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.136184 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k75bv\" (UniqueName: \"kubernetes.io/projected/bbaaccc5-06cb-40e9-9d17-bb9f11b6f8a5-kube-api-access-k75bv\") pod \"neutron-db-sync-z4bnz\" (UID: \"bbaaccc5-06cb-40e9-9d17-bb9f11b6f8a5\") " pod="openstack/neutron-db-sync-z4bnz" Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.136207 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb4e8f2b-70ba-49fe-b21e-9343ccaa1a11-combined-ca-bundle\") pod \"barbican-db-sync-jl8nn\" (UID: \"bb4e8f2b-70ba-49fe-b21e-9343ccaa1a11\") " pod="openstack/barbican-db-sync-jl8nn" Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.140913 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bb4e8f2b-70ba-49fe-b21e-9343ccaa1a11-combined-ca-bundle\") pod \"barbican-db-sync-jl8nn\" (UID: \"bb4e8f2b-70ba-49fe-b21e-9343ccaa1a11\") " pod="openstack/barbican-db-sync-jl8nn" Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.140934 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bb4e8f2b-70ba-49fe-b21e-9343ccaa1a11-db-sync-config-data\") pod \"barbican-db-sync-jl8nn\" (UID: \"bb4e8f2b-70ba-49fe-b21e-9343ccaa1a11\") " pod="openstack/barbican-db-sync-jl8nn" Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.158416 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njh6h\" (UniqueName: \"kubernetes.io/projected/bb4e8f2b-70ba-49fe-b21e-9343ccaa1a11-kube-api-access-njh6h\") pod \"barbican-db-sync-jl8nn\" (UID: \"bb4e8f2b-70ba-49fe-b21e-9343ccaa1a11\") " pod="openstack/barbican-db-sync-jl8nn" Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.190825 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 08:34:46 crc kubenswrapper[5025]: W1007 08:34:46.236454 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod674fb52e_75c9_4ee6_962b_7eec21242d05.slice/crio-c2b1daced2f6f119257331765141a3eafb847d7915d6d3fe42d89ac0e7215a84 WatchSource:0}: Error finding container c2b1daced2f6f119257331765141a3eafb847d7915d6d3fe42d89ac0e7215a84: Status 404 returned error can't find the container with id c2b1daced2f6f119257331765141a3eafb847d7915d6d3fe42d89ac0e7215a84 Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.237443 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bbaaccc5-06cb-40e9-9d17-bb9f11b6f8a5-config\") pod \"neutron-db-sync-z4bnz\" (UID: \"bbaaccc5-06cb-40e9-9d17-bb9f11b6f8a5\") " 
pod="openstack/neutron-db-sync-z4bnz" Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.237587 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbaaccc5-06cb-40e9-9d17-bb9f11b6f8a5-combined-ca-bundle\") pod \"neutron-db-sync-z4bnz\" (UID: \"bbaaccc5-06cb-40e9-9d17-bb9f11b6f8a5\") " pod="openstack/neutron-db-sync-z4bnz" Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.237990 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k75bv\" (UniqueName: \"kubernetes.io/projected/bbaaccc5-06cb-40e9-9d17-bb9f11b6f8a5-kube-api-access-k75bv\") pod \"neutron-db-sync-z4bnz\" (UID: \"bbaaccc5-06cb-40e9-9d17-bb9f11b6f8a5\") " pod="openstack/neutron-db-sync-z4bnz" Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.242646 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbaaccc5-06cb-40e9-9d17-bb9f11b6f8a5-combined-ca-bundle\") pod \"neutron-db-sync-z4bnz\" (UID: \"bbaaccc5-06cb-40e9-9d17-bb9f11b6f8a5\") " pod="openstack/neutron-db-sync-z4bnz" Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.246353 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/bbaaccc5-06cb-40e9-9d17-bb9f11b6f8a5-config\") pod \"neutron-db-sync-z4bnz\" (UID: \"bbaaccc5-06cb-40e9-9d17-bb9f11b6f8a5\") " pod="openstack/neutron-db-sync-z4bnz" Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.256810 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k75bv\" (UniqueName: \"kubernetes.io/projected/bbaaccc5-06cb-40e9-9d17-bb9f11b6f8a5-kube-api-access-k75bv\") pod \"neutron-db-sync-z4bnz\" (UID: \"bbaaccc5-06cb-40e9-9d17-bb9f11b6f8a5\") " pod="openstack/neutron-db-sync-z4bnz" Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.279211 5025 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-jl8nn" Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.344928 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.381482 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-cr52x" Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.403015 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-z4bnz" Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.440974 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d3ddc8c0-beba-4616-b90e-0c8384cfd4f9-ovsdbserver-nb\") pod \"d3ddc8c0-beba-4616-b90e-0c8384cfd4f9\" (UID: \"d3ddc8c0-beba-4616-b90e-0c8384cfd4f9\") " Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.443732 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3ddc8c0-beba-4616-b90e-0c8384cfd4f9-config\") pod \"d3ddc8c0-beba-4616-b90e-0c8384cfd4f9\" (UID: \"d3ddc8c0-beba-4616-b90e-0c8384cfd4f9\") " Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.443834 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d3ddc8c0-beba-4616-b90e-0c8384cfd4f9-dns-swift-storage-0\") pod \"d3ddc8c0-beba-4616-b90e-0c8384cfd4f9\" (UID: \"d3ddc8c0-beba-4616-b90e-0c8384cfd4f9\") " Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.444005 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d3ddc8c0-beba-4616-b90e-0c8384cfd4f9-ovsdbserver-sb\") pod \"d3ddc8c0-beba-4616-b90e-0c8384cfd4f9\" (UID: 
\"d3ddc8c0-beba-4616-b90e-0c8384cfd4f9\") " Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.444041 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d3ddc8c0-beba-4616-b90e-0c8384cfd4f9-dns-svc\") pod \"d3ddc8c0-beba-4616-b90e-0c8384cfd4f9\" (UID: \"d3ddc8c0-beba-4616-b90e-0c8384cfd4f9\") " Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.444097 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfhnm\" (UniqueName: \"kubernetes.io/projected/d3ddc8c0-beba-4616-b90e-0c8384cfd4f9-kube-api-access-bfhnm\") pod \"d3ddc8c0-beba-4616-b90e-0c8384cfd4f9\" (UID: \"d3ddc8c0-beba-4616-b90e-0c8384cfd4f9\") " Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.457686 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3ddc8c0-beba-4616-b90e-0c8384cfd4f9-kube-api-access-bfhnm" (OuterVolumeSpecName: "kube-api-access-bfhnm") pod "d3ddc8c0-beba-4616-b90e-0c8384cfd4f9" (UID: "d3ddc8c0-beba-4616-b90e-0c8384cfd4f9"). InnerVolumeSpecName "kube-api-access-bfhnm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.545559 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3ddc8c0-beba-4616-b90e-0c8384cfd4f9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d3ddc8c0-beba-4616-b90e-0c8384cfd4f9" (UID: "d3ddc8c0-beba-4616-b90e-0c8384cfd4f9"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.549173 5025 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d3ddc8c0-beba-4616-b90e-0c8384cfd4f9-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.549199 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfhnm\" (UniqueName: \"kubernetes.io/projected/d3ddc8c0-beba-4616-b90e-0c8384cfd4f9-kube-api-access-bfhnm\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.574482 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3ddc8c0-beba-4616-b90e-0c8384cfd4f9-config" (OuterVolumeSpecName: "config") pod "d3ddc8c0-beba-4616-b90e-0c8384cfd4f9" (UID: "d3ddc8c0-beba-4616-b90e-0c8384cfd4f9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.609365 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3ddc8c0-beba-4616-b90e-0c8384cfd4f9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d3ddc8c0-beba-4616-b90e-0c8384cfd4f9" (UID: "d3ddc8c0-beba-4616-b90e-0c8384cfd4f9"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.610352 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3ddc8c0-beba-4616-b90e-0c8384cfd4f9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d3ddc8c0-beba-4616-b90e-0c8384cfd4f9" (UID: "d3ddc8c0-beba-4616-b90e-0c8384cfd4f9"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.651634 5025 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d3ddc8c0-beba-4616-b90e-0c8384cfd4f9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.651666 5025 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3ddc8c0-beba-4616-b90e-0c8384cfd4f9-config\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.651675 5025 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d3ddc8c0-beba-4616-b90e-0c8384cfd4f9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.660707 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-2xx6h"] Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.663848 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3ddc8c0-beba-4616-b90e-0c8384cfd4f9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d3ddc8c0-beba-4616-b90e-0c8384cfd4f9" (UID: "d3ddc8c0-beba-4616-b90e-0c8384cfd4f9"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.678869 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wxjnm" event={"ID":"d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9","Type":"ContainerStarted","Data":"be9144fe694a0b08748194b5643337f23d2434df55bb180b2c8ade9c750a985a"} Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.678910 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wxjnm" event={"ID":"d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9","Type":"ContainerStarted","Data":"c6c9dd183f39199398f85472e202cbc34537c5b1890c3d1bf1dbfbbe78209ced"} Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.685241 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3de05e1a-7b03-4e41-9c47-4a2d135b3879","Type":"ContainerStarted","Data":"42c59bf8b6aa1fce84b828935f1a920b58f292ab778f28b80de66f8efb513604"} Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.688720 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"674fb52e-75c9-4ee6-962b-7eec21242d05","Type":"ContainerStarted","Data":"c2b1daced2f6f119257331765141a3eafb847d7915d6d3fe42d89ac0e7215a84"} Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.707805 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-wxjnm" podStartSLOduration=2.707785371 podStartE2EDuration="2.707785371s" podCreationTimestamp="2025-10-07 08:34:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:34:46.70038387 +0000 UTC m=+1093.509698014" watchObservedRunningTime="2025-10-07 08:34:46.707785371 +0000 UTC m=+1093.517099505" Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.745721 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"c06c9321-437f-4b6a-b205-b56758517e75","Type":"ContainerStarted","Data":"192a322439949335b4f9eade6df4da3ee83390e6d965ca42196d09c64b3d7944"} Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.752967 5025 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d3ddc8c0-beba-4616-b90e-0c8384cfd4f9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.760619 5025 generic.go:334] "Generic (PLEG): container finished" podID="d3ddc8c0-beba-4616-b90e-0c8384cfd4f9" containerID="7b007c19f3d91f176887dd68c56f6c0fbabbeffd7e055c01c3e3d505f64afb6d" exitCode=0 Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.760656 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-cr52x" event={"ID":"d3ddc8c0-beba-4616-b90e-0c8384cfd4f9","Type":"ContainerDied","Data":"7b007c19f3d91f176887dd68c56f6c0fbabbeffd7e055c01c3e3d505f64afb6d"} Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.760678 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-cr52x" event={"ID":"d3ddc8c0-beba-4616-b90e-0c8384cfd4f9","Type":"ContainerDied","Data":"59c4121756e99bb8de5641df07c11411ad87651993857002a1cc14ad9e75efc3"} Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.760695 5025 scope.go:117] "RemoveContainer" containerID="7b007c19f3d91f176887dd68c56f6c0fbabbeffd7e055c01c3e3d505f64afb6d" Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.760811 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-cr52x" Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.828872 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-cr52x"] Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.833861 5025 scope.go:117] "RemoveContainer" containerID="2fd0b0da34b6ea8804c8550868da4cc1c22de8217c7d20286a63e7f70d44ddd8" Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.842075 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-cr52x"] Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.867878 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-jl8nn"] Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.910993 5025 scope.go:117] "RemoveContainer" containerID="7b007c19f3d91f176887dd68c56f6c0fbabbeffd7e055c01c3e3d505f64afb6d" Oct 07 08:34:46 crc kubenswrapper[5025]: E1007 08:34:46.914895 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b007c19f3d91f176887dd68c56f6c0fbabbeffd7e055c01c3e3d505f64afb6d\": container with ID starting with 7b007c19f3d91f176887dd68c56f6c0fbabbeffd7e055c01c3e3d505f64afb6d not found: ID does not exist" containerID="7b007c19f3d91f176887dd68c56f6c0fbabbeffd7e055c01c3e3d505f64afb6d" Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.914948 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b007c19f3d91f176887dd68c56f6c0fbabbeffd7e055c01c3e3d505f64afb6d"} err="failed to get container status \"7b007c19f3d91f176887dd68c56f6c0fbabbeffd7e055c01c3e3d505f64afb6d\": rpc error: code = NotFound desc = could not find container \"7b007c19f3d91f176887dd68c56f6c0fbabbeffd7e055c01c3e3d505f64afb6d\": container with ID starting with 7b007c19f3d91f176887dd68c56f6c0fbabbeffd7e055c01c3e3d505f64afb6d not found: ID does not exist" Oct 07 08:34:46 crc 
kubenswrapper[5025]: I1007 08:34:46.914982 5025 scope.go:117] "RemoveContainer" containerID="2fd0b0da34b6ea8804c8550868da4cc1c22de8217c7d20286a63e7f70d44ddd8" Oct 07 08:34:46 crc kubenswrapper[5025]: E1007 08:34:46.915394 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fd0b0da34b6ea8804c8550868da4cc1c22de8217c7d20286a63e7f70d44ddd8\": container with ID starting with 2fd0b0da34b6ea8804c8550868da4cc1c22de8217c7d20286a63e7f70d44ddd8 not found: ID does not exist" containerID="2fd0b0da34b6ea8804c8550868da4cc1c22de8217c7d20286a63e7f70d44ddd8" Oct 07 08:34:46 crc kubenswrapper[5025]: I1007 08:34:46.915471 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fd0b0da34b6ea8804c8550868da4cc1c22de8217c7d20286a63e7f70d44ddd8"} err="failed to get container status \"2fd0b0da34b6ea8804c8550868da4cc1c22de8217c7d20286a63e7f70d44ddd8\": rpc error: code = NotFound desc = could not find container \"2fd0b0da34b6ea8804c8550868da4cc1c22de8217c7d20286a63e7f70d44ddd8\": container with ID starting with 2fd0b0da34b6ea8804c8550868da4cc1c22de8217c7d20286a63e7f70d44ddd8 not found: ID does not exist" Oct 07 08:34:47 crc kubenswrapper[5025]: I1007 08:34:47.083789 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-z4bnz"] Oct 07 08:34:47 crc kubenswrapper[5025]: I1007 08:34:47.783635 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"674fb52e-75c9-4ee6-962b-7eec21242d05","Type":"ContainerStarted","Data":"7b804e5864cae65ced24516a277cd7996e7bedb1914c6e4624a7ffbd3aedf8ad"} Oct 07 08:34:47 crc kubenswrapper[5025]: I1007 08:34:47.794236 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-z4bnz" event={"ID":"bbaaccc5-06cb-40e9-9d17-bb9f11b6f8a5","Type":"ContainerStarted","Data":"c660c36e2bb2d6b75ff1117399aae70f2a2e059a387005f729f8ffcb0f430141"} 
Oct 07 08:34:47 crc kubenswrapper[5025]: I1007 08:34:47.794312 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-z4bnz" event={"ID":"bbaaccc5-06cb-40e9-9d17-bb9f11b6f8a5","Type":"ContainerStarted","Data":"696408c2b4efe3a779760cdb734a5eba38d738b9317d7907ab42d9df576b31e4"} Oct 07 08:34:47 crc kubenswrapper[5025]: I1007 08:34:47.800129 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-2xx6h" event={"ID":"06e1a5d0-ce28-4495-937a-1aaccbcbb644","Type":"ContainerStarted","Data":"2f515b5e127983cd6e83a60bd7de67fd525b65f4f903dfd2529176c83e5a65d1"} Oct 07 08:34:47 crc kubenswrapper[5025]: I1007 08:34:47.814495 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-z4bnz" podStartSLOduration=1.814474933 podStartE2EDuration="1.814474933s" podCreationTimestamp="2025-10-07 08:34:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:34:47.809764166 +0000 UTC m=+1094.619078310" watchObservedRunningTime="2025-10-07 08:34:47.814474933 +0000 UTC m=+1094.623789077" Oct 07 08:34:47 crc kubenswrapper[5025]: I1007 08:34:47.817776 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-jl8nn" event={"ID":"bb4e8f2b-70ba-49fe-b21e-9343ccaa1a11","Type":"ContainerStarted","Data":"4dbb7df21a51775739fea8bc31dc5ef7049d2a6be485413873749c23d623d6e8"} Oct 07 08:34:47 crc kubenswrapper[5025]: I1007 08:34:47.821717 5025 generic.go:334] "Generic (PLEG): container finished" podID="5e980fa1-54dd-4f48-9a25-0b6090709927" containerID="44882153253c9e88843124cf0fa7157412b7cdee9616d43b7d45e44435fde32a" exitCode=0 Oct 07 08:34:47 crc kubenswrapper[5025]: I1007 08:34:47.821882 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-d2nc2" 
event={"ID":"5e980fa1-54dd-4f48-9a25-0b6090709927","Type":"ContainerDied","Data":"44882153253c9e88843124cf0fa7157412b7cdee9616d43b7d45e44435fde32a"} Oct 07 08:34:47 crc kubenswrapper[5025]: I1007 08:34:47.826045 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3de05e1a-7b03-4e41-9c47-4a2d135b3879","Type":"ContainerStarted","Data":"381e8f7be4247a42fa5605b246ada82812656fd0901e3a380d230417aae73fd3"} Oct 07 08:34:47 crc kubenswrapper[5025]: I1007 08:34:47.927258 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3ddc8c0-beba-4616-b90e-0c8384cfd4f9" path="/var/lib/kubelet/pods/d3ddc8c0-beba-4616-b90e-0c8384cfd4f9/volumes" Oct 07 08:34:48 crc kubenswrapper[5025]: I1007 08:34:48.835718 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3de05e1a-7b03-4e41-9c47-4a2d135b3879","Type":"ContainerStarted","Data":"ac9190177a046f25b19b1e2b7c4d077ec8a723e5d8034bb239d27ef5b5cdbc8f"} Oct 07 08:34:48 crc kubenswrapper[5025]: I1007 08:34:48.840984 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"674fb52e-75c9-4ee6-962b-7eec21242d05","Type":"ContainerStarted","Data":"bbb4e293d104b5e24464db46510950d8cbda476949190331ff3bb796a28a78b7"} Oct 07 08:34:48 crc kubenswrapper[5025]: I1007 08:34:48.866159 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.86614162 podStartE2EDuration="4.86614162s" podCreationTimestamp="2025-10-07 08:34:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:34:48.853243867 +0000 UTC m=+1095.662558001" watchObservedRunningTime="2025-10-07 08:34:48.86614162 +0000 UTC m=+1095.675455764" Oct 07 08:34:48 crc kubenswrapper[5025]: I1007 08:34:48.881449 5025 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.881428777 podStartE2EDuration="4.881428777s" podCreationTimestamp="2025-10-07 08:34:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:34:48.874404847 +0000 UTC m=+1095.683718981" watchObservedRunningTime="2025-10-07 08:34:48.881428777 +0000 UTC m=+1095.690742921" Oct 07 08:34:49 crc kubenswrapper[5025]: E1007 08:34:49.961575 5025 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13598166_30eb_43b2_8a13_2e2ca72f58e9.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4b8747a_af4d_42a8_8caf_5ea8b7ff06d9.slice/crio-conmon-be9144fe694a0b08748194b5643337f23d2434df55bb180b2c8ade9c750a985a.scope\": RecentStats: unable to find data in memory cache]" Oct 07 08:34:50 crc kubenswrapper[5025]: I1007 08:34:50.458201 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-d2nc2" Oct 07 08:34:50 crc kubenswrapper[5025]: I1007 08:34:50.625197 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e980fa1-54dd-4f48-9a25-0b6090709927-config-data\") pod \"5e980fa1-54dd-4f48-9a25-0b6090709927\" (UID: \"5e980fa1-54dd-4f48-9a25-0b6090709927\") " Oct 07 08:34:50 crc kubenswrapper[5025]: I1007 08:34:50.625345 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e980fa1-54dd-4f48-9a25-0b6090709927-logs\") pod \"5e980fa1-54dd-4f48-9a25-0b6090709927\" (UID: \"5e980fa1-54dd-4f48-9a25-0b6090709927\") " Oct 07 08:34:50 crc kubenswrapper[5025]: I1007 08:34:50.625461 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e980fa1-54dd-4f48-9a25-0b6090709927-combined-ca-bundle\") pod \"5e980fa1-54dd-4f48-9a25-0b6090709927\" (UID: \"5e980fa1-54dd-4f48-9a25-0b6090709927\") " Oct 07 08:34:50 crc kubenswrapper[5025]: I1007 08:34:50.625497 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e980fa1-54dd-4f48-9a25-0b6090709927-scripts\") pod \"5e980fa1-54dd-4f48-9a25-0b6090709927\" (UID: \"5e980fa1-54dd-4f48-9a25-0b6090709927\") " Oct 07 08:34:50 crc kubenswrapper[5025]: I1007 08:34:50.625628 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rw7rp\" (UniqueName: \"kubernetes.io/projected/5e980fa1-54dd-4f48-9a25-0b6090709927-kube-api-access-rw7rp\") pod \"5e980fa1-54dd-4f48-9a25-0b6090709927\" (UID: \"5e980fa1-54dd-4f48-9a25-0b6090709927\") " Oct 07 08:34:50 crc kubenswrapper[5025]: I1007 08:34:50.625770 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/5e980fa1-54dd-4f48-9a25-0b6090709927-logs" (OuterVolumeSpecName: "logs") pod "5e980fa1-54dd-4f48-9a25-0b6090709927" (UID: "5e980fa1-54dd-4f48-9a25-0b6090709927"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:34:50 crc kubenswrapper[5025]: I1007 08:34:50.626053 5025 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e980fa1-54dd-4f48-9a25-0b6090709927-logs\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:50 crc kubenswrapper[5025]: I1007 08:34:50.631352 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e980fa1-54dd-4f48-9a25-0b6090709927-scripts" (OuterVolumeSpecName: "scripts") pod "5e980fa1-54dd-4f48-9a25-0b6090709927" (UID: "5e980fa1-54dd-4f48-9a25-0b6090709927"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:34:50 crc kubenswrapper[5025]: I1007 08:34:50.631652 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e980fa1-54dd-4f48-9a25-0b6090709927-kube-api-access-rw7rp" (OuterVolumeSpecName: "kube-api-access-rw7rp") pod "5e980fa1-54dd-4f48-9a25-0b6090709927" (UID: "5e980fa1-54dd-4f48-9a25-0b6090709927"). InnerVolumeSpecName "kube-api-access-rw7rp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:34:50 crc kubenswrapper[5025]: I1007 08:34:50.654503 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e980fa1-54dd-4f48-9a25-0b6090709927-config-data" (OuterVolumeSpecName: "config-data") pod "5e980fa1-54dd-4f48-9a25-0b6090709927" (UID: "5e980fa1-54dd-4f48-9a25-0b6090709927"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:34:50 crc kubenswrapper[5025]: I1007 08:34:50.656927 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e980fa1-54dd-4f48-9a25-0b6090709927-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5e980fa1-54dd-4f48-9a25-0b6090709927" (UID: "5e980fa1-54dd-4f48-9a25-0b6090709927"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:34:50 crc kubenswrapper[5025]: I1007 08:34:50.727857 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e980fa1-54dd-4f48-9a25-0b6090709927-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:50 crc kubenswrapper[5025]: I1007 08:34:50.727888 5025 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e980fa1-54dd-4f48-9a25-0b6090709927-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:50 crc kubenswrapper[5025]: I1007 08:34:50.727898 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rw7rp\" (UniqueName: \"kubernetes.io/projected/5e980fa1-54dd-4f48-9a25-0b6090709927-kube-api-access-rw7rp\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:50 crc kubenswrapper[5025]: I1007 08:34:50.727909 5025 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e980fa1-54dd-4f48-9a25-0b6090709927-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:50 crc kubenswrapper[5025]: I1007 08:34:50.867029 5025 generic.go:334] "Generic (PLEG): container finished" podID="d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9" containerID="be9144fe694a0b08748194b5643337f23d2434df55bb180b2c8ade9c750a985a" exitCode=0 Oct 07 08:34:50 crc kubenswrapper[5025]: I1007 08:34:50.867105 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wxjnm" 
event={"ID":"d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9","Type":"ContainerDied","Data":"be9144fe694a0b08748194b5643337f23d2434df55bb180b2c8ade9c750a985a"} Oct 07 08:34:50 crc kubenswrapper[5025]: I1007 08:34:50.869200 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-d2nc2" event={"ID":"5e980fa1-54dd-4f48-9a25-0b6090709927","Type":"ContainerDied","Data":"384fcf60bc3104fb78106106c7d0a7de5d3bb364bb1798f6817264afb3e4877a"} Oct 07 08:34:50 crc kubenswrapper[5025]: I1007 08:34:50.869228 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="384fcf60bc3104fb78106106c7d0a7de5d3bb364bb1798f6817264afb3e4877a" Oct 07 08:34:50 crc kubenswrapper[5025]: I1007 08:34:50.869254 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-d2nc2" Oct 07 08:34:51 crc kubenswrapper[5025]: I1007 08:34:51.559175 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-684bd87b6d-w58z5"] Oct 07 08:34:51 crc kubenswrapper[5025]: E1007 08:34:51.559645 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e980fa1-54dd-4f48-9a25-0b6090709927" containerName="placement-db-sync" Oct 07 08:34:51 crc kubenswrapper[5025]: I1007 08:34:51.559666 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e980fa1-54dd-4f48-9a25-0b6090709927" containerName="placement-db-sync" Oct 07 08:34:51 crc kubenswrapper[5025]: E1007 08:34:51.559685 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3ddc8c0-beba-4616-b90e-0c8384cfd4f9" containerName="init" Oct 07 08:34:51 crc kubenswrapper[5025]: I1007 08:34:51.559693 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3ddc8c0-beba-4616-b90e-0c8384cfd4f9" containerName="init" Oct 07 08:34:51 crc kubenswrapper[5025]: E1007 08:34:51.559713 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3ddc8c0-beba-4616-b90e-0c8384cfd4f9" containerName="dnsmasq-dns" Oct 
07 08:34:51 crc kubenswrapper[5025]: I1007 08:34:51.559722 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3ddc8c0-beba-4616-b90e-0c8384cfd4f9" containerName="dnsmasq-dns" Oct 07 08:34:51 crc kubenswrapper[5025]: I1007 08:34:51.559920 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e980fa1-54dd-4f48-9a25-0b6090709927" containerName="placement-db-sync" Oct 07 08:34:51 crc kubenswrapper[5025]: I1007 08:34:51.559944 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3ddc8c0-beba-4616-b90e-0c8384cfd4f9" containerName="dnsmasq-dns" Oct 07 08:34:51 crc kubenswrapper[5025]: I1007 08:34:51.561936 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-684bd87b6d-w58z5" Oct 07 08:34:51 crc kubenswrapper[5025]: I1007 08:34:51.565784 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 07 08:34:51 crc kubenswrapper[5025]: I1007 08:34:51.566095 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 07 08:34:51 crc kubenswrapper[5025]: I1007 08:34:51.566223 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 07 08:34:51 crc kubenswrapper[5025]: I1007 08:34:51.566346 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 07 08:34:51 crc kubenswrapper[5025]: I1007 08:34:51.566381 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-hfmd7" Oct 07 08:34:51 crc kubenswrapper[5025]: I1007 08:34:51.583734 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-684bd87b6d-w58z5"] Oct 07 08:34:51 crc kubenswrapper[5025]: I1007 08:34:51.745519 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xgrg\" (UniqueName: 
\"kubernetes.io/projected/f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8-kube-api-access-4xgrg\") pod \"placement-684bd87b6d-w58z5\" (UID: \"f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8\") " pod="openstack/placement-684bd87b6d-w58z5" Oct 07 08:34:51 crc kubenswrapper[5025]: I1007 08:34:51.745636 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8-combined-ca-bundle\") pod \"placement-684bd87b6d-w58z5\" (UID: \"f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8\") " pod="openstack/placement-684bd87b6d-w58z5" Oct 07 08:34:51 crc kubenswrapper[5025]: I1007 08:34:51.745666 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8-logs\") pod \"placement-684bd87b6d-w58z5\" (UID: \"f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8\") " pod="openstack/placement-684bd87b6d-w58z5" Oct 07 08:34:51 crc kubenswrapper[5025]: I1007 08:34:51.745705 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8-internal-tls-certs\") pod \"placement-684bd87b6d-w58z5\" (UID: \"f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8\") " pod="openstack/placement-684bd87b6d-w58z5" Oct 07 08:34:51 crc kubenswrapper[5025]: I1007 08:34:51.745741 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8-scripts\") pod \"placement-684bd87b6d-w58z5\" (UID: \"f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8\") " pod="openstack/placement-684bd87b6d-w58z5" Oct 07 08:34:51 crc kubenswrapper[5025]: I1007 08:34:51.745805 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8-public-tls-certs\") pod \"placement-684bd87b6d-w58z5\" (UID: \"f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8\") " pod="openstack/placement-684bd87b6d-w58z5" Oct 07 08:34:51 crc kubenswrapper[5025]: I1007 08:34:51.745971 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8-config-data\") pod \"placement-684bd87b6d-w58z5\" (UID: \"f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8\") " pod="openstack/placement-684bd87b6d-w58z5" Oct 07 08:34:51 crc kubenswrapper[5025]: I1007 08:34:51.847340 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8-public-tls-certs\") pod \"placement-684bd87b6d-w58z5\" (UID: \"f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8\") " pod="openstack/placement-684bd87b6d-w58z5" Oct 07 08:34:51 crc kubenswrapper[5025]: I1007 08:34:51.847443 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8-config-data\") pod \"placement-684bd87b6d-w58z5\" (UID: \"f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8\") " pod="openstack/placement-684bd87b6d-w58z5" Oct 07 08:34:51 crc kubenswrapper[5025]: I1007 08:34:51.847506 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xgrg\" (UniqueName: \"kubernetes.io/projected/f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8-kube-api-access-4xgrg\") pod \"placement-684bd87b6d-w58z5\" (UID: \"f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8\") " pod="openstack/placement-684bd87b6d-w58z5" Oct 07 08:34:51 crc kubenswrapper[5025]: I1007 08:34:51.847555 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8-combined-ca-bundle\") pod \"placement-684bd87b6d-w58z5\" (UID: \"f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8\") " pod="openstack/placement-684bd87b6d-w58z5" Oct 07 08:34:51 crc kubenswrapper[5025]: I1007 08:34:51.847574 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8-logs\") pod \"placement-684bd87b6d-w58z5\" (UID: \"f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8\") " pod="openstack/placement-684bd87b6d-w58z5" Oct 07 08:34:51 crc kubenswrapper[5025]: I1007 08:34:51.847608 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8-internal-tls-certs\") pod \"placement-684bd87b6d-w58z5\" (UID: \"f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8\") " pod="openstack/placement-684bd87b6d-w58z5" Oct 07 08:34:51 crc kubenswrapper[5025]: I1007 08:34:51.847634 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8-scripts\") pod \"placement-684bd87b6d-w58z5\" (UID: \"f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8\") " pod="openstack/placement-684bd87b6d-w58z5" Oct 07 08:34:51 crc kubenswrapper[5025]: I1007 08:34:51.848116 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8-logs\") pod \"placement-684bd87b6d-w58z5\" (UID: \"f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8\") " pod="openstack/placement-684bd87b6d-w58z5" Oct 07 08:34:51 crc kubenswrapper[5025]: I1007 08:34:51.852366 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8-public-tls-certs\") pod \"placement-684bd87b6d-w58z5\" (UID: 
\"f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8\") " pod="openstack/placement-684bd87b6d-w58z5" Oct 07 08:34:51 crc kubenswrapper[5025]: I1007 08:34:51.859102 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8-combined-ca-bundle\") pod \"placement-684bd87b6d-w58z5\" (UID: \"f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8\") " pod="openstack/placement-684bd87b6d-w58z5" Oct 07 08:34:51 crc kubenswrapper[5025]: I1007 08:34:51.865291 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8-scripts\") pod \"placement-684bd87b6d-w58z5\" (UID: \"f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8\") " pod="openstack/placement-684bd87b6d-w58z5" Oct 07 08:34:51 crc kubenswrapper[5025]: I1007 08:34:51.865561 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8-internal-tls-certs\") pod \"placement-684bd87b6d-w58z5\" (UID: \"f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8\") " pod="openstack/placement-684bd87b6d-w58z5" Oct 07 08:34:51 crc kubenswrapper[5025]: I1007 08:34:51.868244 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xgrg\" (UniqueName: \"kubernetes.io/projected/f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8-kube-api-access-4xgrg\") pod \"placement-684bd87b6d-w58z5\" (UID: \"f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8\") " pod="openstack/placement-684bd87b6d-w58z5" Oct 07 08:34:51 crc kubenswrapper[5025]: I1007 08:34:51.870288 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8-config-data\") pod \"placement-684bd87b6d-w58z5\" (UID: \"f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8\") " pod="openstack/placement-684bd87b6d-w58z5" Oct 07 08:34:51 crc 
kubenswrapper[5025]: I1007 08:34:51.881992 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-684bd87b6d-w58z5" Oct 07 08:34:53 crc kubenswrapper[5025]: I1007 08:34:53.544860 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-wxjnm" Oct 07 08:34:53 crc kubenswrapper[5025]: I1007 08:34:53.679142 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9-scripts\") pod \"d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9\" (UID: \"d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9\") " Oct 07 08:34:53 crc kubenswrapper[5025]: I1007 08:34:53.679352 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9-combined-ca-bundle\") pod \"d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9\" (UID: \"d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9\") " Oct 07 08:34:53 crc kubenswrapper[5025]: I1007 08:34:53.679420 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9-credential-keys\") pod \"d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9\" (UID: \"d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9\") " Oct 07 08:34:53 crc kubenswrapper[5025]: I1007 08:34:53.679453 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9-config-data\") pod \"d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9\" (UID: \"d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9\") " Oct 07 08:34:53 crc kubenswrapper[5025]: I1007 08:34:53.679490 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9-fernet-keys\") pod 
\"d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9\" (UID: \"d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9\") " Oct 07 08:34:53 crc kubenswrapper[5025]: I1007 08:34:53.679557 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpsjf\" (UniqueName: \"kubernetes.io/projected/d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9-kube-api-access-fpsjf\") pod \"d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9\" (UID: \"d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9\") " Oct 07 08:34:53 crc kubenswrapper[5025]: I1007 08:34:53.684416 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9" (UID: "d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:34:53 crc kubenswrapper[5025]: I1007 08:34:53.684863 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9-scripts" (OuterVolumeSpecName: "scripts") pod "d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9" (UID: "d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:34:53 crc kubenswrapper[5025]: I1007 08:34:53.688656 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9" (UID: "d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:34:53 crc kubenswrapper[5025]: I1007 08:34:53.704407 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9-config-data" (OuterVolumeSpecName: "config-data") pod "d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9" (UID: "d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:34:53 crc kubenswrapper[5025]: I1007 08:34:53.708386 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9-kube-api-access-fpsjf" (OuterVolumeSpecName: "kube-api-access-fpsjf") pod "d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9" (UID: "d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9"). InnerVolumeSpecName "kube-api-access-fpsjf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:34:53 crc kubenswrapper[5025]: I1007 08:34:53.713323 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9" (UID: "d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:34:53 crc kubenswrapper[5025]: I1007 08:34:53.782149 5025 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:53 crc kubenswrapper[5025]: I1007 08:34:53.782189 5025 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:53 crc kubenswrapper[5025]: I1007 08:34:53.782197 5025 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:53 crc kubenswrapper[5025]: I1007 08:34:53.782208 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpsjf\" (UniqueName: \"kubernetes.io/projected/d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9-kube-api-access-fpsjf\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:53 crc kubenswrapper[5025]: I1007 08:34:53.782219 5025 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:53 crc kubenswrapper[5025]: I1007 08:34:53.782227 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:34:53 crc kubenswrapper[5025]: I1007 08:34:53.897966 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wxjnm" event={"ID":"d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9","Type":"ContainerDied","Data":"c6c9dd183f39199398f85472e202cbc34537c5b1890c3d1bf1dbfbbe78209ced"} Oct 07 08:34:53 crc kubenswrapper[5025]: I1007 
08:34:53.898021 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6c9dd183f39199398f85472e202cbc34537c5b1890c3d1bf1dbfbbe78209ced" Oct 07 08:34:53 crc kubenswrapper[5025]: I1007 08:34:53.898118 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-wxjnm" Oct 07 08:34:54 crc kubenswrapper[5025]: I1007 08:34:54.732626 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6f574885d6-269v8"] Oct 07 08:34:54 crc kubenswrapper[5025]: E1007 08:34:54.733020 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9" containerName="keystone-bootstrap" Oct 07 08:34:54 crc kubenswrapper[5025]: I1007 08:34:54.733038 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9" containerName="keystone-bootstrap" Oct 07 08:34:54 crc kubenswrapper[5025]: I1007 08:34:54.733233 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9" containerName="keystone-bootstrap" Oct 07 08:34:54 crc kubenswrapper[5025]: I1007 08:34:54.733849 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6f574885d6-269v8" Oct 07 08:34:54 crc kubenswrapper[5025]: I1007 08:34:54.738026 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-nfvg7" Oct 07 08:34:54 crc kubenswrapper[5025]: I1007 08:34:54.738468 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 07 08:34:54 crc kubenswrapper[5025]: I1007 08:34:54.738642 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 07 08:34:54 crc kubenswrapper[5025]: I1007 08:34:54.738724 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 07 08:34:54 crc kubenswrapper[5025]: I1007 08:34:54.738820 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 07 08:34:54 crc kubenswrapper[5025]: I1007 08:34:54.738842 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 07 08:34:54 crc kubenswrapper[5025]: I1007 08:34:54.746161 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6f574885d6-269v8"] Oct 07 08:34:54 crc kubenswrapper[5025]: I1007 08:34:54.903009 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef-internal-tls-certs\") pod \"keystone-6f574885d6-269v8\" (UID: \"a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef\") " pod="openstack/keystone-6f574885d6-269v8" Oct 07 08:34:54 crc kubenswrapper[5025]: I1007 08:34:54.903103 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef-combined-ca-bundle\") pod \"keystone-6f574885d6-269v8\" (UID: \"a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef\") " 
pod="openstack/keystone-6f574885d6-269v8" Oct 07 08:34:54 crc kubenswrapper[5025]: I1007 08:34:54.903228 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsjnd\" (UniqueName: \"kubernetes.io/projected/a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef-kube-api-access-tsjnd\") pod \"keystone-6f574885d6-269v8\" (UID: \"a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef\") " pod="openstack/keystone-6f574885d6-269v8" Oct 07 08:34:54 crc kubenswrapper[5025]: I1007 08:34:54.903268 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef-config-data\") pod \"keystone-6f574885d6-269v8\" (UID: \"a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef\") " pod="openstack/keystone-6f574885d6-269v8" Oct 07 08:34:54 crc kubenswrapper[5025]: I1007 08:34:54.903296 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef-fernet-keys\") pod \"keystone-6f574885d6-269v8\" (UID: \"a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef\") " pod="openstack/keystone-6f574885d6-269v8" Oct 07 08:34:54 crc kubenswrapper[5025]: I1007 08:34:54.903328 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef-public-tls-certs\") pod \"keystone-6f574885d6-269v8\" (UID: \"a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef\") " pod="openstack/keystone-6f574885d6-269v8" Oct 07 08:34:54 crc kubenswrapper[5025]: I1007 08:34:54.903350 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef-scripts\") pod \"keystone-6f574885d6-269v8\" (UID: \"a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef\") " 
pod="openstack/keystone-6f574885d6-269v8" Oct 07 08:34:54 crc kubenswrapper[5025]: I1007 08:34:54.903417 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef-credential-keys\") pod \"keystone-6f574885d6-269v8\" (UID: \"a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef\") " pod="openstack/keystone-6f574885d6-269v8" Oct 07 08:34:55 crc kubenswrapper[5025]: I1007 08:34:55.004949 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef-internal-tls-certs\") pod \"keystone-6f574885d6-269v8\" (UID: \"a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef\") " pod="openstack/keystone-6f574885d6-269v8" Oct 07 08:34:55 crc kubenswrapper[5025]: I1007 08:34:55.005050 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef-combined-ca-bundle\") pod \"keystone-6f574885d6-269v8\" (UID: \"a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef\") " pod="openstack/keystone-6f574885d6-269v8" Oct 07 08:34:55 crc kubenswrapper[5025]: I1007 08:34:55.005157 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsjnd\" (UniqueName: \"kubernetes.io/projected/a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef-kube-api-access-tsjnd\") pod \"keystone-6f574885d6-269v8\" (UID: \"a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef\") " pod="openstack/keystone-6f574885d6-269v8" Oct 07 08:34:55 crc kubenswrapper[5025]: I1007 08:34:55.005200 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef-config-data\") pod \"keystone-6f574885d6-269v8\" (UID: \"a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef\") " pod="openstack/keystone-6f574885d6-269v8" Oct 07 
08:34:55 crc kubenswrapper[5025]: I1007 08:34:55.005230 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef-fernet-keys\") pod \"keystone-6f574885d6-269v8\" (UID: \"a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef\") " pod="openstack/keystone-6f574885d6-269v8" Oct 07 08:34:55 crc kubenswrapper[5025]: I1007 08:34:55.005266 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef-public-tls-certs\") pod \"keystone-6f574885d6-269v8\" (UID: \"a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef\") " pod="openstack/keystone-6f574885d6-269v8" Oct 07 08:34:55 crc kubenswrapper[5025]: I1007 08:34:55.005286 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef-scripts\") pod \"keystone-6f574885d6-269v8\" (UID: \"a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef\") " pod="openstack/keystone-6f574885d6-269v8" Oct 07 08:34:55 crc kubenswrapper[5025]: I1007 08:34:55.005320 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef-credential-keys\") pod \"keystone-6f574885d6-269v8\" (UID: \"a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef\") " pod="openstack/keystone-6f574885d6-269v8" Oct 07 08:34:55 crc kubenswrapper[5025]: I1007 08:34:55.011134 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef-public-tls-certs\") pod \"keystone-6f574885d6-269v8\" (UID: \"a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef\") " pod="openstack/keystone-6f574885d6-269v8" Oct 07 08:34:55 crc kubenswrapper[5025]: I1007 08:34:55.011453 5025 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef-credential-keys\") pod \"keystone-6f574885d6-269v8\" (UID: \"a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef\") " pod="openstack/keystone-6f574885d6-269v8" Oct 07 08:34:55 crc kubenswrapper[5025]: I1007 08:34:55.012009 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef-fernet-keys\") pod \"keystone-6f574885d6-269v8\" (UID: \"a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef\") " pod="openstack/keystone-6f574885d6-269v8" Oct 07 08:34:55 crc kubenswrapper[5025]: I1007 08:34:55.012048 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef-internal-tls-certs\") pod \"keystone-6f574885d6-269v8\" (UID: \"a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef\") " pod="openstack/keystone-6f574885d6-269v8" Oct 07 08:34:55 crc kubenswrapper[5025]: I1007 08:34:55.019145 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef-scripts\") pod \"keystone-6f574885d6-269v8\" (UID: \"a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef\") " pod="openstack/keystone-6f574885d6-269v8" Oct 07 08:34:55 crc kubenswrapper[5025]: I1007 08:34:55.021846 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef-combined-ca-bundle\") pod \"keystone-6f574885d6-269v8\" (UID: \"a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef\") " pod="openstack/keystone-6f574885d6-269v8" Oct 07 08:34:55 crc kubenswrapper[5025]: I1007 08:34:55.022739 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsjnd\" (UniqueName: \"kubernetes.io/projected/a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef-kube-api-access-tsjnd\") pod 
\"keystone-6f574885d6-269v8\" (UID: \"a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef\") " pod="openstack/keystone-6f574885d6-269v8" Oct 07 08:34:55 crc kubenswrapper[5025]: I1007 08:34:55.022910 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 07 08:34:55 crc kubenswrapper[5025]: I1007 08:34:55.023273 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 07 08:34:55 crc kubenswrapper[5025]: I1007 08:34:55.027752 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef-config-data\") pod \"keystone-6f574885d6-269v8\" (UID: \"a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef\") " pod="openstack/keystone-6f574885d6-269v8" Oct 07 08:34:55 crc kubenswrapper[5025]: I1007 08:34:55.040364 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 07 08:34:55 crc kubenswrapper[5025]: I1007 08:34:55.040408 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 07 08:34:55 crc kubenswrapper[5025]: I1007 08:34:55.052475 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6f574885d6-269v8" Oct 07 08:34:55 crc kubenswrapper[5025]: I1007 08:34:55.053481 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 07 08:34:55 crc kubenswrapper[5025]: I1007 08:34:55.072398 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 07 08:34:55 crc kubenswrapper[5025]: I1007 08:34:55.074630 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 07 08:34:55 crc kubenswrapper[5025]: I1007 08:34:55.092339 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 07 08:34:55 crc kubenswrapper[5025]: I1007 08:34:55.924479 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 07 08:34:55 crc kubenswrapper[5025]: I1007 08:34:55.924528 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 07 08:34:55 crc kubenswrapper[5025]: I1007 08:34:55.924558 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 07 08:34:55 crc kubenswrapper[5025]: I1007 08:34:55.924572 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 07 08:34:57 crc kubenswrapper[5025]: I1007 08:34:57.882032 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 07 08:34:57 crc kubenswrapper[5025]: I1007 08:34:57.908611 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 07 08:34:57 crc kubenswrapper[5025]: I1007 08:34:57.942859 5025 prober_manager.go:312] "Failed to trigger a manual 
run" probe="Readiness" Oct 07 08:34:57 crc kubenswrapper[5025]: I1007 08:34:57.942886 5025 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 08:34:58 crc kubenswrapper[5025]: I1007 08:34:58.194664 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 07 08:34:58 crc kubenswrapper[5025]: I1007 08:34:58.196934 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 07 08:35:00 crc kubenswrapper[5025]: E1007 08:35:00.184477 5025 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13598166_30eb_43b2_8a13_2e2ca72f58e9.slice\": RecentStats: unable to find data in memory cache]" Oct 07 08:35:07 crc kubenswrapper[5025]: E1007 08:35:07.753426 5025 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Oct 07 08:35:07 crc kubenswrapper[5025]: E1007 08:35:07.754446 5025 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v78lh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-2xx6h_openstack(06e1a5d0-ce28-4495-937a-1aaccbcbb644): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 08:35:07 crc kubenswrapper[5025]: E1007 08:35:07.755793 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-2xx6h" podUID="06e1a5d0-ce28-4495-937a-1aaccbcbb644" Oct 07 08:35:08 crc kubenswrapper[5025]: I1007 08:35:08.037892 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c06c9321-437f-4b6a-b205-b56758517e75","Type":"ContainerStarted","Data":"55b9ead6cbc3246ada2c749e825680008bf0b9eb8b6c9ba49075a691ca9d6c03"} Oct 07 08:35:08 crc kubenswrapper[5025]: I1007 08:35:08.040440 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-jl8nn" event={"ID":"bb4e8f2b-70ba-49fe-b21e-9343ccaa1a11","Type":"ContainerStarted","Data":"562e7d2d588d95b2f6a1c903ed0b4db2a9e3f4bd3a63c1bfff1f21cc4e9ec999"} Oct 07 08:35:08 crc kubenswrapper[5025]: E1007 08:35:08.043336 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-2xx6h" podUID="06e1a5d0-ce28-4495-937a-1aaccbcbb644" Oct 07 08:35:08 crc kubenswrapper[5025]: I1007 08:35:08.061420 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-jl8nn" podStartSLOduration=2.203520304 podStartE2EDuration="23.061401481s" 
podCreationTimestamp="2025-10-07 08:34:45 +0000 UTC" firstStartedPulling="2025-10-07 08:34:46.886921859 +0000 UTC m=+1093.696236003" lastFinishedPulling="2025-10-07 08:35:07.744803026 +0000 UTC m=+1114.554117180" observedRunningTime="2025-10-07 08:35:08.055908418 +0000 UTC m=+1114.865222562" watchObservedRunningTime="2025-10-07 08:35:08.061401481 +0000 UTC m=+1114.870715625" Oct 07 08:35:08 crc kubenswrapper[5025]: I1007 08:35:08.249988 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-684bd87b6d-w58z5"] Oct 07 08:35:08 crc kubenswrapper[5025]: W1007 08:35:08.261883 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf09e8173_e1c3_4a48_8bc6_e9205b1dd6d8.slice/crio-eefef5616739121371aad595359fd3442cec2ea8a836a569004f8026d54f8d08 WatchSource:0}: Error finding container eefef5616739121371aad595359fd3442cec2ea8a836a569004f8026d54f8d08: Status 404 returned error can't find the container with id eefef5616739121371aad595359fd3442cec2ea8a836a569004f8026d54f8d08 Oct 07 08:35:08 crc kubenswrapper[5025]: I1007 08:35:08.351166 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6f574885d6-269v8"] Oct 07 08:35:08 crc kubenswrapper[5025]: W1007 08:35:08.365710 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4d92a87_d0f4_40c4_a370_1bf0fb8fe5ef.slice/crio-f12618ad34f2d0b545e6a747fc1c02cb8856d3c51ea493bbea188fd7690bd34e WatchSource:0}: Error finding container f12618ad34f2d0b545e6a747fc1c02cb8856d3c51ea493bbea188fd7690bd34e: Status 404 returned error can't find the container with id f12618ad34f2d0b545e6a747fc1c02cb8856d3c51ea493bbea188fd7690bd34e Oct 07 08:35:09 crc kubenswrapper[5025]: I1007 08:35:09.049003 5025 generic.go:334] "Generic (PLEG): container finished" podID="bbaaccc5-06cb-40e9-9d17-bb9f11b6f8a5" 
containerID="c660c36e2bb2d6b75ff1117399aae70f2a2e059a387005f729f8ffcb0f430141" exitCode=0 Oct 07 08:35:09 crc kubenswrapper[5025]: I1007 08:35:09.049101 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-z4bnz" event={"ID":"bbaaccc5-06cb-40e9-9d17-bb9f11b6f8a5","Type":"ContainerDied","Data":"c660c36e2bb2d6b75ff1117399aae70f2a2e059a387005f729f8ffcb0f430141"} Oct 07 08:35:09 crc kubenswrapper[5025]: I1007 08:35:09.051944 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-684bd87b6d-w58z5" event={"ID":"f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8","Type":"ContainerStarted","Data":"91ba013dc33b4d2f92059c4d4c66f039c2cb58683016a2882a32f59cc78bb607"} Oct 07 08:35:09 crc kubenswrapper[5025]: I1007 08:35:09.051971 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-684bd87b6d-w58z5" event={"ID":"f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8","Type":"ContainerStarted","Data":"9226a61a229bb35a76b9b1993c0b6dacb2bab79faa138dc6bf20e3d356c1fca2"} Oct 07 08:35:09 crc kubenswrapper[5025]: I1007 08:35:09.051983 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-684bd87b6d-w58z5" event={"ID":"f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8","Type":"ContainerStarted","Data":"eefef5616739121371aad595359fd3442cec2ea8a836a569004f8026d54f8d08"} Oct 07 08:35:09 crc kubenswrapper[5025]: I1007 08:35:09.052744 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-684bd87b6d-w58z5" Oct 07 08:35:09 crc kubenswrapper[5025]: I1007 08:35:09.052776 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-684bd87b6d-w58z5" Oct 07 08:35:09 crc kubenswrapper[5025]: I1007 08:35:09.055529 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6f574885d6-269v8" event={"ID":"a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef","Type":"ContainerStarted","Data":"765cf6b91f229f95c44f7ed7fc5d4c222450bb09dc9b1216f217141cfe04545c"} Oct 
07 08:35:09 crc kubenswrapper[5025]: I1007 08:35:09.055667 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6f574885d6-269v8" event={"ID":"a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef","Type":"ContainerStarted","Data":"f12618ad34f2d0b545e6a747fc1c02cb8856d3c51ea493bbea188fd7690bd34e"} Oct 07 08:35:09 crc kubenswrapper[5025]: I1007 08:35:09.055687 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-6f574885d6-269v8" Oct 07 08:35:09 crc kubenswrapper[5025]: I1007 08:35:09.082521 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-6f574885d6-269v8" podStartSLOduration=15.082504302 podStartE2EDuration="15.082504302s" podCreationTimestamp="2025-10-07 08:34:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:35:09.081054156 +0000 UTC m=+1115.890368310" watchObservedRunningTime="2025-10-07 08:35:09.082504302 +0000 UTC m=+1115.891818446" Oct 07 08:35:09 crc kubenswrapper[5025]: I1007 08:35:09.104820 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-684bd87b6d-w58z5" podStartSLOduration=18.104795874 podStartE2EDuration="18.104795874s" podCreationTimestamp="2025-10-07 08:34:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:35:09.099050784 +0000 UTC m=+1115.908364928" watchObservedRunningTime="2025-10-07 08:35:09.104795874 +0000 UTC m=+1115.914110018" Oct 07 08:35:10 crc kubenswrapper[5025]: I1007 08:35:10.365974 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-z4bnz" Oct 07 08:35:10 crc kubenswrapper[5025]: I1007 08:35:10.388158 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bbaaccc5-06cb-40e9-9d17-bb9f11b6f8a5-config\") pod \"bbaaccc5-06cb-40e9-9d17-bb9f11b6f8a5\" (UID: \"bbaaccc5-06cb-40e9-9d17-bb9f11b6f8a5\") " Oct 07 08:35:10 crc kubenswrapper[5025]: I1007 08:35:10.388365 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbaaccc5-06cb-40e9-9d17-bb9f11b6f8a5-combined-ca-bundle\") pod \"bbaaccc5-06cb-40e9-9d17-bb9f11b6f8a5\" (UID: \"bbaaccc5-06cb-40e9-9d17-bb9f11b6f8a5\") " Oct 07 08:35:10 crc kubenswrapper[5025]: I1007 08:35:10.388466 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k75bv\" (UniqueName: \"kubernetes.io/projected/bbaaccc5-06cb-40e9-9d17-bb9f11b6f8a5-kube-api-access-k75bv\") pod \"bbaaccc5-06cb-40e9-9d17-bb9f11b6f8a5\" (UID: \"bbaaccc5-06cb-40e9-9d17-bb9f11b6f8a5\") " Oct 07 08:35:10 crc kubenswrapper[5025]: I1007 08:35:10.400997 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbaaccc5-06cb-40e9-9d17-bb9f11b6f8a5-kube-api-access-k75bv" (OuterVolumeSpecName: "kube-api-access-k75bv") pod "bbaaccc5-06cb-40e9-9d17-bb9f11b6f8a5" (UID: "bbaaccc5-06cb-40e9-9d17-bb9f11b6f8a5"). InnerVolumeSpecName "kube-api-access-k75bv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:35:10 crc kubenswrapper[5025]: I1007 08:35:10.414783 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbaaccc5-06cb-40e9-9d17-bb9f11b6f8a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bbaaccc5-06cb-40e9-9d17-bb9f11b6f8a5" (UID: "bbaaccc5-06cb-40e9-9d17-bb9f11b6f8a5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:35:10 crc kubenswrapper[5025]: I1007 08:35:10.416876 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbaaccc5-06cb-40e9-9d17-bb9f11b6f8a5-config" (OuterVolumeSpecName: "config") pod "bbaaccc5-06cb-40e9-9d17-bb9f11b6f8a5" (UID: "bbaaccc5-06cb-40e9-9d17-bb9f11b6f8a5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:35:10 crc kubenswrapper[5025]: E1007 08:35:10.441820 5025 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13598166_30eb_43b2_8a13_2e2ca72f58e9.slice\": RecentStats: unable to find data in memory cache]" Oct 07 08:35:10 crc kubenswrapper[5025]: I1007 08:35:10.491231 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbaaccc5-06cb-40e9-9d17-bb9f11b6f8a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:10 crc kubenswrapper[5025]: I1007 08:35:10.491582 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k75bv\" (UniqueName: \"kubernetes.io/projected/bbaaccc5-06cb-40e9-9d17-bb9f11b6f8a5-kube-api-access-k75bv\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:10 crc kubenswrapper[5025]: I1007 08:35:10.491602 5025 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/bbaaccc5-06cb-40e9-9d17-bb9f11b6f8a5-config\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:11 crc kubenswrapper[5025]: I1007 08:35:11.078054 5025 generic.go:334] "Generic (PLEG): container finished" podID="bb4e8f2b-70ba-49fe-b21e-9343ccaa1a11" containerID="562e7d2d588d95b2f6a1c903ed0b4db2a9e3f4bd3a63c1bfff1f21cc4e9ec999" exitCode=0 Oct 07 08:35:11 crc kubenswrapper[5025]: I1007 08:35:11.078106 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-db-sync-jl8nn" event={"ID":"bb4e8f2b-70ba-49fe-b21e-9343ccaa1a11","Type":"ContainerDied","Data":"562e7d2d588d95b2f6a1c903ed0b4db2a9e3f4bd3a63c1bfff1f21cc4e9ec999"} Oct 07 08:35:11 crc kubenswrapper[5025]: I1007 08:35:11.081842 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-z4bnz" event={"ID":"bbaaccc5-06cb-40e9-9d17-bb9f11b6f8a5","Type":"ContainerDied","Data":"696408c2b4efe3a779760cdb734a5eba38d738b9317d7907ab42d9df576b31e4"} Oct 07 08:35:11 crc kubenswrapper[5025]: I1007 08:35:11.081888 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="696408c2b4efe3a779760cdb734a5eba38d738b9317d7907ab42d9df576b31e4" Oct 07 08:35:11 crc kubenswrapper[5025]: I1007 08:35:11.081862 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-z4bnz" Oct 07 08:35:11 crc kubenswrapper[5025]: I1007 08:35:11.226983 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-vlnnh"] Oct 07 08:35:11 crc kubenswrapper[5025]: E1007 08:35:11.227435 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbaaccc5-06cb-40e9-9d17-bb9f11b6f8a5" containerName="neutron-db-sync" Oct 07 08:35:11 crc kubenswrapper[5025]: I1007 08:35:11.227460 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbaaccc5-06cb-40e9-9d17-bb9f11b6f8a5" containerName="neutron-db-sync" Oct 07 08:35:11 crc kubenswrapper[5025]: I1007 08:35:11.227679 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbaaccc5-06cb-40e9-9d17-bb9f11b6f8a5" containerName="neutron-db-sync" Oct 07 08:35:11 crc kubenswrapper[5025]: I1007 08:35:11.228790 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-vlnnh" Oct 07 08:35:11 crc kubenswrapper[5025]: I1007 08:35:11.279989 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-vlnnh"] Oct 07 08:35:11 crc kubenswrapper[5025]: I1007 08:35:11.305868 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3cb6f85b-3d51-481c-a9c8-dfd8e00c0700-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-vlnnh\" (UID: \"3cb6f85b-3d51-481c-a9c8-dfd8e00c0700\") " pod="openstack/dnsmasq-dns-5ccc5c4795-vlnnh" Oct 07 08:35:11 crc kubenswrapper[5025]: I1007 08:35:11.313417 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3cb6f85b-3d51-481c-a9c8-dfd8e00c0700-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-vlnnh\" (UID: \"3cb6f85b-3d51-481c-a9c8-dfd8e00c0700\") " pod="openstack/dnsmasq-dns-5ccc5c4795-vlnnh" Oct 07 08:35:11 crc kubenswrapper[5025]: I1007 08:35:11.313752 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3cb6f85b-3d51-481c-a9c8-dfd8e00c0700-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-vlnnh\" (UID: \"3cb6f85b-3d51-481c-a9c8-dfd8e00c0700\") " pod="openstack/dnsmasq-dns-5ccc5c4795-vlnnh" Oct 07 08:35:11 crc kubenswrapper[5025]: I1007 08:35:11.313877 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8228t\" (UniqueName: \"kubernetes.io/projected/3cb6f85b-3d51-481c-a9c8-dfd8e00c0700-kube-api-access-8228t\") pod \"dnsmasq-dns-5ccc5c4795-vlnnh\" (UID: \"3cb6f85b-3d51-481c-a9c8-dfd8e00c0700\") " pod="openstack/dnsmasq-dns-5ccc5c4795-vlnnh" Oct 07 08:35:11 crc kubenswrapper[5025]: I1007 08:35:11.314024 5025 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3cb6f85b-3d51-481c-a9c8-dfd8e00c0700-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-vlnnh\" (UID: \"3cb6f85b-3d51-481c-a9c8-dfd8e00c0700\") " pod="openstack/dnsmasq-dns-5ccc5c4795-vlnnh" Oct 07 08:35:11 crc kubenswrapper[5025]: I1007 08:35:11.314239 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3cb6f85b-3d51-481c-a9c8-dfd8e00c0700-config\") pod \"dnsmasq-dns-5ccc5c4795-vlnnh\" (UID: \"3cb6f85b-3d51-481c-a9c8-dfd8e00c0700\") " pod="openstack/dnsmasq-dns-5ccc5c4795-vlnnh" Oct 07 08:35:11 crc kubenswrapper[5025]: I1007 08:35:11.348171 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6b6897fb44-5d68h"] Oct 07 08:35:11 crc kubenswrapper[5025]: I1007 08:35:11.351274 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6b6897fb44-5d68h" Oct 07 08:35:11 crc kubenswrapper[5025]: I1007 08:35:11.353833 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 07 08:35:11 crc kubenswrapper[5025]: I1007 08:35:11.353999 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 07 08:35:11 crc kubenswrapper[5025]: I1007 08:35:11.354314 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 07 08:35:11 crc kubenswrapper[5025]: I1007 08:35:11.354591 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-wzl6k" Oct 07 08:35:11 crc kubenswrapper[5025]: I1007 08:35:11.371225 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6b6897fb44-5d68h"] Oct 07 08:35:11 crc kubenswrapper[5025]: I1007 08:35:11.416972 5025 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3cb6f85b-3d51-481c-a9c8-dfd8e00c0700-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-vlnnh\" (UID: \"3cb6f85b-3d51-481c-a9c8-dfd8e00c0700\") " pod="openstack/dnsmasq-dns-5ccc5c4795-vlnnh" Oct 07 08:35:11 crc kubenswrapper[5025]: I1007 08:35:11.417099 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5kfz\" (UniqueName: \"kubernetes.io/projected/c0b57ea4-f1b6-47f3-adef-97c26b470477-kube-api-access-h5kfz\") pod \"neutron-6b6897fb44-5d68h\" (UID: \"c0b57ea4-f1b6-47f3-adef-97c26b470477\") " pod="openstack/neutron-6b6897fb44-5d68h" Oct 07 08:35:11 crc kubenswrapper[5025]: I1007 08:35:11.417127 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3cb6f85b-3d51-481c-a9c8-dfd8e00c0700-config\") pod \"dnsmasq-dns-5ccc5c4795-vlnnh\" (UID: \"3cb6f85b-3d51-481c-a9c8-dfd8e00c0700\") " pod="openstack/dnsmasq-dns-5ccc5c4795-vlnnh" Oct 07 08:35:11 crc kubenswrapper[5025]: I1007 08:35:11.417156 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c0b57ea4-f1b6-47f3-adef-97c26b470477-httpd-config\") pod \"neutron-6b6897fb44-5d68h\" (UID: \"c0b57ea4-f1b6-47f3-adef-97c26b470477\") " pod="openstack/neutron-6b6897fb44-5d68h" Oct 07 08:35:11 crc kubenswrapper[5025]: I1007 08:35:11.417290 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0b57ea4-f1b6-47f3-adef-97c26b470477-combined-ca-bundle\") pod \"neutron-6b6897fb44-5d68h\" (UID: \"c0b57ea4-f1b6-47f3-adef-97c26b470477\") " pod="openstack/neutron-6b6897fb44-5d68h" Oct 07 08:35:11 crc kubenswrapper[5025]: I1007 08:35:11.417336 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3cb6f85b-3d51-481c-a9c8-dfd8e00c0700-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-vlnnh\" (UID: \"3cb6f85b-3d51-481c-a9c8-dfd8e00c0700\") " pod="openstack/dnsmasq-dns-5ccc5c4795-vlnnh" Oct 07 08:35:11 crc kubenswrapper[5025]: I1007 08:35:11.417378 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3cb6f85b-3d51-481c-a9c8-dfd8e00c0700-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-vlnnh\" (UID: \"3cb6f85b-3d51-481c-a9c8-dfd8e00c0700\") " pod="openstack/dnsmasq-dns-5ccc5c4795-vlnnh" Oct 07 08:35:11 crc kubenswrapper[5025]: I1007 08:35:11.417431 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0b57ea4-f1b6-47f3-adef-97c26b470477-ovndb-tls-certs\") pod \"neutron-6b6897fb44-5d68h\" (UID: \"c0b57ea4-f1b6-47f3-adef-97c26b470477\") " pod="openstack/neutron-6b6897fb44-5d68h" Oct 07 08:35:11 crc kubenswrapper[5025]: I1007 08:35:11.417474 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3cb6f85b-3d51-481c-a9c8-dfd8e00c0700-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-vlnnh\" (UID: \"3cb6f85b-3d51-481c-a9c8-dfd8e00c0700\") " pod="openstack/dnsmasq-dns-5ccc5c4795-vlnnh" Oct 07 08:35:11 crc kubenswrapper[5025]: I1007 08:35:11.417507 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8228t\" (UniqueName: \"kubernetes.io/projected/3cb6f85b-3d51-481c-a9c8-dfd8e00c0700-kube-api-access-8228t\") pod \"dnsmasq-dns-5ccc5c4795-vlnnh\" (UID: \"3cb6f85b-3d51-481c-a9c8-dfd8e00c0700\") " pod="openstack/dnsmasq-dns-5ccc5c4795-vlnnh" Oct 07 08:35:11 crc kubenswrapper[5025]: I1007 08:35:11.417565 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/secret/c0b57ea4-f1b6-47f3-adef-97c26b470477-config\") pod \"neutron-6b6897fb44-5d68h\" (UID: \"c0b57ea4-f1b6-47f3-adef-97c26b470477\") " pod="openstack/neutron-6b6897fb44-5d68h" Oct 07 08:35:11 crc kubenswrapper[5025]: I1007 08:35:11.418014 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3cb6f85b-3d51-481c-a9c8-dfd8e00c0700-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-vlnnh\" (UID: \"3cb6f85b-3d51-481c-a9c8-dfd8e00c0700\") " pod="openstack/dnsmasq-dns-5ccc5c4795-vlnnh" Oct 07 08:35:11 crc kubenswrapper[5025]: I1007 08:35:11.418342 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3cb6f85b-3d51-481c-a9c8-dfd8e00c0700-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-vlnnh\" (UID: \"3cb6f85b-3d51-481c-a9c8-dfd8e00c0700\") " pod="openstack/dnsmasq-dns-5ccc5c4795-vlnnh" Oct 07 08:35:11 crc kubenswrapper[5025]: I1007 08:35:11.418359 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3cb6f85b-3d51-481c-a9c8-dfd8e00c0700-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-vlnnh\" (UID: \"3cb6f85b-3d51-481c-a9c8-dfd8e00c0700\") " pod="openstack/dnsmasq-dns-5ccc5c4795-vlnnh" Oct 07 08:35:11 crc kubenswrapper[5025]: I1007 08:35:11.418439 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3cb6f85b-3d51-481c-a9c8-dfd8e00c0700-config\") pod \"dnsmasq-dns-5ccc5c4795-vlnnh\" (UID: \"3cb6f85b-3d51-481c-a9c8-dfd8e00c0700\") " pod="openstack/dnsmasq-dns-5ccc5c4795-vlnnh" Oct 07 08:35:11 crc kubenswrapper[5025]: I1007 08:35:11.419445 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3cb6f85b-3d51-481c-a9c8-dfd8e00c0700-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-vlnnh\" 
(UID: \"3cb6f85b-3d51-481c-a9c8-dfd8e00c0700\") " pod="openstack/dnsmasq-dns-5ccc5c4795-vlnnh" Oct 07 08:35:11 crc kubenswrapper[5025]: I1007 08:35:11.436734 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8228t\" (UniqueName: \"kubernetes.io/projected/3cb6f85b-3d51-481c-a9c8-dfd8e00c0700-kube-api-access-8228t\") pod \"dnsmasq-dns-5ccc5c4795-vlnnh\" (UID: \"3cb6f85b-3d51-481c-a9c8-dfd8e00c0700\") " pod="openstack/dnsmasq-dns-5ccc5c4795-vlnnh" Oct 07 08:35:11 crc kubenswrapper[5025]: I1007 08:35:11.520185 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5kfz\" (UniqueName: \"kubernetes.io/projected/c0b57ea4-f1b6-47f3-adef-97c26b470477-kube-api-access-h5kfz\") pod \"neutron-6b6897fb44-5d68h\" (UID: \"c0b57ea4-f1b6-47f3-adef-97c26b470477\") " pod="openstack/neutron-6b6897fb44-5d68h" Oct 07 08:35:11 crc kubenswrapper[5025]: I1007 08:35:11.520244 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c0b57ea4-f1b6-47f3-adef-97c26b470477-httpd-config\") pod \"neutron-6b6897fb44-5d68h\" (UID: \"c0b57ea4-f1b6-47f3-adef-97c26b470477\") " pod="openstack/neutron-6b6897fb44-5d68h" Oct 07 08:35:11 crc kubenswrapper[5025]: I1007 08:35:11.520282 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0b57ea4-f1b6-47f3-adef-97c26b470477-combined-ca-bundle\") pod \"neutron-6b6897fb44-5d68h\" (UID: \"c0b57ea4-f1b6-47f3-adef-97c26b470477\") " pod="openstack/neutron-6b6897fb44-5d68h" Oct 07 08:35:11 crc kubenswrapper[5025]: I1007 08:35:11.520313 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0b57ea4-f1b6-47f3-adef-97c26b470477-ovndb-tls-certs\") pod \"neutron-6b6897fb44-5d68h\" (UID: \"c0b57ea4-f1b6-47f3-adef-97c26b470477\") " 
pod="openstack/neutron-6b6897fb44-5d68h" Oct 07 08:35:11 crc kubenswrapper[5025]: I1007 08:35:11.520344 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c0b57ea4-f1b6-47f3-adef-97c26b470477-config\") pod \"neutron-6b6897fb44-5d68h\" (UID: \"c0b57ea4-f1b6-47f3-adef-97c26b470477\") " pod="openstack/neutron-6b6897fb44-5d68h" Oct 07 08:35:11 crc kubenswrapper[5025]: I1007 08:35:11.524839 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0b57ea4-f1b6-47f3-adef-97c26b470477-combined-ca-bundle\") pod \"neutron-6b6897fb44-5d68h\" (UID: \"c0b57ea4-f1b6-47f3-adef-97c26b470477\") " pod="openstack/neutron-6b6897fb44-5d68h" Oct 07 08:35:11 crc kubenswrapper[5025]: I1007 08:35:11.527476 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0b57ea4-f1b6-47f3-adef-97c26b470477-ovndb-tls-certs\") pod \"neutron-6b6897fb44-5d68h\" (UID: \"c0b57ea4-f1b6-47f3-adef-97c26b470477\") " pod="openstack/neutron-6b6897fb44-5d68h" Oct 07 08:35:11 crc kubenswrapper[5025]: I1007 08:35:11.534089 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c0b57ea4-f1b6-47f3-adef-97c26b470477-httpd-config\") pod \"neutron-6b6897fb44-5d68h\" (UID: \"c0b57ea4-f1b6-47f3-adef-97c26b470477\") " pod="openstack/neutron-6b6897fb44-5d68h" Oct 07 08:35:11 crc kubenswrapper[5025]: I1007 08:35:11.540212 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c0b57ea4-f1b6-47f3-adef-97c26b470477-config\") pod \"neutron-6b6897fb44-5d68h\" (UID: \"c0b57ea4-f1b6-47f3-adef-97c26b470477\") " pod="openstack/neutron-6b6897fb44-5d68h" Oct 07 08:35:11 crc kubenswrapper[5025]: I1007 08:35:11.540775 5025 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-h5kfz\" (UniqueName: \"kubernetes.io/projected/c0b57ea4-f1b6-47f3-adef-97c26b470477-kube-api-access-h5kfz\") pod \"neutron-6b6897fb44-5d68h\" (UID: \"c0b57ea4-f1b6-47f3-adef-97c26b470477\") " pod="openstack/neutron-6b6897fb44-5d68h" Oct 07 08:35:11 crc kubenswrapper[5025]: I1007 08:35:11.569474 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-vlnnh" Oct 07 08:35:11 crc kubenswrapper[5025]: I1007 08:35:11.675015 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6b6897fb44-5d68h" Oct 07 08:35:13 crc kubenswrapper[5025]: I1007 08:35:13.318203 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5899768569-tz2vj"] Oct 07 08:35:13 crc kubenswrapper[5025]: I1007 08:35:13.320461 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5899768569-tz2vj" Oct 07 08:35:13 crc kubenswrapper[5025]: I1007 08:35:13.323254 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 07 08:35:13 crc kubenswrapper[5025]: I1007 08:35:13.327308 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 07 08:35:13 crc kubenswrapper[5025]: I1007 08:35:13.328153 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5899768569-tz2vj"] Oct 07 08:35:13 crc kubenswrapper[5025]: I1007 08:35:13.356939 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0-combined-ca-bundle\") pod \"neutron-5899768569-tz2vj\" (UID: \"fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0\") " pod="openstack/neutron-5899768569-tz2vj" Oct 07 08:35:13 crc kubenswrapper[5025]: I1007 08:35:13.356986 5025 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skk72\" (UniqueName: \"kubernetes.io/projected/fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0-kube-api-access-skk72\") pod \"neutron-5899768569-tz2vj\" (UID: \"fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0\") " pod="openstack/neutron-5899768569-tz2vj" Oct 07 08:35:13 crc kubenswrapper[5025]: I1007 08:35:13.357065 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0-public-tls-certs\") pod \"neutron-5899768569-tz2vj\" (UID: \"fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0\") " pod="openstack/neutron-5899768569-tz2vj" Oct 07 08:35:13 crc kubenswrapper[5025]: I1007 08:35:13.357092 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0-httpd-config\") pod \"neutron-5899768569-tz2vj\" (UID: \"fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0\") " pod="openstack/neutron-5899768569-tz2vj" Oct 07 08:35:13 crc kubenswrapper[5025]: I1007 08:35:13.357128 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0-config\") pod \"neutron-5899768569-tz2vj\" (UID: \"fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0\") " pod="openstack/neutron-5899768569-tz2vj" Oct 07 08:35:13 crc kubenswrapper[5025]: I1007 08:35:13.357179 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0-ovndb-tls-certs\") pod \"neutron-5899768569-tz2vj\" (UID: \"fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0\") " pod="openstack/neutron-5899768569-tz2vj" Oct 07 08:35:13 crc kubenswrapper[5025]: I1007 08:35:13.357203 5025 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0-internal-tls-certs\") pod \"neutron-5899768569-tz2vj\" (UID: \"fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0\") " pod="openstack/neutron-5899768569-tz2vj" Oct 07 08:35:13 crc kubenswrapper[5025]: I1007 08:35:13.461741 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0-public-tls-certs\") pod \"neutron-5899768569-tz2vj\" (UID: \"fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0\") " pod="openstack/neutron-5899768569-tz2vj" Oct 07 08:35:13 crc kubenswrapper[5025]: I1007 08:35:13.461784 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0-httpd-config\") pod \"neutron-5899768569-tz2vj\" (UID: \"fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0\") " pod="openstack/neutron-5899768569-tz2vj" Oct 07 08:35:13 crc kubenswrapper[5025]: I1007 08:35:13.461819 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0-config\") pod \"neutron-5899768569-tz2vj\" (UID: \"fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0\") " pod="openstack/neutron-5899768569-tz2vj" Oct 07 08:35:13 crc kubenswrapper[5025]: I1007 08:35:13.461863 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0-ovndb-tls-certs\") pod \"neutron-5899768569-tz2vj\" (UID: \"fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0\") " pod="openstack/neutron-5899768569-tz2vj" Oct 07 08:35:13 crc kubenswrapper[5025]: I1007 08:35:13.461896 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0-internal-tls-certs\") pod \"neutron-5899768569-tz2vj\" (UID: \"fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0\") " pod="openstack/neutron-5899768569-tz2vj" Oct 07 08:35:13 crc kubenswrapper[5025]: I1007 08:35:13.461948 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0-combined-ca-bundle\") pod \"neutron-5899768569-tz2vj\" (UID: \"fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0\") " pod="openstack/neutron-5899768569-tz2vj" Oct 07 08:35:13 crc kubenswrapper[5025]: I1007 08:35:13.461970 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skk72\" (UniqueName: \"kubernetes.io/projected/fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0-kube-api-access-skk72\") pod \"neutron-5899768569-tz2vj\" (UID: \"fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0\") " pod="openstack/neutron-5899768569-tz2vj" Oct 07 08:35:13 crc kubenswrapper[5025]: I1007 08:35:13.468410 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0-config\") pod \"neutron-5899768569-tz2vj\" (UID: \"fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0\") " pod="openstack/neutron-5899768569-tz2vj" Oct 07 08:35:13 crc kubenswrapper[5025]: I1007 08:35:13.468811 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0-ovndb-tls-certs\") pod \"neutron-5899768569-tz2vj\" (UID: \"fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0\") " pod="openstack/neutron-5899768569-tz2vj" Oct 07 08:35:13 crc kubenswrapper[5025]: I1007 08:35:13.469234 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0-public-tls-certs\") pod \"neutron-5899768569-tz2vj\" 
(UID: \"fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0\") " pod="openstack/neutron-5899768569-tz2vj" Oct 07 08:35:13 crc kubenswrapper[5025]: I1007 08:35:13.469352 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0-combined-ca-bundle\") pod \"neutron-5899768569-tz2vj\" (UID: \"fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0\") " pod="openstack/neutron-5899768569-tz2vj" Oct 07 08:35:13 crc kubenswrapper[5025]: I1007 08:35:13.475747 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0-internal-tls-certs\") pod \"neutron-5899768569-tz2vj\" (UID: \"fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0\") " pod="openstack/neutron-5899768569-tz2vj" Oct 07 08:35:13 crc kubenswrapper[5025]: I1007 08:35:13.479809 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0-httpd-config\") pod \"neutron-5899768569-tz2vj\" (UID: \"fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0\") " pod="openstack/neutron-5899768569-tz2vj" Oct 07 08:35:13 crc kubenswrapper[5025]: I1007 08:35:13.482309 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skk72\" (UniqueName: \"kubernetes.io/projected/fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0-kube-api-access-skk72\") pod \"neutron-5899768569-tz2vj\" (UID: \"fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0\") " pod="openstack/neutron-5899768569-tz2vj" Oct 07 08:35:13 crc kubenswrapper[5025]: I1007 08:35:13.651583 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5899768569-tz2vj" Oct 07 08:35:15 crc kubenswrapper[5025]: I1007 08:35:15.546738 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-jl8nn" Oct 07 08:35:15 crc kubenswrapper[5025]: I1007 08:35:15.610934 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njh6h\" (UniqueName: \"kubernetes.io/projected/bb4e8f2b-70ba-49fe-b21e-9343ccaa1a11-kube-api-access-njh6h\") pod \"bb4e8f2b-70ba-49fe-b21e-9343ccaa1a11\" (UID: \"bb4e8f2b-70ba-49fe-b21e-9343ccaa1a11\") " Oct 07 08:35:15 crc kubenswrapper[5025]: I1007 08:35:15.610992 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bb4e8f2b-70ba-49fe-b21e-9343ccaa1a11-db-sync-config-data\") pod \"bb4e8f2b-70ba-49fe-b21e-9343ccaa1a11\" (UID: \"bb4e8f2b-70ba-49fe-b21e-9343ccaa1a11\") " Oct 07 08:35:15 crc kubenswrapper[5025]: I1007 08:35:15.611059 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb4e8f2b-70ba-49fe-b21e-9343ccaa1a11-combined-ca-bundle\") pod \"bb4e8f2b-70ba-49fe-b21e-9343ccaa1a11\" (UID: \"bb4e8f2b-70ba-49fe-b21e-9343ccaa1a11\") " Oct 07 08:35:15 crc kubenswrapper[5025]: I1007 08:35:15.639729 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb4e8f2b-70ba-49fe-b21e-9343ccaa1a11-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "bb4e8f2b-70ba-49fe-b21e-9343ccaa1a11" (UID: "bb4e8f2b-70ba-49fe-b21e-9343ccaa1a11"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:35:15 crc kubenswrapper[5025]: I1007 08:35:15.639798 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb4e8f2b-70ba-49fe-b21e-9343ccaa1a11-kube-api-access-njh6h" (OuterVolumeSpecName: "kube-api-access-njh6h") pod "bb4e8f2b-70ba-49fe-b21e-9343ccaa1a11" (UID: "bb4e8f2b-70ba-49fe-b21e-9343ccaa1a11"). 
InnerVolumeSpecName "kube-api-access-njh6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:35:15 crc kubenswrapper[5025]: I1007 08:35:15.651448 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb4e8f2b-70ba-49fe-b21e-9343ccaa1a11-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb4e8f2b-70ba-49fe-b21e-9343ccaa1a11" (UID: "bb4e8f2b-70ba-49fe-b21e-9343ccaa1a11"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:35:15 crc kubenswrapper[5025]: I1007 08:35:15.713155 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njh6h\" (UniqueName: \"kubernetes.io/projected/bb4e8f2b-70ba-49fe-b21e-9343ccaa1a11-kube-api-access-njh6h\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:15 crc kubenswrapper[5025]: I1007 08:35:15.713202 5025 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bb4e8f2b-70ba-49fe-b21e-9343ccaa1a11-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:15 crc kubenswrapper[5025]: I1007 08:35:15.713220 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb4e8f2b-70ba-49fe-b21e-9343ccaa1a11-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:16 crc kubenswrapper[5025]: I1007 08:35:16.134042 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-jl8nn" event={"ID":"bb4e8f2b-70ba-49fe-b21e-9343ccaa1a11","Type":"ContainerDied","Data":"4dbb7df21a51775739fea8bc31dc5ef7049d2a6be485413873749c23d623d6e8"} Oct 07 08:35:16 crc kubenswrapper[5025]: I1007 08:35:16.134412 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4dbb7df21a51775739fea8bc31dc5ef7049d2a6be485413873749c23d623d6e8" Oct 07 08:35:16 crc kubenswrapper[5025]: I1007 08:35:16.134125 5025 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-jl8nn" Oct 07 08:35:16 crc kubenswrapper[5025]: I1007 08:35:16.820450 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7c8774d5b7-qj8g5"] Oct 07 08:35:16 crc kubenswrapper[5025]: E1007 08:35:16.821781 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb4e8f2b-70ba-49fe-b21e-9343ccaa1a11" containerName="barbican-db-sync" Oct 07 08:35:16 crc kubenswrapper[5025]: I1007 08:35:16.821802 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb4e8f2b-70ba-49fe-b21e-9343ccaa1a11" containerName="barbican-db-sync" Oct 07 08:35:16 crc kubenswrapper[5025]: I1007 08:35:16.822124 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb4e8f2b-70ba-49fe-b21e-9343ccaa1a11" containerName="barbican-db-sync" Oct 07 08:35:16 crc kubenswrapper[5025]: I1007 08:35:16.827425 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7c8774d5b7-qj8g5" Oct 07 08:35:16 crc kubenswrapper[5025]: I1007 08:35:16.830085 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 07 08:35:16 crc kubenswrapper[5025]: I1007 08:35:16.834677 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7c8774d5b7-qj8g5"] Oct 07 08:35:16 crc kubenswrapper[5025]: I1007 08:35:16.843038 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 07 08:35:16 crc kubenswrapper[5025]: I1007 08:35:16.843329 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-2nwlt" Oct 07 08:35:16 crc kubenswrapper[5025]: I1007 08:35:16.845638 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-5495b78bc8-wbf5t"] Oct 07 08:35:16 crc kubenswrapper[5025]: I1007 08:35:16.847074 5025 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5495b78bc8-wbf5t" Oct 07 08:35:16 crc kubenswrapper[5025]: I1007 08:35:16.852164 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 07 08:35:16 crc kubenswrapper[5025]: I1007 08:35:16.856503 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5495b78bc8-wbf5t"] Oct 07 08:35:16 crc kubenswrapper[5025]: I1007 08:35:16.920880 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-vlnnh"] Oct 07 08:35:16 crc kubenswrapper[5025]: I1007 08:35:16.934443 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcrdg\" (UniqueName: \"kubernetes.io/projected/6a8356fc-f4bd-4853-a3f3-0d44ab20612b-kube-api-access-dcrdg\") pod \"barbican-worker-7c8774d5b7-qj8g5\" (UID: \"6a8356fc-f4bd-4853-a3f3-0d44ab20612b\") " pod="openstack/barbican-worker-7c8774d5b7-qj8g5" Oct 07 08:35:16 crc kubenswrapper[5025]: I1007 08:35:16.934950 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxs84\" (UniqueName: \"kubernetes.io/projected/51f36c18-61cd-43d1-98a6-b569197c9382-kube-api-access-xxs84\") pod \"barbican-keystone-listener-5495b78bc8-wbf5t\" (UID: \"51f36c18-61cd-43d1-98a6-b569197c9382\") " pod="openstack/barbican-keystone-listener-5495b78bc8-wbf5t" Oct 07 08:35:16 crc kubenswrapper[5025]: I1007 08:35:16.935009 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51f36c18-61cd-43d1-98a6-b569197c9382-combined-ca-bundle\") pod \"barbican-keystone-listener-5495b78bc8-wbf5t\" (UID: \"51f36c18-61cd-43d1-98a6-b569197c9382\") " pod="openstack/barbican-keystone-listener-5495b78bc8-wbf5t" Oct 07 08:35:16 crc 
kubenswrapper[5025]: I1007 08:35:16.935047 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a8356fc-f4bd-4853-a3f3-0d44ab20612b-config-data\") pod \"barbican-worker-7c8774d5b7-qj8g5\" (UID: \"6a8356fc-f4bd-4853-a3f3-0d44ab20612b\") " pod="openstack/barbican-worker-7c8774d5b7-qj8g5" Oct 07 08:35:16 crc kubenswrapper[5025]: I1007 08:35:16.935070 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6a8356fc-f4bd-4853-a3f3-0d44ab20612b-config-data-custom\") pod \"barbican-worker-7c8774d5b7-qj8g5\" (UID: \"6a8356fc-f4bd-4853-a3f3-0d44ab20612b\") " pod="openstack/barbican-worker-7c8774d5b7-qj8g5" Oct 07 08:35:16 crc kubenswrapper[5025]: I1007 08:35:16.935091 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/51f36c18-61cd-43d1-98a6-b569197c9382-config-data-custom\") pod \"barbican-keystone-listener-5495b78bc8-wbf5t\" (UID: \"51f36c18-61cd-43d1-98a6-b569197c9382\") " pod="openstack/barbican-keystone-listener-5495b78bc8-wbf5t" Oct 07 08:35:16 crc kubenswrapper[5025]: I1007 08:35:16.935122 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a8356fc-f4bd-4853-a3f3-0d44ab20612b-combined-ca-bundle\") pod \"barbican-worker-7c8774d5b7-qj8g5\" (UID: \"6a8356fc-f4bd-4853-a3f3-0d44ab20612b\") " pod="openstack/barbican-worker-7c8774d5b7-qj8g5" Oct 07 08:35:16 crc kubenswrapper[5025]: I1007 08:35:16.935142 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a8356fc-f4bd-4853-a3f3-0d44ab20612b-logs\") pod \"barbican-worker-7c8774d5b7-qj8g5\" (UID: 
\"6a8356fc-f4bd-4853-a3f3-0d44ab20612b\") " pod="openstack/barbican-worker-7c8774d5b7-qj8g5" Oct 07 08:35:16 crc kubenswrapper[5025]: I1007 08:35:16.935196 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51f36c18-61cd-43d1-98a6-b569197c9382-config-data\") pod \"barbican-keystone-listener-5495b78bc8-wbf5t\" (UID: \"51f36c18-61cd-43d1-98a6-b569197c9382\") " pod="openstack/barbican-keystone-listener-5495b78bc8-wbf5t" Oct 07 08:35:16 crc kubenswrapper[5025]: I1007 08:35:16.935228 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51f36c18-61cd-43d1-98a6-b569197c9382-logs\") pod \"barbican-keystone-listener-5495b78bc8-wbf5t\" (UID: \"51f36c18-61cd-43d1-98a6-b569197c9382\") " pod="openstack/barbican-keystone-listener-5495b78bc8-wbf5t" Oct 07 08:35:16 crc kubenswrapper[5025]: I1007 08:35:16.975469 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-vb7rc"] Oct 07 08:35:16 crc kubenswrapper[5025]: I1007 08:35:16.989035 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-vb7rc" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.026834 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-vb7rc"] Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.036499 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe96ba85-9487-44de-87f3-e202c1a30fa6-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-vb7rc\" (UID: \"fe96ba85-9487-44de-87f3-e202c1a30fa6\") " pod="openstack/dnsmasq-dns-688c87cc99-vb7rc" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.036568 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxs84\" (UniqueName: \"kubernetes.io/projected/51f36c18-61cd-43d1-98a6-b569197c9382-kube-api-access-xxs84\") pod \"barbican-keystone-listener-5495b78bc8-wbf5t\" (UID: \"51f36c18-61cd-43d1-98a6-b569197c9382\") " pod="openstack/barbican-keystone-listener-5495b78bc8-wbf5t" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.036637 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51f36c18-61cd-43d1-98a6-b569197c9382-combined-ca-bundle\") pod \"barbican-keystone-listener-5495b78bc8-wbf5t\" (UID: \"51f36c18-61cd-43d1-98a6-b569197c9382\") " pod="openstack/barbican-keystone-listener-5495b78bc8-wbf5t" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.036672 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a8356fc-f4bd-4853-a3f3-0d44ab20612b-config-data\") pod \"barbican-worker-7c8774d5b7-qj8g5\" (UID: \"6a8356fc-f4bd-4853-a3f3-0d44ab20612b\") " pod="openstack/barbican-worker-7c8774d5b7-qj8g5" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.036692 5025 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgh8h\" (UniqueName: \"kubernetes.io/projected/fe96ba85-9487-44de-87f3-e202c1a30fa6-kube-api-access-qgh8h\") pod \"dnsmasq-dns-688c87cc99-vb7rc\" (UID: \"fe96ba85-9487-44de-87f3-e202c1a30fa6\") " pod="openstack/dnsmasq-dns-688c87cc99-vb7rc" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.036713 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6a8356fc-f4bd-4853-a3f3-0d44ab20612b-config-data-custom\") pod \"barbican-worker-7c8774d5b7-qj8g5\" (UID: \"6a8356fc-f4bd-4853-a3f3-0d44ab20612b\") " pod="openstack/barbican-worker-7c8774d5b7-qj8g5" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.036734 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/51f36c18-61cd-43d1-98a6-b569197c9382-config-data-custom\") pod \"barbican-keystone-listener-5495b78bc8-wbf5t\" (UID: \"51f36c18-61cd-43d1-98a6-b569197c9382\") " pod="openstack/barbican-keystone-listener-5495b78bc8-wbf5t" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.036753 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fe96ba85-9487-44de-87f3-e202c1a30fa6-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-vb7rc\" (UID: \"fe96ba85-9487-44de-87f3-e202c1a30fa6\") " pod="openstack/dnsmasq-dns-688c87cc99-vb7rc" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.036790 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a8356fc-f4bd-4853-a3f3-0d44ab20612b-combined-ca-bundle\") pod \"barbican-worker-7c8774d5b7-qj8g5\" (UID: \"6a8356fc-f4bd-4853-a3f3-0d44ab20612b\") " pod="openstack/barbican-worker-7c8774d5b7-qj8g5" Oct 07 08:35:17 crc 
kubenswrapper[5025]: I1007 08:35:17.036806 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a8356fc-f4bd-4853-a3f3-0d44ab20612b-logs\") pod \"barbican-worker-7c8774d5b7-qj8g5\" (UID: \"6a8356fc-f4bd-4853-a3f3-0d44ab20612b\") " pod="openstack/barbican-worker-7c8774d5b7-qj8g5" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.036839 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe96ba85-9487-44de-87f3-e202c1a30fa6-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-vb7rc\" (UID: \"fe96ba85-9487-44de-87f3-e202c1a30fa6\") " pod="openstack/dnsmasq-dns-688c87cc99-vb7rc" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.036861 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51f36c18-61cd-43d1-98a6-b569197c9382-config-data\") pod \"barbican-keystone-listener-5495b78bc8-wbf5t\" (UID: \"51f36c18-61cd-43d1-98a6-b569197c9382\") " pod="openstack/barbican-keystone-listener-5495b78bc8-wbf5t" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.036877 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51f36c18-61cd-43d1-98a6-b569197c9382-logs\") pod \"barbican-keystone-listener-5495b78bc8-wbf5t\" (UID: \"51f36c18-61cd-43d1-98a6-b569197c9382\") " pod="openstack/barbican-keystone-listener-5495b78bc8-wbf5t" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.036893 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe96ba85-9487-44de-87f3-e202c1a30fa6-config\") pod \"dnsmasq-dns-688c87cc99-vb7rc\" (UID: \"fe96ba85-9487-44de-87f3-e202c1a30fa6\") " pod="openstack/dnsmasq-dns-688c87cc99-vb7rc" Oct 07 08:35:17 crc 
kubenswrapper[5025]: I1007 08:35:17.036910 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe96ba85-9487-44de-87f3-e202c1a30fa6-dns-svc\") pod \"dnsmasq-dns-688c87cc99-vb7rc\" (UID: \"fe96ba85-9487-44de-87f3-e202c1a30fa6\") " pod="openstack/dnsmasq-dns-688c87cc99-vb7rc" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.036943 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcrdg\" (UniqueName: \"kubernetes.io/projected/6a8356fc-f4bd-4853-a3f3-0d44ab20612b-kube-api-access-dcrdg\") pod \"barbican-worker-7c8774d5b7-qj8g5\" (UID: \"6a8356fc-f4bd-4853-a3f3-0d44ab20612b\") " pod="openstack/barbican-worker-7c8774d5b7-qj8g5" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.048577 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a8356fc-f4bd-4853-a3f3-0d44ab20612b-logs\") pod \"barbican-worker-7c8774d5b7-qj8g5\" (UID: \"6a8356fc-f4bd-4853-a3f3-0d44ab20612b\") " pod="openstack/barbican-worker-7c8774d5b7-qj8g5" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.048805 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51f36c18-61cd-43d1-98a6-b569197c9382-combined-ca-bundle\") pod \"barbican-keystone-listener-5495b78bc8-wbf5t\" (UID: \"51f36c18-61cd-43d1-98a6-b569197c9382\") " pod="openstack/barbican-keystone-listener-5495b78bc8-wbf5t" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.062256 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/51f36c18-61cd-43d1-98a6-b569197c9382-config-data-custom\") pod \"barbican-keystone-listener-5495b78bc8-wbf5t\" (UID: \"51f36c18-61cd-43d1-98a6-b569197c9382\") " pod="openstack/barbican-keystone-listener-5495b78bc8-wbf5t" Oct 07 
08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.062650 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51f36c18-61cd-43d1-98a6-b569197c9382-logs\") pod \"barbican-keystone-listener-5495b78bc8-wbf5t\" (UID: \"51f36c18-61cd-43d1-98a6-b569197c9382\") " pod="openstack/barbican-keystone-listener-5495b78bc8-wbf5t" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.063314 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a8356fc-f4bd-4853-a3f3-0d44ab20612b-combined-ca-bundle\") pod \"barbican-worker-7c8774d5b7-qj8g5\" (UID: \"6a8356fc-f4bd-4853-a3f3-0d44ab20612b\") " pod="openstack/barbican-worker-7c8774d5b7-qj8g5" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.071208 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6a8356fc-f4bd-4853-a3f3-0d44ab20612b-config-data-custom\") pod \"barbican-worker-7c8774d5b7-qj8g5\" (UID: \"6a8356fc-f4bd-4853-a3f3-0d44ab20612b\") " pod="openstack/barbican-worker-7c8774d5b7-qj8g5" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.073103 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcrdg\" (UniqueName: \"kubernetes.io/projected/6a8356fc-f4bd-4853-a3f3-0d44ab20612b-kube-api-access-dcrdg\") pod \"barbican-worker-7c8774d5b7-qj8g5\" (UID: \"6a8356fc-f4bd-4853-a3f3-0d44ab20612b\") " pod="openstack/barbican-worker-7c8774d5b7-qj8g5" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.089643 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a8356fc-f4bd-4853-a3f3-0d44ab20612b-config-data\") pod \"barbican-worker-7c8774d5b7-qj8g5\" (UID: \"6a8356fc-f4bd-4853-a3f3-0d44ab20612b\") " pod="openstack/barbican-worker-7c8774d5b7-qj8g5" Oct 07 08:35:17 crc kubenswrapper[5025]: 
I1007 08:35:17.090838 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51f36c18-61cd-43d1-98a6-b569197c9382-config-data\") pod \"barbican-keystone-listener-5495b78bc8-wbf5t\" (UID: \"51f36c18-61cd-43d1-98a6-b569197c9382\") " pod="openstack/barbican-keystone-listener-5495b78bc8-wbf5t" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.092126 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxs84\" (UniqueName: \"kubernetes.io/projected/51f36c18-61cd-43d1-98a6-b569197c9382-kube-api-access-xxs84\") pod \"barbican-keystone-listener-5495b78bc8-wbf5t\" (UID: \"51f36c18-61cd-43d1-98a6-b569197c9382\") " pod="openstack/barbican-keystone-listener-5495b78bc8-wbf5t" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.138435 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgh8h\" (UniqueName: \"kubernetes.io/projected/fe96ba85-9487-44de-87f3-e202c1a30fa6-kube-api-access-qgh8h\") pod \"dnsmasq-dns-688c87cc99-vb7rc\" (UID: \"fe96ba85-9487-44de-87f3-e202c1a30fa6\") " pod="openstack/dnsmasq-dns-688c87cc99-vb7rc" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.138490 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fe96ba85-9487-44de-87f3-e202c1a30fa6-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-vb7rc\" (UID: \"fe96ba85-9487-44de-87f3-e202c1a30fa6\") " pod="openstack/dnsmasq-dns-688c87cc99-vb7rc" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.138568 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe96ba85-9487-44de-87f3-e202c1a30fa6-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-vb7rc\" (UID: \"fe96ba85-9487-44de-87f3-e202c1a30fa6\") " pod="openstack/dnsmasq-dns-688c87cc99-vb7rc" Oct 07 08:35:17 crc 
kubenswrapper[5025]: I1007 08:35:17.138590 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe96ba85-9487-44de-87f3-e202c1a30fa6-config\") pod \"dnsmasq-dns-688c87cc99-vb7rc\" (UID: \"fe96ba85-9487-44de-87f3-e202c1a30fa6\") " pod="openstack/dnsmasq-dns-688c87cc99-vb7rc" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.138607 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe96ba85-9487-44de-87f3-e202c1a30fa6-dns-svc\") pod \"dnsmasq-dns-688c87cc99-vb7rc\" (UID: \"fe96ba85-9487-44de-87f3-e202c1a30fa6\") " pod="openstack/dnsmasq-dns-688c87cc99-vb7rc" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.138660 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe96ba85-9487-44de-87f3-e202c1a30fa6-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-vb7rc\" (UID: \"fe96ba85-9487-44de-87f3-e202c1a30fa6\") " pod="openstack/dnsmasq-dns-688c87cc99-vb7rc" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.139689 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe96ba85-9487-44de-87f3-e202c1a30fa6-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-vb7rc\" (UID: \"fe96ba85-9487-44de-87f3-e202c1a30fa6\") " pod="openstack/dnsmasq-dns-688c87cc99-vb7rc" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.140773 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe96ba85-9487-44de-87f3-e202c1a30fa6-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-vb7rc\" (UID: \"fe96ba85-9487-44de-87f3-e202c1a30fa6\") " pod="openstack/dnsmasq-dns-688c87cc99-vb7rc" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.141373 5025 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fe96ba85-9487-44de-87f3-e202c1a30fa6-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-vb7rc\" (UID: \"fe96ba85-9487-44de-87f3-e202c1a30fa6\") " pod="openstack/dnsmasq-dns-688c87cc99-vb7rc" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.143195 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe96ba85-9487-44de-87f3-e202c1a30fa6-dns-svc\") pod \"dnsmasq-dns-688c87cc99-vb7rc\" (UID: \"fe96ba85-9487-44de-87f3-e202c1a30fa6\") " pod="openstack/dnsmasq-dns-688c87cc99-vb7rc" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.143680 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe96ba85-9487-44de-87f3-e202c1a30fa6-config\") pod \"dnsmasq-dns-688c87cc99-vb7rc\" (UID: \"fe96ba85-9487-44de-87f3-e202c1a30fa6\") " pod="openstack/dnsmasq-dns-688c87cc99-vb7rc" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.158215 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6db9c4d6b-sjqbc"] Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.159950 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6db9c4d6b-sjqbc" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.162165 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.173615 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6db9c4d6b-sjqbc"] Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.174042 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7c8774d5b7-qj8g5" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.185207 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgh8h\" (UniqueName: \"kubernetes.io/projected/fe96ba85-9487-44de-87f3-e202c1a30fa6-kube-api-access-qgh8h\") pod \"dnsmasq-dns-688c87cc99-vb7rc\" (UID: \"fe96ba85-9487-44de-87f3-e202c1a30fa6\") " pod="openstack/dnsmasq-dns-688c87cc99-vb7rc" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.193780 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5495b78bc8-wbf5t" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.201121 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c06c9321-437f-4b6a-b205-b56758517e75" containerName="ceilometer-central-agent" containerID="cri-o://737e7948bdf87a15a50420e74839cfd300bc5557fa6b1e61dc0e5a1cf2a12956" gracePeriod=30 Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.202678 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-vlnnh"] Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.202990 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.203015 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c06c9321-437f-4b6a-b205-b56758517e75","Type":"ContainerStarted","Data":"9492f4bba0d7e41ce6ac2fbc75a6e4b6d846b88d5ac596da12a296d2a3b26e0b"} Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.203062 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c06c9321-437f-4b6a-b205-b56758517e75" containerName="sg-core" containerID="cri-o://55b9ead6cbc3246ada2c749e825680008bf0b9eb8b6c9ba49075a691ca9d6c03" gracePeriod=30 Oct 07 
08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.203142 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c06c9321-437f-4b6a-b205-b56758517e75" containerName="proxy-httpd" containerID="cri-o://9492f4bba0d7e41ce6ac2fbc75a6e4b6d846b88d5ac596da12a296d2a3b26e0b" gracePeriod=30 Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.203202 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c06c9321-437f-4b6a-b205-b56758517e75" containerName="ceilometer-notification-agent" containerID="cri-o://192a322439949335b4f9eade6df4da3ee83390e6d965ca42196d09c64b3d7944" gracePeriod=30 Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.277294 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a66c793a-fadc-42c2-9a01-1b0cb017f773-logs\") pod \"barbican-api-6db9c4d6b-sjqbc\" (UID: \"a66c793a-fadc-42c2-9a01-1b0cb017f773\") " pod="openstack/barbican-api-6db9c4d6b-sjqbc" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.277560 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a66c793a-fadc-42c2-9a01-1b0cb017f773-config-data-custom\") pod \"barbican-api-6db9c4d6b-sjqbc\" (UID: \"a66c793a-fadc-42c2-9a01-1b0cb017f773\") " pod="openstack/barbican-api-6db9c4d6b-sjqbc" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.277638 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a66c793a-fadc-42c2-9a01-1b0cb017f773-config-data\") pod \"barbican-api-6db9c4d6b-sjqbc\" (UID: \"a66c793a-fadc-42c2-9a01-1b0cb017f773\") " pod="openstack/barbican-api-6db9c4d6b-sjqbc" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.277765 5025 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjr7l\" (UniqueName: \"kubernetes.io/projected/a66c793a-fadc-42c2-9a01-1b0cb017f773-kube-api-access-zjr7l\") pod \"barbican-api-6db9c4d6b-sjqbc\" (UID: \"a66c793a-fadc-42c2-9a01-1b0cb017f773\") " pod="openstack/barbican-api-6db9c4d6b-sjqbc" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.277837 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a66c793a-fadc-42c2-9a01-1b0cb017f773-combined-ca-bundle\") pod \"barbican-api-6db9c4d6b-sjqbc\" (UID: \"a66c793a-fadc-42c2-9a01-1b0cb017f773\") " pod="openstack/barbican-api-6db9c4d6b-sjqbc" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.292085 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.847457092 podStartE2EDuration="43.292061312s" podCreationTimestamp="2025-10-07 08:34:34 +0000 UTC" firstStartedPulling="2025-10-07 08:34:36.083828883 +0000 UTC m=+1082.893143027" lastFinishedPulling="2025-10-07 08:35:16.528433103 +0000 UTC m=+1123.337747247" observedRunningTime="2025-10-07 08:35:17.274638733 +0000 UTC m=+1124.083952877" watchObservedRunningTime="2025-10-07 08:35:17.292061312 +0000 UTC m=+1124.101375446" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.315993 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-vb7rc" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.379898 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a66c793a-fadc-42c2-9a01-1b0cb017f773-logs\") pod \"barbican-api-6db9c4d6b-sjqbc\" (UID: \"a66c793a-fadc-42c2-9a01-1b0cb017f773\") " pod="openstack/barbican-api-6db9c4d6b-sjqbc" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.379989 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a66c793a-fadc-42c2-9a01-1b0cb017f773-config-data-custom\") pod \"barbican-api-6db9c4d6b-sjqbc\" (UID: \"a66c793a-fadc-42c2-9a01-1b0cb017f773\") " pod="openstack/barbican-api-6db9c4d6b-sjqbc" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.380011 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a66c793a-fadc-42c2-9a01-1b0cb017f773-config-data\") pod \"barbican-api-6db9c4d6b-sjqbc\" (UID: \"a66c793a-fadc-42c2-9a01-1b0cb017f773\") " pod="openstack/barbican-api-6db9c4d6b-sjqbc" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.380065 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjr7l\" (UniqueName: \"kubernetes.io/projected/a66c793a-fadc-42c2-9a01-1b0cb017f773-kube-api-access-zjr7l\") pod \"barbican-api-6db9c4d6b-sjqbc\" (UID: \"a66c793a-fadc-42c2-9a01-1b0cb017f773\") " pod="openstack/barbican-api-6db9c4d6b-sjqbc" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.380112 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a66c793a-fadc-42c2-9a01-1b0cb017f773-combined-ca-bundle\") pod \"barbican-api-6db9c4d6b-sjqbc\" (UID: \"a66c793a-fadc-42c2-9a01-1b0cb017f773\") " 
pod="openstack/barbican-api-6db9c4d6b-sjqbc" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.380435 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a66c793a-fadc-42c2-9a01-1b0cb017f773-logs\") pod \"barbican-api-6db9c4d6b-sjqbc\" (UID: \"a66c793a-fadc-42c2-9a01-1b0cb017f773\") " pod="openstack/barbican-api-6db9c4d6b-sjqbc" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.386564 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a66c793a-fadc-42c2-9a01-1b0cb017f773-combined-ca-bundle\") pod \"barbican-api-6db9c4d6b-sjqbc\" (UID: \"a66c793a-fadc-42c2-9a01-1b0cb017f773\") " pod="openstack/barbican-api-6db9c4d6b-sjqbc" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.388864 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a66c793a-fadc-42c2-9a01-1b0cb017f773-config-data-custom\") pod \"barbican-api-6db9c4d6b-sjqbc\" (UID: \"a66c793a-fadc-42c2-9a01-1b0cb017f773\") " pod="openstack/barbican-api-6db9c4d6b-sjqbc" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.399016 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a66c793a-fadc-42c2-9a01-1b0cb017f773-config-data\") pod \"barbican-api-6db9c4d6b-sjqbc\" (UID: \"a66c793a-fadc-42c2-9a01-1b0cb017f773\") " pod="openstack/barbican-api-6db9c4d6b-sjqbc" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.420727 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5899768569-tz2vj"] Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.424282 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjr7l\" (UniqueName: \"kubernetes.io/projected/a66c793a-fadc-42c2-9a01-1b0cb017f773-kube-api-access-zjr7l\") pod 
\"barbican-api-6db9c4d6b-sjqbc\" (UID: \"a66c793a-fadc-42c2-9a01-1b0cb017f773\") " pod="openstack/barbican-api-6db9c4d6b-sjqbc" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.536989 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6db9c4d6b-sjqbc" Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.724316 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7c8774d5b7-qj8g5"] Oct 07 08:35:17 crc kubenswrapper[5025]: W1007 08:35:17.726228 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a8356fc_f4bd_4853_a3f3_0d44ab20612b.slice/crio-ecf9c0eb66f8b326535013cd9b64e868308df29c17feb09d2ddbf3f6f14c7fdc WatchSource:0}: Error finding container ecf9c0eb66f8b326535013cd9b64e868308df29c17feb09d2ddbf3f6f14c7fdc: Status 404 returned error can't find the container with id ecf9c0eb66f8b326535013cd9b64e868308df29c17feb09d2ddbf3f6f14c7fdc Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.841318 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5495b78bc8-wbf5t"] Oct 07 08:35:17 crc kubenswrapper[5025]: W1007 08:35:17.896547 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51f36c18_61cd_43d1_98a6_b569197c9382.slice/crio-25dae93d3e9a16a849ae543e7e7ae79456846f66e20eb8989152f27dc41f9228 WatchSource:0}: Error finding container 25dae93d3e9a16a849ae543e7e7ae79456846f66e20eb8989152f27dc41f9228: Status 404 returned error can't find the container with id 25dae93d3e9a16a849ae543e7e7ae79456846f66e20eb8989152f27dc41f9228 Oct 07 08:35:17 crc kubenswrapper[5025]: W1007 08:35:17.952854 5025 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe96ba85_9487_44de_87f3_e202c1a30fa6.slice/crio-701e5689bcc444cef57fef16888a655ed4dc9c9b617cd1bbd260536f5799e2c5 WatchSource:0}: Error finding container 701e5689bcc444cef57fef16888a655ed4dc9c9b617cd1bbd260536f5799e2c5: Status 404 returned error can't find the container with id 701e5689bcc444cef57fef16888a655ed4dc9c9b617cd1bbd260536f5799e2c5 Oct 07 08:35:17 crc kubenswrapper[5025]: I1007 08:35:17.966445 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-vb7rc"] Oct 07 08:35:18 crc kubenswrapper[5025]: I1007 08:35:18.100535 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6db9c4d6b-sjqbc"] Oct 07 08:35:18 crc kubenswrapper[5025]: W1007 08:35:18.116050 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda66c793a_fadc_42c2_9a01_1b0cb017f773.slice/crio-118fdc114dea4a6a3eff0a1f617a934c896d4cc814847f21c34c990c8bc46506 WatchSource:0}: Error finding container 118fdc114dea4a6a3eff0a1f617a934c896d4cc814847f21c34c990c8bc46506: Status 404 returned error can't find the container with id 118fdc114dea4a6a3eff0a1f617a934c896d4cc814847f21c34c990c8bc46506 Oct 07 08:35:18 crc kubenswrapper[5025]: I1007 08:35:18.216340 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6db9c4d6b-sjqbc" event={"ID":"a66c793a-fadc-42c2-9a01-1b0cb017f773","Type":"ContainerStarted","Data":"118fdc114dea4a6a3eff0a1f617a934c896d4cc814847f21c34c990c8bc46506"} Oct 07 08:35:18 crc kubenswrapper[5025]: I1007 08:35:18.218561 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5899768569-tz2vj" event={"ID":"fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0","Type":"ContainerStarted","Data":"76af13f38bf57b12ea10ac6d1ec8e591e15b727a96e03aeaffeda063e26481de"} Oct 07 08:35:18 crc kubenswrapper[5025]: I1007 08:35:18.218662 5025 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/neutron-5899768569-tz2vj" event={"ID":"fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0","Type":"ContainerStarted","Data":"d0cadcb14b122e4544f81e647b5313ffc159d240c42a8c28574e6aa18ad57dbf"} Oct 07 08:35:18 crc kubenswrapper[5025]: I1007 08:35:18.218742 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5899768569-tz2vj" event={"ID":"fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0","Type":"ContainerStarted","Data":"4e3ca449f35b47e8fffe908f1a523367a196377df1605e518e8e83e7428304e3"} Oct 07 08:35:18 crc kubenswrapper[5025]: I1007 08:35:18.218824 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5899768569-tz2vj" Oct 07 08:35:18 crc kubenswrapper[5025]: I1007 08:35:18.219769 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5495b78bc8-wbf5t" event={"ID":"51f36c18-61cd-43d1-98a6-b569197c9382","Type":"ContainerStarted","Data":"25dae93d3e9a16a849ae543e7e7ae79456846f66e20eb8989152f27dc41f9228"} Oct 07 08:35:18 crc kubenswrapper[5025]: I1007 08:35:18.222107 5025 generic.go:334] "Generic (PLEG): container finished" podID="3cb6f85b-3d51-481c-a9c8-dfd8e00c0700" containerID="a4ba9c281726f5ac9dec286a5cfb425f4d86009c021f2007abe924c258a1a4f2" exitCode=0 Oct 07 08:35:18 crc kubenswrapper[5025]: I1007 08:35:18.222242 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-vlnnh" event={"ID":"3cb6f85b-3d51-481c-a9c8-dfd8e00c0700","Type":"ContainerDied","Data":"a4ba9c281726f5ac9dec286a5cfb425f4d86009c021f2007abe924c258a1a4f2"} Oct 07 08:35:18 crc kubenswrapper[5025]: I1007 08:35:18.222268 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-vlnnh" event={"ID":"3cb6f85b-3d51-481c-a9c8-dfd8e00c0700","Type":"ContainerStarted","Data":"56c00ca8840f7f4fd9c74feb94fa2d4a28b081d02fb75aeba787710fa7e89ca6"} Oct 07 08:35:18 crc kubenswrapper[5025]: I1007 08:35:18.224681 5025 
generic.go:334] "Generic (PLEG): container finished" podID="fe96ba85-9487-44de-87f3-e202c1a30fa6" containerID="caa08dee4e8bf4e7254f9f1c798ce5cb8f708c653e45d6f2fc4d545702b7e504" exitCode=0 Oct 07 08:35:18 crc kubenswrapper[5025]: I1007 08:35:18.224765 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-vb7rc" event={"ID":"fe96ba85-9487-44de-87f3-e202c1a30fa6","Type":"ContainerDied","Data":"caa08dee4e8bf4e7254f9f1c798ce5cb8f708c653e45d6f2fc4d545702b7e504"} Oct 07 08:35:18 crc kubenswrapper[5025]: I1007 08:35:18.224802 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-vb7rc" event={"ID":"fe96ba85-9487-44de-87f3-e202c1a30fa6","Type":"ContainerStarted","Data":"701e5689bcc444cef57fef16888a655ed4dc9c9b617cd1bbd260536f5799e2c5"} Oct 07 08:35:18 crc kubenswrapper[5025]: I1007 08:35:18.232878 5025 generic.go:334] "Generic (PLEG): container finished" podID="c06c9321-437f-4b6a-b205-b56758517e75" containerID="9492f4bba0d7e41ce6ac2fbc75a6e4b6d846b88d5ac596da12a296d2a3b26e0b" exitCode=0 Oct 07 08:35:18 crc kubenswrapper[5025]: I1007 08:35:18.232927 5025 generic.go:334] "Generic (PLEG): container finished" podID="c06c9321-437f-4b6a-b205-b56758517e75" containerID="55b9ead6cbc3246ada2c749e825680008bf0b9eb8b6c9ba49075a691ca9d6c03" exitCode=2 Oct 07 08:35:18 crc kubenswrapper[5025]: I1007 08:35:18.232937 5025 generic.go:334] "Generic (PLEG): container finished" podID="c06c9321-437f-4b6a-b205-b56758517e75" containerID="737e7948bdf87a15a50420e74839cfd300bc5557fa6b1e61dc0e5a1cf2a12956" exitCode=0 Oct 07 08:35:18 crc kubenswrapper[5025]: I1007 08:35:18.232932 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c06c9321-437f-4b6a-b205-b56758517e75","Type":"ContainerDied","Data":"9492f4bba0d7e41ce6ac2fbc75a6e4b6d846b88d5ac596da12a296d2a3b26e0b"} Oct 07 08:35:18 crc kubenswrapper[5025]: I1007 08:35:18.232985 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"c06c9321-437f-4b6a-b205-b56758517e75","Type":"ContainerDied","Data":"55b9ead6cbc3246ada2c749e825680008bf0b9eb8b6c9ba49075a691ca9d6c03"} Oct 07 08:35:18 crc kubenswrapper[5025]: I1007 08:35:18.233000 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c06c9321-437f-4b6a-b205-b56758517e75","Type":"ContainerDied","Data":"737e7948bdf87a15a50420e74839cfd300bc5557fa6b1e61dc0e5a1cf2a12956"} Oct 07 08:35:18 crc kubenswrapper[5025]: I1007 08:35:18.234460 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7c8774d5b7-qj8g5" event={"ID":"6a8356fc-f4bd-4853-a3f3-0d44ab20612b","Type":"ContainerStarted","Data":"ecf9c0eb66f8b326535013cd9b64e868308df29c17feb09d2ddbf3f6f14c7fdc"} Oct 07 08:35:18 crc kubenswrapper[5025]: I1007 08:35:18.242359 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5899768569-tz2vj" podStartSLOduration=5.24234235 podStartE2EDuration="5.24234235s" podCreationTimestamp="2025-10-07 08:35:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:35:18.237157267 +0000 UTC m=+1125.046471411" watchObservedRunningTime="2025-10-07 08:35:18.24234235 +0000 UTC m=+1125.051656494" Oct 07 08:35:18 crc kubenswrapper[5025]: I1007 08:35:18.378195 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6b6897fb44-5d68h"] Oct 07 08:35:18 crc kubenswrapper[5025]: I1007 08:35:18.654049 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-vlnnh" Oct 07 08:35:18 crc kubenswrapper[5025]: I1007 08:35:18.718890 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3cb6f85b-3d51-481c-a9c8-dfd8e00c0700-ovsdbserver-sb\") pod \"3cb6f85b-3d51-481c-a9c8-dfd8e00c0700\" (UID: \"3cb6f85b-3d51-481c-a9c8-dfd8e00c0700\") " Oct 07 08:35:18 crc kubenswrapper[5025]: I1007 08:35:18.719070 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3cb6f85b-3d51-481c-a9c8-dfd8e00c0700-config\") pod \"3cb6f85b-3d51-481c-a9c8-dfd8e00c0700\" (UID: \"3cb6f85b-3d51-481c-a9c8-dfd8e00c0700\") " Oct 07 08:35:18 crc kubenswrapper[5025]: I1007 08:35:18.719188 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8228t\" (UniqueName: \"kubernetes.io/projected/3cb6f85b-3d51-481c-a9c8-dfd8e00c0700-kube-api-access-8228t\") pod \"3cb6f85b-3d51-481c-a9c8-dfd8e00c0700\" (UID: \"3cb6f85b-3d51-481c-a9c8-dfd8e00c0700\") " Oct 07 08:35:18 crc kubenswrapper[5025]: I1007 08:35:18.719250 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3cb6f85b-3d51-481c-a9c8-dfd8e00c0700-dns-svc\") pod \"3cb6f85b-3d51-481c-a9c8-dfd8e00c0700\" (UID: \"3cb6f85b-3d51-481c-a9c8-dfd8e00c0700\") " Oct 07 08:35:18 crc kubenswrapper[5025]: I1007 08:35:18.719309 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3cb6f85b-3d51-481c-a9c8-dfd8e00c0700-dns-swift-storage-0\") pod \"3cb6f85b-3d51-481c-a9c8-dfd8e00c0700\" (UID: \"3cb6f85b-3d51-481c-a9c8-dfd8e00c0700\") " Oct 07 08:35:18 crc kubenswrapper[5025]: I1007 08:35:18.719341 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/3cb6f85b-3d51-481c-a9c8-dfd8e00c0700-ovsdbserver-nb\") pod \"3cb6f85b-3d51-481c-a9c8-dfd8e00c0700\" (UID: \"3cb6f85b-3d51-481c-a9c8-dfd8e00c0700\") " Oct 07 08:35:18 crc kubenswrapper[5025]: I1007 08:35:18.728115 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb6f85b-3d51-481c-a9c8-dfd8e00c0700-kube-api-access-8228t" (OuterVolumeSpecName: "kube-api-access-8228t") pod "3cb6f85b-3d51-481c-a9c8-dfd8e00c0700" (UID: "3cb6f85b-3d51-481c-a9c8-dfd8e00c0700"). InnerVolumeSpecName "kube-api-access-8228t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:35:18 crc kubenswrapper[5025]: I1007 08:35:18.757365 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb6f85b-3d51-481c-a9c8-dfd8e00c0700-config" (OuterVolumeSpecName: "config") pod "3cb6f85b-3d51-481c-a9c8-dfd8e00c0700" (UID: "3cb6f85b-3d51-481c-a9c8-dfd8e00c0700"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:35:18 crc kubenswrapper[5025]: I1007 08:35:18.771157 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb6f85b-3d51-481c-a9c8-dfd8e00c0700-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3cb6f85b-3d51-481c-a9c8-dfd8e00c0700" (UID: "3cb6f85b-3d51-481c-a9c8-dfd8e00c0700"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:35:18 crc kubenswrapper[5025]: I1007 08:35:18.778608 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb6f85b-3d51-481c-a9c8-dfd8e00c0700-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3cb6f85b-3d51-481c-a9c8-dfd8e00c0700" (UID: "3cb6f85b-3d51-481c-a9c8-dfd8e00c0700"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:35:18 crc kubenswrapper[5025]: I1007 08:35:18.785796 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb6f85b-3d51-481c-a9c8-dfd8e00c0700-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3cb6f85b-3d51-481c-a9c8-dfd8e00c0700" (UID: "3cb6f85b-3d51-481c-a9c8-dfd8e00c0700"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:35:18 crc kubenswrapper[5025]: I1007 08:35:18.798223 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb6f85b-3d51-481c-a9c8-dfd8e00c0700-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3cb6f85b-3d51-481c-a9c8-dfd8e00c0700" (UID: "3cb6f85b-3d51-481c-a9c8-dfd8e00c0700"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:35:18 crc kubenswrapper[5025]: I1007 08:35:18.821896 5025 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3cb6f85b-3d51-481c-a9c8-dfd8e00c0700-config\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:18 crc kubenswrapper[5025]: I1007 08:35:18.821927 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8228t\" (UniqueName: \"kubernetes.io/projected/3cb6f85b-3d51-481c-a9c8-dfd8e00c0700-kube-api-access-8228t\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:18 crc kubenswrapper[5025]: I1007 08:35:18.821939 5025 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3cb6f85b-3d51-481c-a9c8-dfd8e00c0700-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:18 crc kubenswrapper[5025]: I1007 08:35:18.821947 5025 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3cb6f85b-3d51-481c-a9c8-dfd8e00c0700-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" 
Oct 07 08:35:18 crc kubenswrapper[5025]: I1007 08:35:18.821955 5025 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3cb6f85b-3d51-481c-a9c8-dfd8e00c0700-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:18 crc kubenswrapper[5025]: I1007 08:35:18.821964 5025 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3cb6f85b-3d51-481c-a9c8-dfd8e00c0700-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:19 crc kubenswrapper[5025]: I1007 08:35:19.249097 5025 generic.go:334] "Generic (PLEG): container finished" podID="c06c9321-437f-4b6a-b205-b56758517e75" containerID="192a322439949335b4f9eade6df4da3ee83390e6d965ca42196d09c64b3d7944" exitCode=0 Oct 07 08:35:19 crc kubenswrapper[5025]: I1007 08:35:19.249285 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c06c9321-437f-4b6a-b205-b56758517e75","Type":"ContainerDied","Data":"192a322439949335b4f9eade6df4da3ee83390e6d965ca42196d09c64b3d7944"} Oct 07 08:35:19 crc kubenswrapper[5025]: I1007 08:35:19.252342 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6db9c4d6b-sjqbc" event={"ID":"a66c793a-fadc-42c2-9a01-1b0cb017f773","Type":"ContainerStarted","Data":"b6486aeb3f423dba47093e6e15d4101ef4367f2e384b1ea30754b55916ffe29d"} Oct 07 08:35:19 crc kubenswrapper[5025]: I1007 08:35:19.252388 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6db9c4d6b-sjqbc" event={"ID":"a66c793a-fadc-42c2-9a01-1b0cb017f773","Type":"ContainerStarted","Data":"b0fbb09cea78008273b57dcb42126acb3d363f88d2a67d1525a4e5d1ee10b874"} Oct 07 08:35:19 crc kubenswrapper[5025]: I1007 08:35:19.252512 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6db9c4d6b-sjqbc" Oct 07 08:35:19 crc kubenswrapper[5025]: I1007 08:35:19.254057 5025 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/neutron-6b6897fb44-5d68h" event={"ID":"c0b57ea4-f1b6-47f3-adef-97c26b470477","Type":"ContainerStarted","Data":"51088a15c259b4db87cd12b21a346dd16dc9bec3d5bf784b38768e7c561043ba"} Oct 07 08:35:19 crc kubenswrapper[5025]: I1007 08:35:19.254088 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b6897fb44-5d68h" event={"ID":"c0b57ea4-f1b6-47f3-adef-97c26b470477","Type":"ContainerStarted","Data":"708768a2c94adfa8c4edde2831f01fcbd06ad80d55be39ff0b7183840ed0d8a8"} Oct 07 08:35:19 crc kubenswrapper[5025]: I1007 08:35:19.254100 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b6897fb44-5d68h" event={"ID":"c0b57ea4-f1b6-47f3-adef-97c26b470477","Type":"ContainerStarted","Data":"00cd8b50dff96c95e0ac1606c3434063483969f0c13cf37e267b3f00f6219c63"} Oct 07 08:35:19 crc kubenswrapper[5025]: I1007 08:35:19.254954 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6b6897fb44-5d68h" Oct 07 08:35:19 crc kubenswrapper[5025]: I1007 08:35:19.256695 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-vlnnh" event={"ID":"3cb6f85b-3d51-481c-a9c8-dfd8e00c0700","Type":"ContainerDied","Data":"56c00ca8840f7f4fd9c74feb94fa2d4a28b081d02fb75aeba787710fa7e89ca6"} Oct 07 08:35:19 crc kubenswrapper[5025]: I1007 08:35:19.256730 5025 scope.go:117] "RemoveContainer" containerID="a4ba9c281726f5ac9dec286a5cfb425f4d86009c021f2007abe924c258a1a4f2" Oct 07 08:35:19 crc kubenswrapper[5025]: I1007 08:35:19.256827 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-vlnnh" Oct 07 08:35:19 crc kubenswrapper[5025]: I1007 08:35:19.263272 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-vb7rc" event={"ID":"fe96ba85-9487-44de-87f3-e202c1a30fa6","Type":"ContainerStarted","Data":"90e3d502a44efb5858a41705ed907f08bca10191ff4aaedc314d9f26c985b201"} Oct 07 08:35:19 crc kubenswrapper[5025]: I1007 08:35:19.263582 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-688c87cc99-vb7rc" Oct 07 08:35:19 crc kubenswrapper[5025]: I1007 08:35:19.277808 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6db9c4d6b-sjqbc" podStartSLOduration=2.277790293 podStartE2EDuration="2.277790293s" podCreationTimestamp="2025-10-07 08:35:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:35:19.277021949 +0000 UTC m=+1126.086336093" watchObservedRunningTime="2025-10-07 08:35:19.277790293 +0000 UTC m=+1126.087104427" Oct 07 08:35:19 crc kubenswrapper[5025]: I1007 08:35:19.301221 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-688c87cc99-vb7rc" podStartSLOduration=3.301204191 podStartE2EDuration="3.301204191s" podCreationTimestamp="2025-10-07 08:35:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:35:19.295948086 +0000 UTC m=+1126.105262230" watchObservedRunningTime="2025-10-07 08:35:19.301204191 +0000 UTC m=+1126.110518335" Oct 07 08:35:19 crc kubenswrapper[5025]: I1007 08:35:19.328466 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6b6897fb44-5d68h" podStartSLOduration=8.32844642 podStartE2EDuration="8.32844642s" podCreationTimestamp="2025-10-07 08:35:11 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:35:19.32559984 +0000 UTC m=+1126.134913984" watchObservedRunningTime="2025-10-07 08:35:19.32844642 +0000 UTC m=+1126.137760564" Oct 07 08:35:19 crc kubenswrapper[5025]: I1007 08:35:19.381658 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-vlnnh"] Oct 07 08:35:19 crc kubenswrapper[5025]: I1007 08:35:19.388591 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-vlnnh"] Oct 07 08:35:19 crc kubenswrapper[5025]: I1007 08:35:19.839694 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-56bb8cc8-59x6x"] Oct 07 08:35:19 crc kubenswrapper[5025]: E1007 08:35:19.840369 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cb6f85b-3d51-481c-a9c8-dfd8e00c0700" containerName="init" Oct 07 08:35:19 crc kubenswrapper[5025]: I1007 08:35:19.840382 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cb6f85b-3d51-481c-a9c8-dfd8e00c0700" containerName="init" Oct 07 08:35:19 crc kubenswrapper[5025]: I1007 08:35:19.843704 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cb6f85b-3d51-481c-a9c8-dfd8e00c0700" containerName="init" Oct 07 08:35:19 crc kubenswrapper[5025]: I1007 08:35:19.844846 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-56bb8cc8-59x6x" Oct 07 08:35:19 crc kubenswrapper[5025]: I1007 08:35:19.853951 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 07 08:35:19 crc kubenswrapper[5025]: I1007 08:35:19.854309 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-56bb8cc8-59x6x"] Oct 07 08:35:19 crc kubenswrapper[5025]: I1007 08:35:19.855880 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 07 08:35:19 crc kubenswrapper[5025]: I1007 08:35:19.926803 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb6f85b-3d51-481c-a9c8-dfd8e00c0700" path="/var/lib/kubelet/pods/3cb6f85b-3d51-481c-a9c8-dfd8e00c0700/volumes" Oct 07 08:35:19 crc kubenswrapper[5025]: I1007 08:35:19.947749 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f66554ef-2742-4c2b-a6e4-58b8377f03fa-combined-ca-bundle\") pod \"barbican-api-56bb8cc8-59x6x\" (UID: \"f66554ef-2742-4c2b-a6e4-58b8377f03fa\") " pod="openstack/barbican-api-56bb8cc8-59x6x" Oct 07 08:35:19 crc kubenswrapper[5025]: I1007 08:35:19.948014 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f66554ef-2742-4c2b-a6e4-58b8377f03fa-config-data-custom\") pod \"barbican-api-56bb8cc8-59x6x\" (UID: \"f66554ef-2742-4c2b-a6e4-58b8377f03fa\") " pod="openstack/barbican-api-56bb8cc8-59x6x" Oct 07 08:35:19 crc kubenswrapper[5025]: I1007 08:35:19.948043 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f66554ef-2742-4c2b-a6e4-58b8377f03fa-logs\") pod \"barbican-api-56bb8cc8-59x6x\" (UID: \"f66554ef-2742-4c2b-a6e4-58b8377f03fa\") " 
pod="openstack/barbican-api-56bb8cc8-59x6x" Oct 07 08:35:19 crc kubenswrapper[5025]: I1007 08:35:19.948093 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f66554ef-2742-4c2b-a6e4-58b8377f03fa-config-data\") pod \"barbican-api-56bb8cc8-59x6x\" (UID: \"f66554ef-2742-4c2b-a6e4-58b8377f03fa\") " pod="openstack/barbican-api-56bb8cc8-59x6x" Oct 07 08:35:19 crc kubenswrapper[5025]: I1007 08:35:19.948271 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmbkg\" (UniqueName: \"kubernetes.io/projected/f66554ef-2742-4c2b-a6e4-58b8377f03fa-kube-api-access-wmbkg\") pod \"barbican-api-56bb8cc8-59x6x\" (UID: \"f66554ef-2742-4c2b-a6e4-58b8377f03fa\") " pod="openstack/barbican-api-56bb8cc8-59x6x" Oct 07 08:35:19 crc kubenswrapper[5025]: I1007 08:35:19.948305 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f66554ef-2742-4c2b-a6e4-58b8377f03fa-public-tls-certs\") pod \"barbican-api-56bb8cc8-59x6x\" (UID: \"f66554ef-2742-4c2b-a6e4-58b8377f03fa\") " pod="openstack/barbican-api-56bb8cc8-59x6x" Oct 07 08:35:19 crc kubenswrapper[5025]: I1007 08:35:19.948402 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f66554ef-2742-4c2b-a6e4-58b8377f03fa-internal-tls-certs\") pod \"barbican-api-56bb8cc8-59x6x\" (UID: \"f66554ef-2742-4c2b-a6e4-58b8377f03fa\") " pod="openstack/barbican-api-56bb8cc8-59x6x" Oct 07 08:35:20 crc kubenswrapper[5025]: I1007 08:35:20.050265 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f66554ef-2742-4c2b-a6e4-58b8377f03fa-combined-ca-bundle\") pod \"barbican-api-56bb8cc8-59x6x\" (UID: 
\"f66554ef-2742-4c2b-a6e4-58b8377f03fa\") " pod="openstack/barbican-api-56bb8cc8-59x6x" Oct 07 08:35:20 crc kubenswrapper[5025]: I1007 08:35:20.050332 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f66554ef-2742-4c2b-a6e4-58b8377f03fa-config-data-custom\") pod \"barbican-api-56bb8cc8-59x6x\" (UID: \"f66554ef-2742-4c2b-a6e4-58b8377f03fa\") " pod="openstack/barbican-api-56bb8cc8-59x6x" Oct 07 08:35:20 crc kubenswrapper[5025]: I1007 08:35:20.050353 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f66554ef-2742-4c2b-a6e4-58b8377f03fa-logs\") pod \"barbican-api-56bb8cc8-59x6x\" (UID: \"f66554ef-2742-4c2b-a6e4-58b8377f03fa\") " pod="openstack/barbican-api-56bb8cc8-59x6x" Oct 07 08:35:20 crc kubenswrapper[5025]: I1007 08:35:20.050386 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f66554ef-2742-4c2b-a6e4-58b8377f03fa-config-data\") pod \"barbican-api-56bb8cc8-59x6x\" (UID: \"f66554ef-2742-4c2b-a6e4-58b8377f03fa\") " pod="openstack/barbican-api-56bb8cc8-59x6x" Oct 07 08:35:20 crc kubenswrapper[5025]: I1007 08:35:20.050452 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmbkg\" (UniqueName: \"kubernetes.io/projected/f66554ef-2742-4c2b-a6e4-58b8377f03fa-kube-api-access-wmbkg\") pod \"barbican-api-56bb8cc8-59x6x\" (UID: \"f66554ef-2742-4c2b-a6e4-58b8377f03fa\") " pod="openstack/barbican-api-56bb8cc8-59x6x" Oct 07 08:35:20 crc kubenswrapper[5025]: I1007 08:35:20.050476 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f66554ef-2742-4c2b-a6e4-58b8377f03fa-public-tls-certs\") pod \"barbican-api-56bb8cc8-59x6x\" (UID: \"f66554ef-2742-4c2b-a6e4-58b8377f03fa\") " 
pod="openstack/barbican-api-56bb8cc8-59x6x" Oct 07 08:35:20 crc kubenswrapper[5025]: I1007 08:35:20.050576 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f66554ef-2742-4c2b-a6e4-58b8377f03fa-internal-tls-certs\") pod \"barbican-api-56bb8cc8-59x6x\" (UID: \"f66554ef-2742-4c2b-a6e4-58b8377f03fa\") " pod="openstack/barbican-api-56bb8cc8-59x6x" Oct 07 08:35:20 crc kubenswrapper[5025]: I1007 08:35:20.050864 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f66554ef-2742-4c2b-a6e4-58b8377f03fa-logs\") pod \"barbican-api-56bb8cc8-59x6x\" (UID: \"f66554ef-2742-4c2b-a6e4-58b8377f03fa\") " pod="openstack/barbican-api-56bb8cc8-59x6x" Oct 07 08:35:20 crc kubenswrapper[5025]: I1007 08:35:20.056712 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f66554ef-2742-4c2b-a6e4-58b8377f03fa-config-data-custom\") pod \"barbican-api-56bb8cc8-59x6x\" (UID: \"f66554ef-2742-4c2b-a6e4-58b8377f03fa\") " pod="openstack/barbican-api-56bb8cc8-59x6x" Oct 07 08:35:20 crc kubenswrapper[5025]: I1007 08:35:20.057327 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f66554ef-2742-4c2b-a6e4-58b8377f03fa-internal-tls-certs\") pod \"barbican-api-56bb8cc8-59x6x\" (UID: \"f66554ef-2742-4c2b-a6e4-58b8377f03fa\") " pod="openstack/barbican-api-56bb8cc8-59x6x" Oct 07 08:35:20 crc kubenswrapper[5025]: I1007 08:35:20.059051 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f66554ef-2742-4c2b-a6e4-58b8377f03fa-public-tls-certs\") pod \"barbican-api-56bb8cc8-59x6x\" (UID: \"f66554ef-2742-4c2b-a6e4-58b8377f03fa\") " pod="openstack/barbican-api-56bb8cc8-59x6x" Oct 07 08:35:20 crc kubenswrapper[5025]: I1007 
08:35:20.059405 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f66554ef-2742-4c2b-a6e4-58b8377f03fa-config-data\") pod \"barbican-api-56bb8cc8-59x6x\" (UID: \"f66554ef-2742-4c2b-a6e4-58b8377f03fa\") " pod="openstack/barbican-api-56bb8cc8-59x6x" Oct 07 08:35:20 crc kubenswrapper[5025]: I1007 08:35:20.065136 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f66554ef-2742-4c2b-a6e4-58b8377f03fa-combined-ca-bundle\") pod \"barbican-api-56bb8cc8-59x6x\" (UID: \"f66554ef-2742-4c2b-a6e4-58b8377f03fa\") " pod="openstack/barbican-api-56bb8cc8-59x6x" Oct 07 08:35:20 crc kubenswrapper[5025]: I1007 08:35:20.068533 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmbkg\" (UniqueName: \"kubernetes.io/projected/f66554ef-2742-4c2b-a6e4-58b8377f03fa-kube-api-access-wmbkg\") pod \"barbican-api-56bb8cc8-59x6x\" (UID: \"f66554ef-2742-4c2b-a6e4-58b8377f03fa\") " pod="openstack/barbican-api-56bb8cc8-59x6x" Oct 07 08:35:20 crc kubenswrapper[5025]: I1007 08:35:20.170064 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-56bb8cc8-59x6x" Oct 07 08:35:20 crc kubenswrapper[5025]: I1007 08:35:20.276180 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6db9c4d6b-sjqbc" Oct 07 08:35:20 crc kubenswrapper[5025]: E1007 08:35:20.656014 5025 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13598166_30eb_43b2_8a13_2e2ca72f58e9.slice\": RecentStats: unable to find data in memory cache]" Oct 07 08:35:20 crc kubenswrapper[5025]: I1007 08:35:20.997783 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.070010 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c06c9321-437f-4b6a-b205-b56758517e75-scripts\") pod \"c06c9321-437f-4b6a-b205-b56758517e75\" (UID: \"c06c9321-437f-4b6a-b205-b56758517e75\") " Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.070094 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c06c9321-437f-4b6a-b205-b56758517e75-combined-ca-bundle\") pod \"c06c9321-437f-4b6a-b205-b56758517e75\" (UID: \"c06c9321-437f-4b6a-b205-b56758517e75\") " Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.070256 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q67tf\" (UniqueName: \"kubernetes.io/projected/c06c9321-437f-4b6a-b205-b56758517e75-kube-api-access-q67tf\") pod \"c06c9321-437f-4b6a-b205-b56758517e75\" (UID: \"c06c9321-437f-4b6a-b205-b56758517e75\") " Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.070327 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c06c9321-437f-4b6a-b205-b56758517e75-log-httpd\") pod \"c06c9321-437f-4b6a-b205-b56758517e75\" (UID: \"c06c9321-437f-4b6a-b205-b56758517e75\") " Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.070356 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c06c9321-437f-4b6a-b205-b56758517e75-run-httpd\") pod \"c06c9321-437f-4b6a-b205-b56758517e75\" (UID: \"c06c9321-437f-4b6a-b205-b56758517e75\") " Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.070406 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c06c9321-437f-4b6a-b205-b56758517e75-config-data\") pod \"c06c9321-437f-4b6a-b205-b56758517e75\" (UID: \"c06c9321-437f-4b6a-b205-b56758517e75\") " Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.070442 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c06c9321-437f-4b6a-b205-b56758517e75-sg-core-conf-yaml\") pod \"c06c9321-437f-4b6a-b205-b56758517e75\" (UID: \"c06c9321-437f-4b6a-b205-b56758517e75\") " Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.070901 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c06c9321-437f-4b6a-b205-b56758517e75-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c06c9321-437f-4b6a-b205-b56758517e75" (UID: "c06c9321-437f-4b6a-b205-b56758517e75"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.071097 5025 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c06c9321-437f-4b6a-b205-b56758517e75-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.071231 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c06c9321-437f-4b6a-b205-b56758517e75-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c06c9321-437f-4b6a-b205-b56758517e75" (UID: "c06c9321-437f-4b6a-b205-b56758517e75"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.079939 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c06c9321-437f-4b6a-b205-b56758517e75-kube-api-access-q67tf" (OuterVolumeSpecName: "kube-api-access-q67tf") pod "c06c9321-437f-4b6a-b205-b56758517e75" (UID: "c06c9321-437f-4b6a-b205-b56758517e75"). InnerVolumeSpecName "kube-api-access-q67tf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.080852 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c06c9321-437f-4b6a-b205-b56758517e75-scripts" (OuterVolumeSpecName: "scripts") pod "c06c9321-437f-4b6a-b205-b56758517e75" (UID: "c06c9321-437f-4b6a-b205-b56758517e75"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.107642 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c06c9321-437f-4b6a-b205-b56758517e75-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c06c9321-437f-4b6a-b205-b56758517e75" (UID: "c06c9321-437f-4b6a-b205-b56758517e75"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.175237 5025 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c06c9321-437f-4b6a-b205-b56758517e75-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.175274 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q67tf\" (UniqueName: \"kubernetes.io/projected/c06c9321-437f-4b6a-b205-b56758517e75-kube-api-access-q67tf\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.175288 5025 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c06c9321-437f-4b6a-b205-b56758517e75-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.175298 5025 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c06c9321-437f-4b6a-b205-b56758517e75-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.190931 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c06c9321-437f-4b6a-b205-b56758517e75-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c06c9321-437f-4b6a-b205-b56758517e75" (UID: "c06c9321-437f-4b6a-b205-b56758517e75"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.228831 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-56bb8cc8-59x6x"] Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.232480 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c06c9321-437f-4b6a-b205-b56758517e75-config-data" (OuterVolumeSpecName: "config-data") pod "c06c9321-437f-4b6a-b205-b56758517e75" (UID: "c06c9321-437f-4b6a-b205-b56758517e75"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.280072 5025 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c06c9321-437f-4b6a-b205-b56758517e75-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.280097 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c06c9321-437f-4b6a-b205-b56758517e75-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.313104 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56bb8cc8-59x6x" event={"ID":"f66554ef-2742-4c2b-a6e4-58b8377f03fa","Type":"ContainerStarted","Data":"cbb201e2bee3dd86717d8af4d515e9a36656d5e464720ad4bcf1cea8dec2987e"} Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.315653 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5495b78bc8-wbf5t" event={"ID":"51f36c18-61cd-43d1-98a6-b569197c9382","Type":"ContainerStarted","Data":"b7e10267acefb906d0890b494854e97353644178df770bee745cfcda0052c6b4"} Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.321681 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"c06c9321-437f-4b6a-b205-b56758517e75","Type":"ContainerDied","Data":"f1142fd83b1ecfcf9e7b7332fc1fd031487b8b56449a7df111db8d9351d140d0"} Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.321744 5025 scope.go:117] "RemoveContainer" containerID="9492f4bba0d7e41ce6ac2fbc75a6e4b6d846b88d5ac596da12a296d2a3b26e0b" Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.321870 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.328900 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7c8774d5b7-qj8g5" event={"ID":"6a8356fc-f4bd-4853-a3f3-0d44ab20612b","Type":"ContainerStarted","Data":"878aa4d7a336162ac22f34d19599bf58c73742f602e586a96453e05637fc3934"} Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.364703 5025 scope.go:117] "RemoveContainer" containerID="55b9ead6cbc3246ada2c749e825680008bf0b9eb8b6c9ba49075a691ca9d6c03" Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.365979 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.377451 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.392994 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.393312 5025 scope.go:117] "RemoveContainer" containerID="192a322439949335b4f9eade6df4da3ee83390e6d965ca42196d09c64b3d7944" Oct 07 08:35:21 crc kubenswrapper[5025]: E1007 08:35:21.393365 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c06c9321-437f-4b6a-b205-b56758517e75" containerName="ceilometer-central-agent" Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.393378 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="c06c9321-437f-4b6a-b205-b56758517e75" 
containerName="ceilometer-central-agent" Oct 07 08:35:21 crc kubenswrapper[5025]: E1007 08:35:21.393393 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c06c9321-437f-4b6a-b205-b56758517e75" containerName="proxy-httpd" Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.393399 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="c06c9321-437f-4b6a-b205-b56758517e75" containerName="proxy-httpd" Oct 07 08:35:21 crc kubenswrapper[5025]: E1007 08:35:21.393404 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c06c9321-437f-4b6a-b205-b56758517e75" containerName="ceilometer-notification-agent" Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.393411 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="c06c9321-437f-4b6a-b205-b56758517e75" containerName="ceilometer-notification-agent" Oct 07 08:35:21 crc kubenswrapper[5025]: E1007 08:35:21.393429 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c06c9321-437f-4b6a-b205-b56758517e75" containerName="sg-core" Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.393436 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="c06c9321-437f-4b6a-b205-b56758517e75" containerName="sg-core" Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.393634 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="c06c9321-437f-4b6a-b205-b56758517e75" containerName="proxy-httpd" Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.393654 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="c06c9321-437f-4b6a-b205-b56758517e75" containerName="ceilometer-notification-agent" Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.393664 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="c06c9321-437f-4b6a-b205-b56758517e75" containerName="ceilometer-central-agent" Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.393679 5025 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="c06c9321-437f-4b6a-b205-b56758517e75" containerName="sg-core" Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.395076 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.402459 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.404852 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.415988 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.467067 5025 scope.go:117] "RemoveContainer" containerID="737e7948bdf87a15a50420e74839cfd300bc5557fa6b1e61dc0e5a1cf2a12956" Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.507080 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7ed584b3-29d6-4ad5-8ac1-a870c230d19f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7ed584b3-29d6-4ad5-8ac1-a870c230d19f\") " pod="openstack/ceilometer-0" Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.507132 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ed584b3-29d6-4ad5-8ac1-a870c230d19f-run-httpd\") pod \"ceilometer-0\" (UID: \"7ed584b3-29d6-4ad5-8ac1-a870c230d19f\") " pod="openstack/ceilometer-0" Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.507161 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjjbm\" (UniqueName: \"kubernetes.io/projected/7ed584b3-29d6-4ad5-8ac1-a870c230d19f-kube-api-access-mjjbm\") pod \"ceilometer-0\" (UID: \"7ed584b3-29d6-4ad5-8ac1-a870c230d19f\") 
" pod="openstack/ceilometer-0" Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.507192 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ed584b3-29d6-4ad5-8ac1-a870c230d19f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7ed584b3-29d6-4ad5-8ac1-a870c230d19f\") " pod="openstack/ceilometer-0" Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.507213 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ed584b3-29d6-4ad5-8ac1-a870c230d19f-scripts\") pod \"ceilometer-0\" (UID: \"7ed584b3-29d6-4ad5-8ac1-a870c230d19f\") " pod="openstack/ceilometer-0" Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.507298 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ed584b3-29d6-4ad5-8ac1-a870c230d19f-config-data\") pod \"ceilometer-0\" (UID: \"7ed584b3-29d6-4ad5-8ac1-a870c230d19f\") " pod="openstack/ceilometer-0" Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.507389 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ed584b3-29d6-4ad5-8ac1-a870c230d19f-log-httpd\") pod \"ceilometer-0\" (UID: \"7ed584b3-29d6-4ad5-8ac1-a870c230d19f\") " pod="openstack/ceilometer-0" Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.608754 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ed584b3-29d6-4ad5-8ac1-a870c230d19f-config-data\") pod \"ceilometer-0\" (UID: \"7ed584b3-29d6-4ad5-8ac1-a870c230d19f\") " pod="openstack/ceilometer-0" Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.608858 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ed584b3-29d6-4ad5-8ac1-a870c230d19f-log-httpd\") pod \"ceilometer-0\" (UID: \"7ed584b3-29d6-4ad5-8ac1-a870c230d19f\") " pod="openstack/ceilometer-0" Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.608913 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7ed584b3-29d6-4ad5-8ac1-a870c230d19f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7ed584b3-29d6-4ad5-8ac1-a870c230d19f\") " pod="openstack/ceilometer-0" Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.608933 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ed584b3-29d6-4ad5-8ac1-a870c230d19f-run-httpd\") pod \"ceilometer-0\" (UID: \"7ed584b3-29d6-4ad5-8ac1-a870c230d19f\") " pod="openstack/ceilometer-0" Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.608956 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjjbm\" (UniqueName: \"kubernetes.io/projected/7ed584b3-29d6-4ad5-8ac1-a870c230d19f-kube-api-access-mjjbm\") pod \"ceilometer-0\" (UID: \"7ed584b3-29d6-4ad5-8ac1-a870c230d19f\") " pod="openstack/ceilometer-0" Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.608986 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ed584b3-29d6-4ad5-8ac1-a870c230d19f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7ed584b3-29d6-4ad5-8ac1-a870c230d19f\") " pod="openstack/ceilometer-0" Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.609002 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ed584b3-29d6-4ad5-8ac1-a870c230d19f-scripts\") pod \"ceilometer-0\" (UID: \"7ed584b3-29d6-4ad5-8ac1-a870c230d19f\") " pod="openstack/ceilometer-0" Oct 07 08:35:21 
crc kubenswrapper[5025]: I1007 08:35:21.612190 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ed584b3-29d6-4ad5-8ac1-a870c230d19f-run-httpd\") pod \"ceilometer-0\" (UID: \"7ed584b3-29d6-4ad5-8ac1-a870c230d19f\") " pod="openstack/ceilometer-0" Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.612424 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ed584b3-29d6-4ad5-8ac1-a870c230d19f-log-httpd\") pod \"ceilometer-0\" (UID: \"7ed584b3-29d6-4ad5-8ac1-a870c230d19f\") " pod="openstack/ceilometer-0" Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.615455 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7ed584b3-29d6-4ad5-8ac1-a870c230d19f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7ed584b3-29d6-4ad5-8ac1-a870c230d19f\") " pod="openstack/ceilometer-0" Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.615827 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ed584b3-29d6-4ad5-8ac1-a870c230d19f-config-data\") pod \"ceilometer-0\" (UID: \"7ed584b3-29d6-4ad5-8ac1-a870c230d19f\") " pod="openstack/ceilometer-0" Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.618164 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ed584b3-29d6-4ad5-8ac1-a870c230d19f-scripts\") pod \"ceilometer-0\" (UID: \"7ed584b3-29d6-4ad5-8ac1-a870c230d19f\") " pod="openstack/ceilometer-0" Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.620126 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ed584b3-29d6-4ad5-8ac1-a870c230d19f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"7ed584b3-29d6-4ad5-8ac1-a870c230d19f\") " pod="openstack/ceilometer-0" Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.633330 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjjbm\" (UniqueName: \"kubernetes.io/projected/7ed584b3-29d6-4ad5-8ac1-a870c230d19f-kube-api-access-mjjbm\") pod \"ceilometer-0\" (UID: \"7ed584b3-29d6-4ad5-8ac1-a870c230d19f\") " pod="openstack/ceilometer-0" Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.746071 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 08:35:21 crc kubenswrapper[5025]: I1007 08:35:21.947058 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c06c9321-437f-4b6a-b205-b56758517e75" path="/var/lib/kubelet/pods/c06c9321-437f-4b6a-b205-b56758517e75/volumes" Oct 07 08:35:22 crc kubenswrapper[5025]: I1007 08:35:22.288812 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 08:35:22 crc kubenswrapper[5025]: W1007 08:35:22.296733 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ed584b3_29d6_4ad5_8ac1_a870c230d19f.slice/crio-f7191dc992a201c6b575d8f99d2f5cfa7e61cf27cad36128f74aebc9d1215095 WatchSource:0}: Error finding container f7191dc992a201c6b575d8f99d2f5cfa7e61cf27cad36128f74aebc9d1215095: Status 404 returned error can't find the container with id f7191dc992a201c6b575d8f99d2f5cfa7e61cf27cad36128f74aebc9d1215095 Oct 07 08:35:22 crc kubenswrapper[5025]: I1007 08:35:22.348086 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-2xx6h" event={"ID":"06e1a5d0-ce28-4495-937a-1aaccbcbb644","Type":"ContainerStarted","Data":"fcc59fa4ee6efb3343e06efb0358177c91fcb632bb8cbd5cc15a1c3116b88117"} Oct 07 08:35:22 crc kubenswrapper[5025]: I1007 08:35:22.355850 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-worker-7c8774d5b7-qj8g5" event={"ID":"6a8356fc-f4bd-4853-a3f3-0d44ab20612b","Type":"ContainerStarted","Data":"1d466d120650f5aa29db6aa35c48e75103701a48199f9d6ad44171efa028e65a"} Oct 07 08:35:22 crc kubenswrapper[5025]: I1007 08:35:22.364865 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56bb8cc8-59x6x" event={"ID":"f66554ef-2742-4c2b-a6e4-58b8377f03fa","Type":"ContainerStarted","Data":"7a68b397232883f8b41408f440a7388ce6e6907c314518b807a6e0906db97ca4"} Oct 07 08:35:22 crc kubenswrapper[5025]: I1007 08:35:22.364910 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-56bb8cc8-59x6x" Oct 07 08:35:22 crc kubenswrapper[5025]: I1007 08:35:22.364920 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56bb8cc8-59x6x" event={"ID":"f66554ef-2742-4c2b-a6e4-58b8377f03fa","Type":"ContainerStarted","Data":"6a50d194060b952445e87adbfe5b99cb78c05999f4bf5f7b6625b7135d66eafa"} Oct 07 08:35:22 crc kubenswrapper[5025]: I1007 08:35:22.364956 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-56bb8cc8-59x6x" Oct 07 08:35:22 crc kubenswrapper[5025]: I1007 08:35:22.374770 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ed584b3-29d6-4ad5-8ac1-a870c230d19f","Type":"ContainerStarted","Data":"f7191dc992a201c6b575d8f99d2f5cfa7e61cf27cad36128f74aebc9d1215095"} Oct 07 08:35:22 crc kubenswrapper[5025]: I1007 08:35:22.377123 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5495b78bc8-wbf5t" event={"ID":"51f36c18-61cd-43d1-98a6-b569197c9382","Type":"ContainerStarted","Data":"541d0073c514c96dd102a8451f807366e43a748b3da042e4f4deb995f526e4db"} Oct 07 08:35:22 crc kubenswrapper[5025]: I1007 08:35:22.383632 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-2xx6h" 
podStartSLOduration=3.249192653 podStartE2EDuration="37.383614686s" podCreationTimestamp="2025-10-07 08:34:45 +0000 UTC" firstStartedPulling="2025-10-07 08:34:46.680534671 +0000 UTC m=+1093.489848815" lastFinishedPulling="2025-10-07 08:35:20.814956704 +0000 UTC m=+1127.624270848" observedRunningTime="2025-10-07 08:35:22.365457844 +0000 UTC m=+1129.174771988" watchObservedRunningTime="2025-10-07 08:35:22.383614686 +0000 UTC m=+1129.192928830" Oct 07 08:35:22 crc kubenswrapper[5025]: I1007 08:35:22.390209 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-56bb8cc8-59x6x" podStartSLOduration=3.390193533 podStartE2EDuration="3.390193533s" podCreationTimestamp="2025-10-07 08:35:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:35:22.387897431 +0000 UTC m=+1129.197211565" watchObservedRunningTime="2025-10-07 08:35:22.390193533 +0000 UTC m=+1129.199507677" Oct 07 08:35:22 crc kubenswrapper[5025]: I1007 08:35:22.424869 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7c8774d5b7-qj8g5" podStartSLOduration=3.393377536 podStartE2EDuration="6.424851535s" podCreationTimestamp="2025-10-07 08:35:16 +0000 UTC" firstStartedPulling="2025-10-07 08:35:17.738213508 +0000 UTC m=+1124.547527652" lastFinishedPulling="2025-10-07 08:35:20.769687507 +0000 UTC m=+1127.579001651" observedRunningTime="2025-10-07 08:35:22.4075614 +0000 UTC m=+1129.216875544" watchObservedRunningTime="2025-10-07 08:35:22.424851535 +0000 UTC m=+1129.234165669" Oct 07 08:35:22 crc kubenswrapper[5025]: I1007 08:35:22.445693 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-5495b78bc8-wbf5t" podStartSLOduration=3.5801831809999998 podStartE2EDuration="6.445671991s" podCreationTimestamp="2025-10-07 08:35:16 +0000 UTC" firstStartedPulling="2025-10-07 
08:35:17.904964792 +0000 UTC m=+1124.714278936" lastFinishedPulling="2025-10-07 08:35:20.770453592 +0000 UTC m=+1127.579767746" observedRunningTime="2025-10-07 08:35:22.42214362 +0000 UTC m=+1129.231457764" watchObservedRunningTime="2025-10-07 08:35:22.445671991 +0000 UTC m=+1129.254986135" Oct 07 08:35:23 crc kubenswrapper[5025]: I1007 08:35:23.260486 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-684bd87b6d-w58z5" Oct 07 08:35:23 crc kubenswrapper[5025]: I1007 08:35:23.345860 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-684bd87b6d-w58z5" Oct 07 08:35:23 crc kubenswrapper[5025]: I1007 08:35:23.403021 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ed584b3-29d6-4ad5-8ac1-a870c230d19f","Type":"ContainerStarted","Data":"675452d23586fab1106526c4197028bb8b9d95a6d374c44a64f5fe305e6465d0"} Oct 07 08:35:24 crc kubenswrapper[5025]: I1007 08:35:24.416198 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ed584b3-29d6-4ad5-8ac1-a870c230d19f","Type":"ContainerStarted","Data":"4e1aec86e23c5a9885dadfe312cc99b621465cc033fb0fec64d13f9a397e16a8"} Oct 07 08:35:25 crc kubenswrapper[5025]: I1007 08:35:25.430307 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ed584b3-29d6-4ad5-8ac1-a870c230d19f","Type":"ContainerStarted","Data":"a9e6cadb76205a18b7832eedecec50ebcc07457f9d4d05f033cf5ed68dc769c5"} Oct 07 08:35:26 crc kubenswrapper[5025]: I1007 08:35:26.465283 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ed584b3-29d6-4ad5-8ac1-a870c230d19f","Type":"ContainerStarted","Data":"bc39ac890f0e012ff86c3bd3cfa23ad2b8107d4a289f4f434c865674707cda9a"} Oct 07 08:35:26 crc kubenswrapper[5025]: I1007 08:35:26.465840 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ceilometer-0" Oct 07 08:35:26 crc kubenswrapper[5025]: I1007 08:35:26.498214 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.8513145020000001 podStartE2EDuration="5.49819024s" podCreationTimestamp="2025-10-07 08:35:21 +0000 UTC" firstStartedPulling="2025-10-07 08:35:22.298520375 +0000 UTC m=+1129.107834519" lastFinishedPulling="2025-10-07 08:35:25.945396113 +0000 UTC m=+1132.754710257" observedRunningTime="2025-10-07 08:35:26.485710936 +0000 UTC m=+1133.295025080" watchObservedRunningTime="2025-10-07 08:35:26.49819024 +0000 UTC m=+1133.307504394" Oct 07 08:35:26 crc kubenswrapper[5025]: I1007 08:35:26.822753 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-6f574885d6-269v8" Oct 07 08:35:27 crc kubenswrapper[5025]: I1007 08:35:27.319673 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-688c87cc99-vb7rc" Oct 07 08:35:27 crc kubenswrapper[5025]: I1007 08:35:27.459445 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-fvkl9"] Oct 07 08:35:27 crc kubenswrapper[5025]: I1007 08:35:27.460239 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57c957c4ff-fvkl9" podUID="cc067ffb-1fa7-45b3-8d1f-5eb0a7586414" containerName="dnsmasq-dns" containerID="cri-o://f15e0a9c1a80e0f9774c23b97eb6a8292f54b9a0ab0ee42c5cab1259d3fb6770" gracePeriod=10 Oct 07 08:35:27 crc kubenswrapper[5025]: I1007 08:35:27.483919 5025 generic.go:334] "Generic (PLEG): container finished" podID="06e1a5d0-ce28-4495-937a-1aaccbcbb644" containerID="fcc59fa4ee6efb3343e06efb0358177c91fcb632bb8cbd5cc15a1c3116b88117" exitCode=0 Oct 07 08:35:27 crc kubenswrapper[5025]: I1007 08:35:27.483994 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-2xx6h" 
event={"ID":"06e1a5d0-ce28-4495-937a-1aaccbcbb644","Type":"ContainerDied","Data":"fcc59fa4ee6efb3343e06efb0358177c91fcb632bb8cbd5cc15a1c3116b88117"} Oct 07 08:35:28 crc kubenswrapper[5025]: I1007 08:35:28.011943 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-fvkl9" Oct 07 08:35:28 crc kubenswrapper[5025]: I1007 08:35:28.144178 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc067ffb-1fa7-45b3-8d1f-5eb0a7586414-ovsdbserver-nb\") pod \"cc067ffb-1fa7-45b3-8d1f-5eb0a7586414\" (UID: \"cc067ffb-1fa7-45b3-8d1f-5eb0a7586414\") " Oct 07 08:35:28 crc kubenswrapper[5025]: I1007 08:35:28.144247 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7t5q\" (UniqueName: \"kubernetes.io/projected/cc067ffb-1fa7-45b3-8d1f-5eb0a7586414-kube-api-access-w7t5q\") pod \"cc067ffb-1fa7-45b3-8d1f-5eb0a7586414\" (UID: \"cc067ffb-1fa7-45b3-8d1f-5eb0a7586414\") " Oct 07 08:35:28 crc kubenswrapper[5025]: I1007 08:35:28.144388 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc067ffb-1fa7-45b3-8d1f-5eb0a7586414-config\") pod \"cc067ffb-1fa7-45b3-8d1f-5eb0a7586414\" (UID: \"cc067ffb-1fa7-45b3-8d1f-5eb0a7586414\") " Oct 07 08:35:28 crc kubenswrapper[5025]: I1007 08:35:28.144468 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc067ffb-1fa7-45b3-8d1f-5eb0a7586414-dns-svc\") pod \"cc067ffb-1fa7-45b3-8d1f-5eb0a7586414\" (UID: \"cc067ffb-1fa7-45b3-8d1f-5eb0a7586414\") " Oct 07 08:35:28 crc kubenswrapper[5025]: I1007 08:35:28.144570 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/cc067ffb-1fa7-45b3-8d1f-5eb0a7586414-dns-swift-storage-0\") pod \"cc067ffb-1fa7-45b3-8d1f-5eb0a7586414\" (UID: \"cc067ffb-1fa7-45b3-8d1f-5eb0a7586414\") " Oct 07 08:35:28 crc kubenswrapper[5025]: I1007 08:35:28.144616 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc067ffb-1fa7-45b3-8d1f-5eb0a7586414-ovsdbserver-sb\") pod \"cc067ffb-1fa7-45b3-8d1f-5eb0a7586414\" (UID: \"cc067ffb-1fa7-45b3-8d1f-5eb0a7586414\") " Oct 07 08:35:28 crc kubenswrapper[5025]: I1007 08:35:28.160889 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc067ffb-1fa7-45b3-8d1f-5eb0a7586414-kube-api-access-w7t5q" (OuterVolumeSpecName: "kube-api-access-w7t5q") pod "cc067ffb-1fa7-45b3-8d1f-5eb0a7586414" (UID: "cc067ffb-1fa7-45b3-8d1f-5eb0a7586414"). InnerVolumeSpecName "kube-api-access-w7t5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:35:28 crc kubenswrapper[5025]: I1007 08:35:28.222488 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc067ffb-1fa7-45b3-8d1f-5eb0a7586414-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "cc067ffb-1fa7-45b3-8d1f-5eb0a7586414" (UID: "cc067ffb-1fa7-45b3-8d1f-5eb0a7586414"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:35:28 crc kubenswrapper[5025]: I1007 08:35:28.228397 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc067ffb-1fa7-45b3-8d1f-5eb0a7586414-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cc067ffb-1fa7-45b3-8d1f-5eb0a7586414" (UID: "cc067ffb-1fa7-45b3-8d1f-5eb0a7586414"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:35:28 crc kubenswrapper[5025]: I1007 08:35:28.232503 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc067ffb-1fa7-45b3-8d1f-5eb0a7586414-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cc067ffb-1fa7-45b3-8d1f-5eb0a7586414" (UID: "cc067ffb-1fa7-45b3-8d1f-5eb0a7586414"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:35:28 crc kubenswrapper[5025]: I1007 08:35:28.243444 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc067ffb-1fa7-45b3-8d1f-5eb0a7586414-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cc067ffb-1fa7-45b3-8d1f-5eb0a7586414" (UID: "cc067ffb-1fa7-45b3-8d1f-5eb0a7586414"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:35:28 crc kubenswrapper[5025]: I1007 08:35:28.246049 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc067ffb-1fa7-45b3-8d1f-5eb0a7586414-config" (OuterVolumeSpecName: "config") pod "cc067ffb-1fa7-45b3-8d1f-5eb0a7586414" (UID: "cc067ffb-1fa7-45b3-8d1f-5eb0a7586414"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:35:28 crc kubenswrapper[5025]: I1007 08:35:28.246266 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc067ffb-1fa7-45b3-8d1f-5eb0a7586414-config\") pod \"cc067ffb-1fa7-45b3-8d1f-5eb0a7586414\" (UID: \"cc067ffb-1fa7-45b3-8d1f-5eb0a7586414\") " Oct 07 08:35:28 crc kubenswrapper[5025]: W1007 08:35:28.246379 5025 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/cc067ffb-1fa7-45b3-8d1f-5eb0a7586414/volumes/kubernetes.io~configmap/config Oct 07 08:35:28 crc kubenswrapper[5025]: I1007 08:35:28.246394 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc067ffb-1fa7-45b3-8d1f-5eb0a7586414-config" (OuterVolumeSpecName: "config") pod "cc067ffb-1fa7-45b3-8d1f-5eb0a7586414" (UID: "cc067ffb-1fa7-45b3-8d1f-5eb0a7586414"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:35:28 crc kubenswrapper[5025]: I1007 08:35:28.246701 5025 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc067ffb-1fa7-45b3-8d1f-5eb0a7586414-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:28 crc kubenswrapper[5025]: I1007 08:35:28.246734 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7t5q\" (UniqueName: \"kubernetes.io/projected/cc067ffb-1fa7-45b3-8d1f-5eb0a7586414-kube-api-access-w7t5q\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:28 crc kubenswrapper[5025]: I1007 08:35:28.246746 5025 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc067ffb-1fa7-45b3-8d1f-5eb0a7586414-config\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:28 crc kubenswrapper[5025]: I1007 08:35:28.246755 5025 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/cc067ffb-1fa7-45b3-8d1f-5eb0a7586414-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:28 crc kubenswrapper[5025]: I1007 08:35:28.246768 5025 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cc067ffb-1fa7-45b3-8d1f-5eb0a7586414-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:28 crc kubenswrapper[5025]: I1007 08:35:28.246777 5025 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc067ffb-1fa7-45b3-8d1f-5eb0a7586414-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:28 crc kubenswrapper[5025]: I1007 08:35:28.494571 5025 generic.go:334] "Generic (PLEG): container finished" podID="cc067ffb-1fa7-45b3-8d1f-5eb0a7586414" containerID="f15e0a9c1a80e0f9774c23b97eb6a8292f54b9a0ab0ee42c5cab1259d3fb6770" exitCode=0 Oct 07 08:35:28 crc kubenswrapper[5025]: I1007 08:35:28.494649 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-fvkl9" Oct 07 08:35:28 crc kubenswrapper[5025]: I1007 08:35:28.494676 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-fvkl9" event={"ID":"cc067ffb-1fa7-45b3-8d1f-5eb0a7586414","Type":"ContainerDied","Data":"f15e0a9c1a80e0f9774c23b97eb6a8292f54b9a0ab0ee42c5cab1259d3fb6770"} Oct 07 08:35:28 crc kubenswrapper[5025]: I1007 08:35:28.494738 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-fvkl9" event={"ID":"cc067ffb-1fa7-45b3-8d1f-5eb0a7586414","Type":"ContainerDied","Data":"8b67e72873f529b5099483bb26c162594f13020ffee9698534709cdabd4815c1"} Oct 07 08:35:28 crc kubenswrapper[5025]: I1007 08:35:28.494761 5025 scope.go:117] "RemoveContainer" containerID="f15e0a9c1a80e0f9774c23b97eb6a8292f54b9a0ab0ee42c5cab1259d3fb6770" Oct 07 08:35:28 crc kubenswrapper[5025]: I1007 08:35:28.520839 5025 scope.go:117] "RemoveContainer" containerID="e0f57fc43d1d3acf20ef1de51e86517593f6fd067ebff97c73b8febf6947e38e" Oct 07 08:35:28 crc kubenswrapper[5025]: I1007 08:35:28.555818 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-fvkl9"] Oct 07 08:35:28 crc kubenswrapper[5025]: I1007 08:35:28.570738 5025 scope.go:117] "RemoveContainer" containerID="f15e0a9c1a80e0f9774c23b97eb6a8292f54b9a0ab0ee42c5cab1259d3fb6770" Oct 07 08:35:28 crc kubenswrapper[5025]: E1007 08:35:28.571201 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f15e0a9c1a80e0f9774c23b97eb6a8292f54b9a0ab0ee42c5cab1259d3fb6770\": container with ID starting with f15e0a9c1a80e0f9774c23b97eb6a8292f54b9a0ab0ee42c5cab1259d3fb6770 not found: ID does not exist" containerID="f15e0a9c1a80e0f9774c23b97eb6a8292f54b9a0ab0ee42c5cab1259d3fb6770" Oct 07 08:35:28 crc kubenswrapper[5025]: I1007 08:35:28.571244 5025 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f15e0a9c1a80e0f9774c23b97eb6a8292f54b9a0ab0ee42c5cab1259d3fb6770"} err="failed to get container status \"f15e0a9c1a80e0f9774c23b97eb6a8292f54b9a0ab0ee42c5cab1259d3fb6770\": rpc error: code = NotFound desc = could not find container \"f15e0a9c1a80e0f9774c23b97eb6a8292f54b9a0ab0ee42c5cab1259d3fb6770\": container with ID starting with f15e0a9c1a80e0f9774c23b97eb6a8292f54b9a0ab0ee42c5cab1259d3fb6770 not found: ID does not exist" Oct 07 08:35:28 crc kubenswrapper[5025]: I1007 08:35:28.571271 5025 scope.go:117] "RemoveContainer" containerID="e0f57fc43d1d3acf20ef1de51e86517593f6fd067ebff97c73b8febf6947e38e" Oct 07 08:35:28 crc kubenswrapper[5025]: E1007 08:35:28.574699 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0f57fc43d1d3acf20ef1de51e86517593f6fd067ebff97c73b8febf6947e38e\": container with ID starting with e0f57fc43d1d3acf20ef1de51e86517593f6fd067ebff97c73b8febf6947e38e not found: ID does not exist" containerID="e0f57fc43d1d3acf20ef1de51e86517593f6fd067ebff97c73b8febf6947e38e" Oct 07 08:35:28 crc kubenswrapper[5025]: I1007 08:35:28.574753 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0f57fc43d1d3acf20ef1de51e86517593f6fd067ebff97c73b8febf6947e38e"} err="failed to get container status \"e0f57fc43d1d3acf20ef1de51e86517593f6fd067ebff97c73b8febf6947e38e\": rpc error: code = NotFound desc = could not find container \"e0f57fc43d1d3acf20ef1de51e86517593f6fd067ebff97c73b8febf6947e38e\": container with ID starting with e0f57fc43d1d3acf20ef1de51e86517593f6fd067ebff97c73b8febf6947e38e not found: ID does not exist" Oct 07 08:35:28 crc kubenswrapper[5025]: I1007 08:35:28.575216 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-fvkl9"] Oct 07 08:35:28 crc kubenswrapper[5025]: I1007 08:35:28.830787 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-2xx6h" Oct 07 08:35:28 crc kubenswrapper[5025]: I1007 08:35:28.962306 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/06e1a5d0-ce28-4495-937a-1aaccbcbb644-etc-machine-id\") pod \"06e1a5d0-ce28-4495-937a-1aaccbcbb644\" (UID: \"06e1a5d0-ce28-4495-937a-1aaccbcbb644\") " Oct 07 08:35:28 crc kubenswrapper[5025]: I1007 08:35:28.962455 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v78lh\" (UniqueName: \"kubernetes.io/projected/06e1a5d0-ce28-4495-937a-1aaccbcbb644-kube-api-access-v78lh\") pod \"06e1a5d0-ce28-4495-937a-1aaccbcbb644\" (UID: \"06e1a5d0-ce28-4495-937a-1aaccbcbb644\") " Oct 07 08:35:28 crc kubenswrapper[5025]: I1007 08:35:28.962496 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/06e1a5d0-ce28-4495-937a-1aaccbcbb644-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "06e1a5d0-ce28-4495-937a-1aaccbcbb644" (UID: "06e1a5d0-ce28-4495-937a-1aaccbcbb644"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 08:35:28 crc kubenswrapper[5025]: I1007 08:35:28.962610 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06e1a5d0-ce28-4495-937a-1aaccbcbb644-scripts\") pod \"06e1a5d0-ce28-4495-937a-1aaccbcbb644\" (UID: \"06e1a5d0-ce28-4495-937a-1aaccbcbb644\") " Oct 07 08:35:28 crc kubenswrapper[5025]: I1007 08:35:28.962700 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06e1a5d0-ce28-4495-937a-1aaccbcbb644-combined-ca-bundle\") pod \"06e1a5d0-ce28-4495-937a-1aaccbcbb644\" (UID: \"06e1a5d0-ce28-4495-937a-1aaccbcbb644\") " Oct 07 08:35:28 crc kubenswrapper[5025]: I1007 08:35:28.962761 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/06e1a5d0-ce28-4495-937a-1aaccbcbb644-db-sync-config-data\") pod \"06e1a5d0-ce28-4495-937a-1aaccbcbb644\" (UID: \"06e1a5d0-ce28-4495-937a-1aaccbcbb644\") " Oct 07 08:35:28 crc kubenswrapper[5025]: I1007 08:35:28.962814 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06e1a5d0-ce28-4495-937a-1aaccbcbb644-config-data\") pod \"06e1a5d0-ce28-4495-937a-1aaccbcbb644\" (UID: \"06e1a5d0-ce28-4495-937a-1aaccbcbb644\") " Oct 07 08:35:28 crc kubenswrapper[5025]: I1007 08:35:28.963637 5025 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/06e1a5d0-ce28-4495-937a-1aaccbcbb644-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:28 crc kubenswrapper[5025]: I1007 08:35:28.970763 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06e1a5d0-ce28-4495-937a-1aaccbcbb644-scripts" (OuterVolumeSpecName: "scripts") pod 
"06e1a5d0-ce28-4495-937a-1aaccbcbb644" (UID: "06e1a5d0-ce28-4495-937a-1aaccbcbb644"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:35:28 crc kubenswrapper[5025]: I1007 08:35:28.972742 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06e1a5d0-ce28-4495-937a-1aaccbcbb644-kube-api-access-v78lh" (OuterVolumeSpecName: "kube-api-access-v78lh") pod "06e1a5d0-ce28-4495-937a-1aaccbcbb644" (UID: "06e1a5d0-ce28-4495-937a-1aaccbcbb644"). InnerVolumeSpecName "kube-api-access-v78lh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:35:28 crc kubenswrapper[5025]: I1007 08:35:28.991745 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06e1a5d0-ce28-4495-937a-1aaccbcbb644-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "06e1a5d0-ce28-4495-937a-1aaccbcbb644" (UID: "06e1a5d0-ce28-4495-937a-1aaccbcbb644"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.002676 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06e1a5d0-ce28-4495-937a-1aaccbcbb644-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "06e1a5d0-ce28-4495-937a-1aaccbcbb644" (UID: "06e1a5d0-ce28-4495-937a-1aaccbcbb644"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.052441 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06e1a5d0-ce28-4495-937a-1aaccbcbb644-config-data" (OuterVolumeSpecName: "config-data") pod "06e1a5d0-ce28-4495-937a-1aaccbcbb644" (UID: "06e1a5d0-ce28-4495-937a-1aaccbcbb644"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.065916 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06e1a5d0-ce28-4495-937a-1aaccbcbb644-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.065957 5025 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/06e1a5d0-ce28-4495-937a-1aaccbcbb644-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.065971 5025 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06e1a5d0-ce28-4495-937a-1aaccbcbb644-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.065981 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v78lh\" (UniqueName: \"kubernetes.io/projected/06e1a5d0-ce28-4495-937a-1aaccbcbb644-kube-api-access-v78lh\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.065992 5025 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06e1a5d0-ce28-4495-937a-1aaccbcbb644-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.169044 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6db9c4d6b-sjqbc" Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.301929 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6db9c4d6b-sjqbc" Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.510313 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-2xx6h" 
event={"ID":"06e1a5d0-ce28-4495-937a-1aaccbcbb644","Type":"ContainerDied","Data":"2f515b5e127983cd6e83a60bd7de67fd525b65f4f903dfd2529176c83e5a65d1"} Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.510592 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f515b5e127983cd6e83a60bd7de67fd525b65f4f903dfd2529176c83e5a65d1" Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.510356 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-2xx6h" Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.764530 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 07 08:35:29 crc kubenswrapper[5025]: E1007 08:35:29.764985 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06e1a5d0-ce28-4495-937a-1aaccbcbb644" containerName="cinder-db-sync" Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.765007 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="06e1a5d0-ce28-4495-937a-1aaccbcbb644" containerName="cinder-db-sync" Oct 07 08:35:29 crc kubenswrapper[5025]: E1007 08:35:29.765055 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc067ffb-1fa7-45b3-8d1f-5eb0a7586414" containerName="init" Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.765063 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc067ffb-1fa7-45b3-8d1f-5eb0a7586414" containerName="init" Oct 07 08:35:29 crc kubenswrapper[5025]: E1007 08:35:29.765076 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc067ffb-1fa7-45b3-8d1f-5eb0a7586414" containerName="dnsmasq-dns" Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.765085 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc067ffb-1fa7-45b3-8d1f-5eb0a7586414" containerName="dnsmasq-dns" Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.765281 5025 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="cc067ffb-1fa7-45b3-8d1f-5eb0a7586414" containerName="dnsmasq-dns" Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.765307 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="06e1a5d0-ce28-4495-937a-1aaccbcbb644" containerName="cinder-db-sync" Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.766074 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.768150 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.768390 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-rcscj" Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.774657 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.783247 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.785349 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.789010 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-gbght" Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.789209 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.791514 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.794102 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.794303 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.825104 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.886273 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92870d9c-dcf8-45ad-af2b-9e827d319b8d-config-data\") pod \"cinder-scheduler-0\" (UID: \"92870d9c-dcf8-45ad-af2b-9e827d319b8d\") " pod="openstack/cinder-scheduler-0" Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.886321 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d13fcf27-6664-430e-b9ac-81ff65769a0c-openstack-config\") pod \"openstackclient\" (UID: \"d13fcf27-6664-430e-b9ac-81ff65769a0c\") " pod="openstack/openstackclient" Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.886394 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/92870d9c-dcf8-45ad-af2b-9e827d319b8d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"92870d9c-dcf8-45ad-af2b-9e827d319b8d\") " pod="openstack/cinder-scheduler-0" Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.886426 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92870d9c-dcf8-45ad-af2b-9e827d319b8d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"92870d9c-dcf8-45ad-af2b-9e827d319b8d\") " pod="openstack/cinder-scheduler-0" Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.886446 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92870d9c-dcf8-45ad-af2b-9e827d319b8d-scripts\") pod \"cinder-scheduler-0\" (UID: \"92870d9c-dcf8-45ad-af2b-9e827d319b8d\") " pod="openstack/cinder-scheduler-0" Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.886467 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d13fcf27-6664-430e-b9ac-81ff65769a0c-openstack-config-secret\") pod \"openstackclient\" (UID: \"d13fcf27-6664-430e-b9ac-81ff65769a0c\") " pod="openstack/openstackclient" Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.886491 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d13fcf27-6664-430e-b9ac-81ff65769a0c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d13fcf27-6664-430e-b9ac-81ff65769a0c\") " pod="openstack/openstackclient" Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.886519 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-288p9\" (UniqueName: 
\"kubernetes.io/projected/92870d9c-dcf8-45ad-af2b-9e827d319b8d-kube-api-access-288p9\") pod \"cinder-scheduler-0\" (UID: \"92870d9c-dcf8-45ad-af2b-9e827d319b8d\") " pod="openstack/cinder-scheduler-0" Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.886556 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92870d9c-dcf8-45ad-af2b-9e827d319b8d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"92870d9c-dcf8-45ad-af2b-9e827d319b8d\") " pod="openstack/cinder-scheduler-0" Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.886585 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf5lv\" (UniqueName: \"kubernetes.io/projected/d13fcf27-6664-430e-b9ac-81ff65769a0c-kube-api-access-xf5lv\") pod \"openstackclient\" (UID: \"d13fcf27-6664-430e-b9ac-81ff65769a0c\") " pod="openstack/openstackclient" Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.892652 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-s92q7"] Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.901232 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-s92q7" Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.911052 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-s92q7"] Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.959651 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc067ffb-1fa7-45b3-8d1f-5eb0a7586414" path="/var/lib/kubelet/pods/cc067ffb-1fa7-45b3-8d1f-5eb0a7586414/volumes" Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.996256 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xf5lv\" (UniqueName: \"kubernetes.io/projected/d13fcf27-6664-430e-b9ac-81ff65769a0c-kube-api-access-xf5lv\") pod \"openstackclient\" (UID: \"d13fcf27-6664-430e-b9ac-81ff65769a0c\") " pod="openstack/openstackclient" Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.996297 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv2q4\" (UniqueName: \"kubernetes.io/projected/9263b87e-395f-4a9f-b819-846f1378bd73-kube-api-access-rv2q4\") pod \"dnsmasq-dns-6bb4fc677f-s92q7\" (UID: \"9263b87e-395f-4a9f-b819-846f1378bd73\") " pod="openstack/dnsmasq-dns-6bb4fc677f-s92q7" Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.996376 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92870d9c-dcf8-45ad-af2b-9e827d319b8d-config-data\") pod \"cinder-scheduler-0\" (UID: \"92870d9c-dcf8-45ad-af2b-9e827d319b8d\") " pod="openstack/cinder-scheduler-0" Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.996402 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d13fcf27-6664-430e-b9ac-81ff65769a0c-openstack-config\") pod \"openstackclient\" (UID: \"d13fcf27-6664-430e-b9ac-81ff65769a0c\") " 
pod="openstack/openstackclient" Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.996463 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9263b87e-395f-4a9f-b819-846f1378bd73-config\") pod \"dnsmasq-dns-6bb4fc677f-s92q7\" (UID: \"9263b87e-395f-4a9f-b819-846f1378bd73\") " pod="openstack/dnsmasq-dns-6bb4fc677f-s92q7" Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.996508 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9263b87e-395f-4a9f-b819-846f1378bd73-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-s92q7\" (UID: \"9263b87e-395f-4a9f-b819-846f1378bd73\") " pod="openstack/dnsmasq-dns-6bb4fc677f-s92q7" Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.996586 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9263b87e-395f-4a9f-b819-846f1378bd73-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-s92q7\" (UID: \"9263b87e-395f-4a9f-b819-846f1378bd73\") " pod="openstack/dnsmasq-dns-6bb4fc677f-s92q7" Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.996609 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9263b87e-395f-4a9f-b819-846f1378bd73-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-s92q7\" (UID: \"9263b87e-395f-4a9f-b819-846f1378bd73\") " pod="openstack/dnsmasq-dns-6bb4fc677f-s92q7" Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.996680 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/92870d9c-dcf8-45ad-af2b-9e827d319b8d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"92870d9c-dcf8-45ad-af2b-9e827d319b8d\") " 
pod="openstack/cinder-scheduler-0" Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.996746 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92870d9c-dcf8-45ad-af2b-9e827d319b8d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"92870d9c-dcf8-45ad-af2b-9e827d319b8d\") " pod="openstack/cinder-scheduler-0" Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.996767 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92870d9c-dcf8-45ad-af2b-9e827d319b8d-scripts\") pod \"cinder-scheduler-0\" (UID: \"92870d9c-dcf8-45ad-af2b-9e827d319b8d\") " pod="openstack/cinder-scheduler-0" Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.996793 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d13fcf27-6664-430e-b9ac-81ff65769a0c-openstack-config-secret\") pod \"openstackclient\" (UID: \"d13fcf27-6664-430e-b9ac-81ff65769a0c\") " pod="openstack/openstackclient" Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.996822 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d13fcf27-6664-430e-b9ac-81ff65769a0c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d13fcf27-6664-430e-b9ac-81ff65769a0c\") " pod="openstack/openstackclient" Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.996892 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-288p9\" (UniqueName: \"kubernetes.io/projected/92870d9c-dcf8-45ad-af2b-9e827d319b8d-kube-api-access-288p9\") pod \"cinder-scheduler-0\" (UID: \"92870d9c-dcf8-45ad-af2b-9e827d319b8d\") " pod="openstack/cinder-scheduler-0" Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.996919 5025 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9263b87e-395f-4a9f-b819-846f1378bd73-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-s92q7\" (UID: \"9263b87e-395f-4a9f-b819-846f1378bd73\") " pod="openstack/dnsmasq-dns-6bb4fc677f-s92q7" Oct 07 08:35:29 crc kubenswrapper[5025]: I1007 08:35:29.996944 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92870d9c-dcf8-45ad-af2b-9e827d319b8d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"92870d9c-dcf8-45ad-af2b-9e827d319b8d\") " pod="openstack/cinder-scheduler-0" Oct 07 08:35:30 crc kubenswrapper[5025]: I1007 08:35:30.000168 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/92870d9c-dcf8-45ad-af2b-9e827d319b8d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"92870d9c-dcf8-45ad-af2b-9e827d319b8d\") " pod="openstack/cinder-scheduler-0" Oct 07 08:35:30 crc kubenswrapper[5025]: I1007 08:35:30.003204 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d13fcf27-6664-430e-b9ac-81ff65769a0c-openstack-config-secret\") pod \"openstackclient\" (UID: \"d13fcf27-6664-430e-b9ac-81ff65769a0c\") " pod="openstack/openstackclient" Oct 07 08:35:30 crc kubenswrapper[5025]: I1007 08:35:30.003833 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d13fcf27-6664-430e-b9ac-81ff65769a0c-openstack-config\") pod \"openstackclient\" (UID: \"d13fcf27-6664-430e-b9ac-81ff65769a0c\") " pod="openstack/openstackclient" Oct 07 08:35:30 crc kubenswrapper[5025]: I1007 08:35:30.004021 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/92870d9c-dcf8-45ad-af2b-9e827d319b8d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"92870d9c-dcf8-45ad-af2b-9e827d319b8d\") " pod="openstack/cinder-scheduler-0" Oct 07 08:35:30 crc kubenswrapper[5025]: I1007 08:35:30.009052 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92870d9c-dcf8-45ad-af2b-9e827d319b8d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"92870d9c-dcf8-45ad-af2b-9e827d319b8d\") " pod="openstack/cinder-scheduler-0" Oct 07 08:35:30 crc kubenswrapper[5025]: I1007 08:35:30.010928 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92870d9c-dcf8-45ad-af2b-9e827d319b8d-scripts\") pod \"cinder-scheduler-0\" (UID: \"92870d9c-dcf8-45ad-af2b-9e827d319b8d\") " pod="openstack/cinder-scheduler-0" Oct 07 08:35:30 crc kubenswrapper[5025]: I1007 08:35:30.013891 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d13fcf27-6664-430e-b9ac-81ff65769a0c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d13fcf27-6664-430e-b9ac-81ff65769a0c\") " pod="openstack/openstackclient" Oct 07 08:35:30 crc kubenswrapper[5025]: I1007 08:35:30.020669 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf5lv\" (UniqueName: \"kubernetes.io/projected/d13fcf27-6664-430e-b9ac-81ff65769a0c-kube-api-access-xf5lv\") pod \"openstackclient\" (UID: \"d13fcf27-6664-430e-b9ac-81ff65769a0c\") " pod="openstack/openstackclient" Oct 07 08:35:30 crc kubenswrapper[5025]: I1007 08:35:30.026239 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-288p9\" (UniqueName: \"kubernetes.io/projected/92870d9c-dcf8-45ad-af2b-9e827d319b8d-kube-api-access-288p9\") pod \"cinder-scheduler-0\" (UID: \"92870d9c-dcf8-45ad-af2b-9e827d319b8d\") " 
pod="openstack/cinder-scheduler-0" Oct 07 08:35:30 crc kubenswrapper[5025]: I1007 08:35:30.027863 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92870d9c-dcf8-45ad-af2b-9e827d319b8d-config-data\") pod \"cinder-scheduler-0\" (UID: \"92870d9c-dcf8-45ad-af2b-9e827d319b8d\") " pod="openstack/cinder-scheduler-0" Oct 07 08:35:30 crc kubenswrapper[5025]: I1007 08:35:30.032357 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 07 08:35:30 crc kubenswrapper[5025]: I1007 08:35:30.033800 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 07 08:35:30 crc kubenswrapper[5025]: I1007 08:35:30.039907 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 07 08:35:30 crc kubenswrapper[5025]: I1007 08:35:30.051182 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 07 08:35:30 crc kubenswrapper[5025]: I1007 08:35:30.100320 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea5e2f4b-727b-494a-9cdb-97c0960e3e20-scripts\") pod \"cinder-api-0\" (UID: \"ea5e2f4b-727b-494a-9cdb-97c0960e3e20\") " pod="openstack/cinder-api-0" Oct 07 08:35:30 crc kubenswrapper[5025]: I1007 08:35:30.100377 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9263b87e-395f-4a9f-b819-846f1378bd73-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-s92q7\" (UID: \"9263b87e-395f-4a9f-b819-846f1378bd73\") " pod="openstack/dnsmasq-dns-6bb4fc677f-s92q7" Oct 07 08:35:30 crc kubenswrapper[5025]: I1007 08:35:30.100415 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/ea5e2f4b-727b-494a-9cdb-97c0960e3e20-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ea5e2f4b-727b-494a-9cdb-97c0960e3e20\") " pod="openstack/cinder-api-0" Oct 07 08:35:30 crc kubenswrapper[5025]: I1007 08:35:30.100435 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rv2q4\" (UniqueName: \"kubernetes.io/projected/9263b87e-395f-4a9f-b819-846f1378bd73-kube-api-access-rv2q4\") pod \"dnsmasq-dns-6bb4fc677f-s92q7\" (UID: \"9263b87e-395f-4a9f-b819-846f1378bd73\") " pod="openstack/dnsmasq-dns-6bb4fc677f-s92q7" Oct 07 08:35:30 crc kubenswrapper[5025]: I1007 08:35:30.100462 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ea5e2f4b-727b-494a-9cdb-97c0960e3e20-config-data-custom\") pod \"cinder-api-0\" (UID: \"ea5e2f4b-727b-494a-9cdb-97c0960e3e20\") " pod="openstack/cinder-api-0" Oct 07 08:35:30 crc kubenswrapper[5025]: I1007 08:35:30.100487 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkrc4\" (UniqueName: \"kubernetes.io/projected/ea5e2f4b-727b-494a-9cdb-97c0960e3e20-kube-api-access-mkrc4\") pod \"cinder-api-0\" (UID: \"ea5e2f4b-727b-494a-9cdb-97c0960e3e20\") " pod="openstack/cinder-api-0" Oct 07 08:35:30 crc kubenswrapper[5025]: I1007 08:35:30.100507 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9263b87e-395f-4a9f-b819-846f1378bd73-config\") pod \"dnsmasq-dns-6bb4fc677f-s92q7\" (UID: \"9263b87e-395f-4a9f-b819-846f1378bd73\") " pod="openstack/dnsmasq-dns-6bb4fc677f-s92q7" Oct 07 08:35:30 crc kubenswrapper[5025]: I1007 08:35:30.100531 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9263b87e-395f-4a9f-b819-846f1378bd73-ovsdbserver-sb\") pod 
\"dnsmasq-dns-6bb4fc677f-s92q7\" (UID: \"9263b87e-395f-4a9f-b819-846f1378bd73\") " pod="openstack/dnsmasq-dns-6bb4fc677f-s92q7" Oct 07 08:35:30 crc kubenswrapper[5025]: I1007 08:35:30.100567 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea5e2f4b-727b-494a-9cdb-97c0960e3e20-logs\") pod \"cinder-api-0\" (UID: \"ea5e2f4b-727b-494a-9cdb-97c0960e3e20\") " pod="openstack/cinder-api-0" Oct 07 08:35:30 crc kubenswrapper[5025]: I1007 08:35:30.100592 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9263b87e-395f-4a9f-b819-846f1378bd73-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-s92q7\" (UID: \"9263b87e-395f-4a9f-b819-846f1378bd73\") " pod="openstack/dnsmasq-dns-6bb4fc677f-s92q7" Oct 07 08:35:30 crc kubenswrapper[5025]: I1007 08:35:30.100611 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9263b87e-395f-4a9f-b819-846f1378bd73-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-s92q7\" (UID: \"9263b87e-395f-4a9f-b819-846f1378bd73\") " pod="openstack/dnsmasq-dns-6bb4fc677f-s92q7" Oct 07 08:35:30 crc kubenswrapper[5025]: I1007 08:35:30.100633 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea5e2f4b-727b-494a-9cdb-97c0960e3e20-config-data\") pod \"cinder-api-0\" (UID: \"ea5e2f4b-727b-494a-9cdb-97c0960e3e20\") " pod="openstack/cinder-api-0" Oct 07 08:35:30 crc kubenswrapper[5025]: I1007 08:35:30.100668 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea5e2f4b-727b-494a-9cdb-97c0960e3e20-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ea5e2f4b-727b-494a-9cdb-97c0960e3e20\") " 
pod="openstack/cinder-api-0" Oct 07 08:35:30 crc kubenswrapper[5025]: I1007 08:35:30.101790 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9263b87e-395f-4a9f-b819-846f1378bd73-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-s92q7\" (UID: \"9263b87e-395f-4a9f-b819-846f1378bd73\") " pod="openstack/dnsmasq-dns-6bb4fc677f-s92q7" Oct 07 08:35:30 crc kubenswrapper[5025]: I1007 08:35:30.102399 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9263b87e-395f-4a9f-b819-846f1378bd73-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-s92q7\" (UID: \"9263b87e-395f-4a9f-b819-846f1378bd73\") " pod="openstack/dnsmasq-dns-6bb4fc677f-s92q7" Oct 07 08:35:30 crc kubenswrapper[5025]: I1007 08:35:30.102497 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9263b87e-395f-4a9f-b819-846f1378bd73-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-s92q7\" (UID: \"9263b87e-395f-4a9f-b819-846f1378bd73\") " pod="openstack/dnsmasq-dns-6bb4fc677f-s92q7" Oct 07 08:35:30 crc kubenswrapper[5025]: I1007 08:35:30.102625 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9263b87e-395f-4a9f-b819-846f1378bd73-config\") pod \"dnsmasq-dns-6bb4fc677f-s92q7\" (UID: \"9263b87e-395f-4a9f-b819-846f1378bd73\") " pod="openstack/dnsmasq-dns-6bb4fc677f-s92q7" Oct 07 08:35:30 crc kubenswrapper[5025]: I1007 08:35:30.103091 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9263b87e-395f-4a9f-b819-846f1378bd73-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-s92q7\" (UID: \"9263b87e-395f-4a9f-b819-846f1378bd73\") " pod="openstack/dnsmasq-dns-6bb4fc677f-s92q7" Oct 07 08:35:30 crc kubenswrapper[5025]: I1007 08:35:30.113761 5025 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 07 08:35:30 crc kubenswrapper[5025]: I1007 08:35:30.125598 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv2q4\" (UniqueName: \"kubernetes.io/projected/9263b87e-395f-4a9f-b819-846f1378bd73-kube-api-access-rv2q4\") pod \"dnsmasq-dns-6bb4fc677f-s92q7\" (UID: \"9263b87e-395f-4a9f-b819-846f1378bd73\") " pod="openstack/dnsmasq-dns-6bb4fc677f-s92q7" Oct 07 08:35:30 crc kubenswrapper[5025]: I1007 08:35:30.142069 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 07 08:35:30 crc kubenswrapper[5025]: I1007 08:35:30.204449 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ea5e2f4b-727b-494a-9cdb-97c0960e3e20-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ea5e2f4b-727b-494a-9cdb-97c0960e3e20\") " pod="openstack/cinder-api-0" Oct 07 08:35:30 crc kubenswrapper[5025]: I1007 08:35:30.205411 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ea5e2f4b-727b-494a-9cdb-97c0960e3e20-config-data-custom\") pod \"cinder-api-0\" (UID: \"ea5e2f4b-727b-494a-9cdb-97c0960e3e20\") " pod="openstack/cinder-api-0" Oct 07 08:35:30 crc kubenswrapper[5025]: I1007 08:35:30.205441 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkrc4\" (UniqueName: \"kubernetes.io/projected/ea5e2f4b-727b-494a-9cdb-97c0960e3e20-kube-api-access-mkrc4\") pod \"cinder-api-0\" (UID: \"ea5e2f4b-727b-494a-9cdb-97c0960e3e20\") " pod="openstack/cinder-api-0" Oct 07 08:35:30 crc kubenswrapper[5025]: I1007 08:35:30.205479 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea5e2f4b-727b-494a-9cdb-97c0960e3e20-logs\") pod 
\"cinder-api-0\" (UID: \"ea5e2f4b-727b-494a-9cdb-97c0960e3e20\") " pod="openstack/cinder-api-0" Oct 07 08:35:30 crc kubenswrapper[5025]: I1007 08:35:30.205511 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea5e2f4b-727b-494a-9cdb-97c0960e3e20-config-data\") pod \"cinder-api-0\" (UID: \"ea5e2f4b-727b-494a-9cdb-97c0960e3e20\") " pod="openstack/cinder-api-0" Oct 07 08:35:30 crc kubenswrapper[5025]: I1007 08:35:30.207028 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea5e2f4b-727b-494a-9cdb-97c0960e3e20-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ea5e2f4b-727b-494a-9cdb-97c0960e3e20\") " pod="openstack/cinder-api-0" Oct 07 08:35:30 crc kubenswrapper[5025]: I1007 08:35:30.207145 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea5e2f4b-727b-494a-9cdb-97c0960e3e20-scripts\") pod \"cinder-api-0\" (UID: \"ea5e2f4b-727b-494a-9cdb-97c0960e3e20\") " pod="openstack/cinder-api-0" Oct 07 08:35:30 crc kubenswrapper[5025]: I1007 08:35:30.205349 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ea5e2f4b-727b-494a-9cdb-97c0960e3e20-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ea5e2f4b-727b-494a-9cdb-97c0960e3e20\") " pod="openstack/cinder-api-0" Oct 07 08:35:30 crc kubenswrapper[5025]: I1007 08:35:30.208809 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea5e2f4b-727b-494a-9cdb-97c0960e3e20-logs\") pod \"cinder-api-0\" (UID: \"ea5e2f4b-727b-494a-9cdb-97c0960e3e20\") " pod="openstack/cinder-api-0" Oct 07 08:35:30 crc kubenswrapper[5025]: I1007 08:35:30.220248 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ea5e2f4b-727b-494a-9cdb-97c0960e3e20-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ea5e2f4b-727b-494a-9cdb-97c0960e3e20\") " pod="openstack/cinder-api-0" Oct 07 08:35:30 crc kubenswrapper[5025]: I1007 08:35:30.223176 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea5e2f4b-727b-494a-9cdb-97c0960e3e20-scripts\") pod \"cinder-api-0\" (UID: \"ea5e2f4b-727b-494a-9cdb-97c0960e3e20\") " pod="openstack/cinder-api-0" Oct 07 08:35:30 crc kubenswrapper[5025]: I1007 08:35:30.224433 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea5e2f4b-727b-494a-9cdb-97c0960e3e20-config-data\") pod \"cinder-api-0\" (UID: \"ea5e2f4b-727b-494a-9cdb-97c0960e3e20\") " pod="openstack/cinder-api-0" Oct 07 08:35:30 crc kubenswrapper[5025]: I1007 08:35:30.225657 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ea5e2f4b-727b-494a-9cdb-97c0960e3e20-config-data-custom\") pod \"cinder-api-0\" (UID: \"ea5e2f4b-727b-494a-9cdb-97c0960e3e20\") " pod="openstack/cinder-api-0" Oct 07 08:35:30 crc kubenswrapper[5025]: I1007 08:35:30.241047 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkrc4\" (UniqueName: \"kubernetes.io/projected/ea5e2f4b-727b-494a-9cdb-97c0960e3e20-kube-api-access-mkrc4\") pod \"cinder-api-0\" (UID: \"ea5e2f4b-727b-494a-9cdb-97c0960e3e20\") " pod="openstack/cinder-api-0" Oct 07 08:35:30 crc kubenswrapper[5025]: I1007 08:35:30.257167 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-s92q7" Oct 07 08:35:30 crc kubenswrapper[5025]: I1007 08:35:30.515481 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 07 08:35:30 crc kubenswrapper[5025]: I1007 08:35:30.704668 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 07 08:35:31 crc kubenswrapper[5025]: I1007 08:35:31.000824 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 08:35:31 crc kubenswrapper[5025]: W1007 08:35:31.020753 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92870d9c_dcf8_45ad_af2b_9e827d319b8d.slice/crio-3bec688b31cbd159c7dc06b969ad4f90f42ea5657ea7637f5cc7f3f535028d68 WatchSource:0}: Error finding container 3bec688b31cbd159c7dc06b969ad4f90f42ea5657ea7637f5cc7f3f535028d68: Status 404 returned error can't find the container with id 3bec688b31cbd159c7dc06b969ad4f90f42ea5657ea7637f5cc7f3f535028d68 Oct 07 08:35:31 crc kubenswrapper[5025]: E1007 08:35:31.102276 5025 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13598166_30eb_43b2_8a13_2e2ca72f58e9.slice\": RecentStats: unable to find data in memory cache]" Oct 07 08:35:31 crc kubenswrapper[5025]: I1007 08:35:31.147749 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-s92q7"] Oct 07 08:35:31 crc kubenswrapper[5025]: I1007 08:35:31.219156 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 07 08:35:31 crc kubenswrapper[5025]: I1007 08:35:31.573794 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-s92q7" event={"ID":"9263b87e-395f-4a9f-b819-846f1378bd73","Type":"ContainerStarted","Data":"a9f0c9da89efd03e3fb2219e89987c9afaa09824f09fb6aa774118f1181cd71e"} Oct 07 08:35:31 crc kubenswrapper[5025]: I1007 08:35:31.573860 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-6bb4fc677f-s92q7" event={"ID":"9263b87e-395f-4a9f-b819-846f1378bd73","Type":"ContainerStarted","Data":"e654ea0b6a9e2505438ed37de230619e5bd152d84c17d252c622c186a85d6166"} Oct 07 08:35:31 crc kubenswrapper[5025]: I1007 08:35:31.575487 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ea5e2f4b-727b-494a-9cdb-97c0960e3e20","Type":"ContainerStarted","Data":"1ef261314215b14f2f1dbe512b4cbf09d44b11b3adff97d312a51ef499c15d27"} Oct 07 08:35:31 crc kubenswrapper[5025]: I1007 08:35:31.582492 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"92870d9c-dcf8-45ad-af2b-9e827d319b8d","Type":"ContainerStarted","Data":"3bec688b31cbd159c7dc06b969ad4f90f42ea5657ea7637f5cc7f3f535028d68"} Oct 07 08:35:31 crc kubenswrapper[5025]: I1007 08:35:31.583701 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"d13fcf27-6664-430e-b9ac-81ff65769a0c","Type":"ContainerStarted","Data":"313198ba1022ac3b30b333e91f7d17eadb47a726ef99c32d907d6214f3d77c98"} Oct 07 08:35:32 crc kubenswrapper[5025]: I1007 08:35:32.512025 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 07 08:35:32 crc kubenswrapper[5025]: I1007 08:35:32.553673 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-56bb8cc8-59x6x" Oct 07 08:35:32 crc kubenswrapper[5025]: I1007 08:35:32.587768 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-56bb8cc8-59x6x" Oct 07 08:35:32 crc kubenswrapper[5025]: I1007 08:35:32.640559 5025 generic.go:334] "Generic (PLEG): container finished" podID="9263b87e-395f-4a9f-b819-846f1378bd73" containerID="a9f0c9da89efd03e3fb2219e89987c9afaa09824f09fb6aa774118f1181cd71e" exitCode=0 Oct 07 08:35:32 crc kubenswrapper[5025]: I1007 08:35:32.640656 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-6bb4fc677f-s92q7" event={"ID":"9263b87e-395f-4a9f-b819-846f1378bd73","Type":"ContainerDied","Data":"a9f0c9da89efd03e3fb2219e89987c9afaa09824f09fb6aa774118f1181cd71e"} Oct 07 08:35:32 crc kubenswrapper[5025]: I1007 08:35:32.652086 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ea5e2f4b-727b-494a-9cdb-97c0960e3e20","Type":"ContainerStarted","Data":"76ba3edd0094e2fae5e484aff482f0b89c5fdfec15b44d75481d333ec92ca1fb"} Oct 07 08:35:32 crc kubenswrapper[5025]: I1007 08:35:32.684201 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6db9c4d6b-sjqbc"] Oct 07 08:35:32 crc kubenswrapper[5025]: I1007 08:35:32.684391 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6db9c4d6b-sjqbc" podUID="a66c793a-fadc-42c2-9a01-1b0cb017f773" containerName="barbican-api-log" containerID="cri-o://b0fbb09cea78008273b57dcb42126acb3d363f88d2a67d1525a4e5d1ee10b874" gracePeriod=30 Oct 07 08:35:32 crc kubenswrapper[5025]: I1007 08:35:32.684518 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6db9c4d6b-sjqbc" podUID="a66c793a-fadc-42c2-9a01-1b0cb017f773" containerName="barbican-api" containerID="cri-o://b6486aeb3f423dba47093e6e15d4101ef4367f2e384b1ea30754b55916ffe29d" gracePeriod=30 Oct 07 08:35:33 crc kubenswrapper[5025]: I1007 08:35:33.678896 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-s92q7" event={"ID":"9263b87e-395f-4a9f-b819-846f1378bd73","Type":"ContainerStarted","Data":"cb3691f244de4ca0830a089b6123949af76c90d82ba99d1f25155836f8070d32"} Oct 07 08:35:33 crc kubenswrapper[5025]: I1007 08:35:33.680534 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bb4fc677f-s92q7" Oct 07 08:35:33 crc kubenswrapper[5025]: I1007 08:35:33.698000 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-api-0" event={"ID":"ea5e2f4b-727b-494a-9cdb-97c0960e3e20","Type":"ContainerStarted","Data":"8c9424c1a6b8512ba867839089a3e3a279a4d287951d9b3b780bb5950bb8f1e1"} Oct 07 08:35:33 crc kubenswrapper[5025]: I1007 08:35:33.698334 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="ea5e2f4b-727b-494a-9cdb-97c0960e3e20" containerName="cinder-api-log" containerID="cri-o://76ba3edd0094e2fae5e484aff482f0b89c5fdfec15b44d75481d333ec92ca1fb" gracePeriod=30 Oct 07 08:35:33 crc kubenswrapper[5025]: I1007 08:35:33.698493 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 07 08:35:33 crc kubenswrapper[5025]: I1007 08:35:33.698656 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="ea5e2f4b-727b-494a-9cdb-97c0960e3e20" containerName="cinder-api" containerID="cri-o://8c9424c1a6b8512ba867839089a3e3a279a4d287951d9b3b780bb5950bb8f1e1" gracePeriod=30 Oct 07 08:35:33 crc kubenswrapper[5025]: I1007 08:35:33.707185 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bb4fc677f-s92q7" podStartSLOduration=4.707170814 podStartE2EDuration="4.707170814s" podCreationTimestamp="2025-10-07 08:35:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:35:33.706956157 +0000 UTC m=+1140.516270301" watchObservedRunningTime="2025-10-07 08:35:33.707170814 +0000 UTC m=+1140.516484958" Oct 07 08:35:33 crc kubenswrapper[5025]: I1007 08:35:33.710283 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"92870d9c-dcf8-45ad-af2b-9e827d319b8d","Type":"ContainerStarted","Data":"d46efd3f4e0ed3fe4fc4e27ff2c2fa3b7d03ca894a418941fdca5bfe63161f31"} Oct 07 08:35:33 crc kubenswrapper[5025]: I1007 08:35:33.716770 5025 generic.go:334] "Generic 
(PLEG): container finished" podID="a66c793a-fadc-42c2-9a01-1b0cb017f773" containerID="b0fbb09cea78008273b57dcb42126acb3d363f88d2a67d1525a4e5d1ee10b874" exitCode=143 Oct 07 08:35:33 crc kubenswrapper[5025]: I1007 08:35:33.716936 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6db9c4d6b-sjqbc" event={"ID":"a66c793a-fadc-42c2-9a01-1b0cb017f773","Type":"ContainerDied","Data":"b0fbb09cea78008273b57dcb42126acb3d363f88d2a67d1525a4e5d1ee10b874"} Oct 07 08:35:33 crc kubenswrapper[5025]: I1007 08:35:33.767932 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.767900808 podStartE2EDuration="4.767900808s" podCreationTimestamp="2025-10-07 08:35:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:35:33.75686177 +0000 UTC m=+1140.566175914" watchObservedRunningTime="2025-10-07 08:35:33.767900808 +0000 UTC m=+1140.577214952" Oct 07 08:35:34 crc kubenswrapper[5025]: I1007 08:35:34.720061 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 07 08:35:34 crc kubenswrapper[5025]: I1007 08:35:34.768612 5025 generic.go:334] "Generic (PLEG): container finished" podID="ea5e2f4b-727b-494a-9cdb-97c0960e3e20" containerID="8c9424c1a6b8512ba867839089a3e3a279a4d287951d9b3b780bb5950bb8f1e1" exitCode=0 Oct 07 08:35:34 crc kubenswrapper[5025]: I1007 08:35:34.768667 5025 generic.go:334] "Generic (PLEG): container finished" podID="ea5e2f4b-727b-494a-9cdb-97c0960e3e20" containerID="76ba3edd0094e2fae5e484aff482f0b89c5fdfec15b44d75481d333ec92ca1fb" exitCode=143 Oct 07 08:35:34 crc kubenswrapper[5025]: I1007 08:35:34.768769 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ea5e2f4b-727b-494a-9cdb-97c0960e3e20","Type":"ContainerDied","Data":"8c9424c1a6b8512ba867839089a3e3a279a4d287951d9b3b780bb5950bb8f1e1"} Oct 07 08:35:34 crc kubenswrapper[5025]: I1007 08:35:34.768805 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ea5e2f4b-727b-494a-9cdb-97c0960e3e20","Type":"ContainerDied","Data":"76ba3edd0094e2fae5e484aff482f0b89c5fdfec15b44d75481d333ec92ca1fb"} Oct 07 08:35:34 crc kubenswrapper[5025]: I1007 08:35:34.768844 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ea5e2f4b-727b-494a-9cdb-97c0960e3e20","Type":"ContainerDied","Data":"1ef261314215b14f2f1dbe512b4cbf09d44b11b3adff97d312a51ef499c15d27"} Oct 07 08:35:34 crc kubenswrapper[5025]: I1007 08:35:34.768862 5025 scope.go:117] "RemoveContainer" containerID="8c9424c1a6b8512ba867839089a3e3a279a4d287951d9b3b780bb5950bb8f1e1" Oct 07 08:35:34 crc kubenswrapper[5025]: I1007 08:35:34.768987 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 07 08:35:34 crc kubenswrapper[5025]: I1007 08:35:34.773091 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"92870d9c-dcf8-45ad-af2b-9e827d319b8d","Type":"ContainerStarted","Data":"71b89e941333f74d8472e6dcbb081fd3f0a629b202d131e64c477de5cc046c30"} Oct 07 08:35:34 crc kubenswrapper[5025]: I1007 08:35:34.808362 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.765236464 podStartE2EDuration="5.808343598s" podCreationTimestamp="2025-10-07 08:35:29 +0000 UTC" firstStartedPulling="2025-10-07 08:35:31.028957265 +0000 UTC m=+1137.838271409" lastFinishedPulling="2025-10-07 08:35:32.072064399 +0000 UTC m=+1138.881378543" observedRunningTime="2025-10-07 08:35:34.804208047 +0000 UTC m=+1141.613522211" watchObservedRunningTime="2025-10-07 08:35:34.808343598 +0000 UTC m=+1141.617657742" Oct 07 08:35:34 crc kubenswrapper[5025]: I1007 08:35:34.808824 5025 scope.go:117] "RemoveContainer" containerID="76ba3edd0094e2fae5e484aff482f0b89c5fdfec15b44d75481d333ec92ca1fb" Oct 07 08:35:34 crc kubenswrapper[5025]: I1007 08:35:34.830672 5025 scope.go:117] "RemoveContainer" containerID="8c9424c1a6b8512ba867839089a3e3a279a4d287951d9b3b780bb5950bb8f1e1" Oct 07 08:35:34 crc kubenswrapper[5025]: E1007 08:35:34.838237 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c9424c1a6b8512ba867839089a3e3a279a4d287951d9b3b780bb5950bb8f1e1\": container with ID starting with 8c9424c1a6b8512ba867839089a3e3a279a4d287951d9b3b780bb5950bb8f1e1 not found: ID does not exist" containerID="8c9424c1a6b8512ba867839089a3e3a279a4d287951d9b3b780bb5950bb8f1e1" Oct 07 08:35:34 crc kubenswrapper[5025]: I1007 08:35:34.838276 5025 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8c9424c1a6b8512ba867839089a3e3a279a4d287951d9b3b780bb5950bb8f1e1"} err="failed to get container status \"8c9424c1a6b8512ba867839089a3e3a279a4d287951d9b3b780bb5950bb8f1e1\": rpc error: code = NotFound desc = could not find container \"8c9424c1a6b8512ba867839089a3e3a279a4d287951d9b3b780bb5950bb8f1e1\": container with ID starting with 8c9424c1a6b8512ba867839089a3e3a279a4d287951d9b3b780bb5950bb8f1e1 not found: ID does not exist" Oct 07 08:35:34 crc kubenswrapper[5025]: I1007 08:35:34.838302 5025 scope.go:117] "RemoveContainer" containerID="76ba3edd0094e2fae5e484aff482f0b89c5fdfec15b44d75481d333ec92ca1fb" Oct 07 08:35:34 crc kubenswrapper[5025]: E1007 08:35:34.839225 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76ba3edd0094e2fae5e484aff482f0b89c5fdfec15b44d75481d333ec92ca1fb\": container with ID starting with 76ba3edd0094e2fae5e484aff482f0b89c5fdfec15b44d75481d333ec92ca1fb not found: ID does not exist" containerID="76ba3edd0094e2fae5e484aff482f0b89c5fdfec15b44d75481d333ec92ca1fb" Oct 07 08:35:34 crc kubenswrapper[5025]: I1007 08:35:34.839251 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76ba3edd0094e2fae5e484aff482f0b89c5fdfec15b44d75481d333ec92ca1fb"} err="failed to get container status \"76ba3edd0094e2fae5e484aff482f0b89c5fdfec15b44d75481d333ec92ca1fb\": rpc error: code = NotFound desc = could not find container \"76ba3edd0094e2fae5e484aff482f0b89c5fdfec15b44d75481d333ec92ca1fb\": container with ID starting with 76ba3edd0094e2fae5e484aff482f0b89c5fdfec15b44d75481d333ec92ca1fb not found: ID does not exist" Oct 07 08:35:34 crc kubenswrapper[5025]: I1007 08:35:34.839267 5025 scope.go:117] "RemoveContainer" containerID="8c9424c1a6b8512ba867839089a3e3a279a4d287951d9b3b780bb5950bb8f1e1" Oct 07 08:35:34 crc kubenswrapper[5025]: I1007 08:35:34.839470 5025 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"8c9424c1a6b8512ba867839089a3e3a279a4d287951d9b3b780bb5950bb8f1e1"} err="failed to get container status \"8c9424c1a6b8512ba867839089a3e3a279a4d287951d9b3b780bb5950bb8f1e1\": rpc error: code = NotFound desc = could not find container \"8c9424c1a6b8512ba867839089a3e3a279a4d287951d9b3b780bb5950bb8f1e1\": container with ID starting with 8c9424c1a6b8512ba867839089a3e3a279a4d287951d9b3b780bb5950bb8f1e1 not found: ID does not exist" Oct 07 08:35:34 crc kubenswrapper[5025]: I1007 08:35:34.839484 5025 scope.go:117] "RemoveContainer" containerID="76ba3edd0094e2fae5e484aff482f0b89c5fdfec15b44d75481d333ec92ca1fb" Oct 07 08:35:34 crc kubenswrapper[5025]: I1007 08:35:34.839753 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76ba3edd0094e2fae5e484aff482f0b89c5fdfec15b44d75481d333ec92ca1fb"} err="failed to get container status \"76ba3edd0094e2fae5e484aff482f0b89c5fdfec15b44d75481d333ec92ca1fb\": rpc error: code = NotFound desc = could not find container \"76ba3edd0094e2fae5e484aff482f0b89c5fdfec15b44d75481d333ec92ca1fb\": container with ID starting with 76ba3edd0094e2fae5e484aff482f0b89c5fdfec15b44d75481d333ec92ca1fb not found: ID does not exist" Oct 07 08:35:34 crc kubenswrapper[5025]: I1007 08:35:34.843368 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ea5e2f4b-727b-494a-9cdb-97c0960e3e20-config-data-custom\") pod \"ea5e2f4b-727b-494a-9cdb-97c0960e3e20\" (UID: \"ea5e2f4b-727b-494a-9cdb-97c0960e3e20\") " Oct 07 08:35:34 crc kubenswrapper[5025]: I1007 08:35:34.843431 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea5e2f4b-727b-494a-9cdb-97c0960e3e20-combined-ca-bundle\") pod \"ea5e2f4b-727b-494a-9cdb-97c0960e3e20\" (UID: \"ea5e2f4b-727b-494a-9cdb-97c0960e3e20\") " Oct 07 08:35:34 crc 
kubenswrapper[5025]: I1007 08:35:34.843606 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ea5e2f4b-727b-494a-9cdb-97c0960e3e20-etc-machine-id\") pod \"ea5e2f4b-727b-494a-9cdb-97c0960e3e20\" (UID: \"ea5e2f4b-727b-494a-9cdb-97c0960e3e20\") " Oct 07 08:35:34 crc kubenswrapper[5025]: I1007 08:35:34.843642 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea5e2f4b-727b-494a-9cdb-97c0960e3e20-scripts\") pod \"ea5e2f4b-727b-494a-9cdb-97c0960e3e20\" (UID: \"ea5e2f4b-727b-494a-9cdb-97c0960e3e20\") " Oct 07 08:35:34 crc kubenswrapper[5025]: I1007 08:35:34.843690 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkrc4\" (UniqueName: \"kubernetes.io/projected/ea5e2f4b-727b-494a-9cdb-97c0960e3e20-kube-api-access-mkrc4\") pod \"ea5e2f4b-727b-494a-9cdb-97c0960e3e20\" (UID: \"ea5e2f4b-727b-494a-9cdb-97c0960e3e20\") " Oct 07 08:35:34 crc kubenswrapper[5025]: I1007 08:35:34.843730 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea5e2f4b-727b-494a-9cdb-97c0960e3e20-logs\") pod \"ea5e2f4b-727b-494a-9cdb-97c0960e3e20\" (UID: \"ea5e2f4b-727b-494a-9cdb-97c0960e3e20\") " Oct 07 08:35:34 crc kubenswrapper[5025]: I1007 08:35:34.843924 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea5e2f4b-727b-494a-9cdb-97c0960e3e20-config-data\") pod \"ea5e2f4b-727b-494a-9cdb-97c0960e3e20\" (UID: \"ea5e2f4b-727b-494a-9cdb-97c0960e3e20\") " Oct 07 08:35:34 crc kubenswrapper[5025]: I1007 08:35:34.846496 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ea5e2f4b-727b-494a-9cdb-97c0960e3e20-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod 
"ea5e2f4b-727b-494a-9cdb-97c0960e3e20" (UID: "ea5e2f4b-727b-494a-9cdb-97c0960e3e20"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 08:35:34 crc kubenswrapper[5025]: I1007 08:35:34.847144 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea5e2f4b-727b-494a-9cdb-97c0960e3e20-logs" (OuterVolumeSpecName: "logs") pod "ea5e2f4b-727b-494a-9cdb-97c0960e3e20" (UID: "ea5e2f4b-727b-494a-9cdb-97c0960e3e20"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:35:34 crc kubenswrapper[5025]: I1007 08:35:34.850911 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea5e2f4b-727b-494a-9cdb-97c0960e3e20-scripts" (OuterVolumeSpecName: "scripts") pod "ea5e2f4b-727b-494a-9cdb-97c0960e3e20" (UID: "ea5e2f4b-727b-494a-9cdb-97c0960e3e20"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:35:34 crc kubenswrapper[5025]: I1007 08:35:34.851370 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea5e2f4b-727b-494a-9cdb-97c0960e3e20-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ea5e2f4b-727b-494a-9cdb-97c0960e3e20" (UID: "ea5e2f4b-727b-494a-9cdb-97c0960e3e20"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:35:34 crc kubenswrapper[5025]: I1007 08:35:34.852616 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea5e2f4b-727b-494a-9cdb-97c0960e3e20-kube-api-access-mkrc4" (OuterVolumeSpecName: "kube-api-access-mkrc4") pod "ea5e2f4b-727b-494a-9cdb-97c0960e3e20" (UID: "ea5e2f4b-727b-494a-9cdb-97c0960e3e20"). InnerVolumeSpecName "kube-api-access-mkrc4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:35:34 crc kubenswrapper[5025]: I1007 08:35:34.885758 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea5e2f4b-727b-494a-9cdb-97c0960e3e20-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ea5e2f4b-727b-494a-9cdb-97c0960e3e20" (UID: "ea5e2f4b-727b-494a-9cdb-97c0960e3e20"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:35:34 crc kubenswrapper[5025]: I1007 08:35:34.907323 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea5e2f4b-727b-494a-9cdb-97c0960e3e20-config-data" (OuterVolumeSpecName: "config-data") pod "ea5e2f4b-727b-494a-9cdb-97c0960e3e20" (UID: "ea5e2f4b-727b-494a-9cdb-97c0960e3e20"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:35:34 crc kubenswrapper[5025]: I1007 08:35:34.945734 5025 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ea5e2f4b-727b-494a-9cdb-97c0960e3e20-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:34 crc kubenswrapper[5025]: I1007 08:35:34.945770 5025 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea5e2f4b-727b-494a-9cdb-97c0960e3e20-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:34 crc kubenswrapper[5025]: I1007 08:35:34.945784 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkrc4\" (UniqueName: \"kubernetes.io/projected/ea5e2f4b-727b-494a-9cdb-97c0960e3e20-kube-api-access-mkrc4\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:34 crc kubenswrapper[5025]: I1007 08:35:34.945798 5025 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea5e2f4b-727b-494a-9cdb-97c0960e3e20-logs\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:34 crc 
kubenswrapper[5025]: I1007 08:35:34.945808 5025 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea5e2f4b-727b-494a-9cdb-97c0960e3e20-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:34 crc kubenswrapper[5025]: I1007 08:35:34.945819 5025 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ea5e2f4b-727b-494a-9cdb-97c0960e3e20-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:34 crc kubenswrapper[5025]: I1007 08:35:34.945829 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea5e2f4b-727b-494a-9cdb-97c0960e3e20-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:35 crc kubenswrapper[5025]: I1007 08:35:35.106912 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 07 08:35:35 crc kubenswrapper[5025]: I1007 08:35:35.124586 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 07 08:35:35 crc kubenswrapper[5025]: I1007 08:35:35.141662 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 07 08:35:35 crc kubenswrapper[5025]: E1007 08:35:35.142036 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea5e2f4b-727b-494a-9cdb-97c0960e3e20" containerName="cinder-api-log" Oct 07 08:35:35 crc kubenswrapper[5025]: I1007 08:35:35.142054 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea5e2f4b-727b-494a-9cdb-97c0960e3e20" containerName="cinder-api-log" Oct 07 08:35:35 crc kubenswrapper[5025]: E1007 08:35:35.142068 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea5e2f4b-727b-494a-9cdb-97c0960e3e20" containerName="cinder-api" Oct 07 08:35:35 crc kubenswrapper[5025]: I1007 08:35:35.142075 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea5e2f4b-727b-494a-9cdb-97c0960e3e20" 
containerName="cinder-api" Oct 07 08:35:35 crc kubenswrapper[5025]: I1007 08:35:35.142308 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea5e2f4b-727b-494a-9cdb-97c0960e3e20" containerName="cinder-api-log" Oct 07 08:35:35 crc kubenswrapper[5025]: I1007 08:35:35.142327 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea5e2f4b-727b-494a-9cdb-97c0960e3e20" containerName="cinder-api" Oct 07 08:35:35 crc kubenswrapper[5025]: I1007 08:35:35.143198 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 07 08:35:35 crc kubenswrapper[5025]: I1007 08:35:35.143314 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 07 08:35:35 crc kubenswrapper[5025]: I1007 08:35:35.147524 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 07 08:35:35 crc kubenswrapper[5025]: I1007 08:35:35.147744 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 07 08:35:35 crc kubenswrapper[5025]: I1007 08:35:35.148139 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 07 08:35:35 crc kubenswrapper[5025]: I1007 08:35:35.158514 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 07 08:35:35 crc kubenswrapper[5025]: I1007 08:35:35.251032 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78-config-data-custom\") pod \"cinder-api-0\" (UID: \"fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78\") " pod="openstack/cinder-api-0" Oct 07 08:35:35 crc kubenswrapper[5025]: I1007 08:35:35.251078 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc9mg\" 
(UniqueName: \"kubernetes.io/projected/fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78-kube-api-access-cc9mg\") pod \"cinder-api-0\" (UID: \"fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78\") " pod="openstack/cinder-api-0" Oct 07 08:35:35 crc kubenswrapper[5025]: I1007 08:35:35.251115 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78\") " pod="openstack/cinder-api-0" Oct 07 08:35:35 crc kubenswrapper[5025]: I1007 08:35:35.251163 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78-config-data\") pod \"cinder-api-0\" (UID: \"fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78\") " pod="openstack/cinder-api-0" Oct 07 08:35:35 crc kubenswrapper[5025]: I1007 08:35:35.251332 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78\") " pod="openstack/cinder-api-0" Oct 07 08:35:35 crc kubenswrapper[5025]: I1007 08:35:35.251391 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78-scripts\") pod \"cinder-api-0\" (UID: \"fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78\") " pod="openstack/cinder-api-0" Oct 07 08:35:35 crc kubenswrapper[5025]: I1007 08:35:35.251432 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78-public-tls-certs\") pod \"cinder-api-0\" (UID: 
\"fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78\") " pod="openstack/cinder-api-0" Oct 07 08:35:35 crc kubenswrapper[5025]: I1007 08:35:35.251486 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78-logs\") pod \"cinder-api-0\" (UID: \"fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78\") " pod="openstack/cinder-api-0" Oct 07 08:35:35 crc kubenswrapper[5025]: I1007 08:35:35.251557 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78-etc-machine-id\") pod \"cinder-api-0\" (UID: \"fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78\") " pod="openstack/cinder-api-0" Oct 07 08:35:35 crc kubenswrapper[5025]: I1007 08:35:35.353495 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cc9mg\" (UniqueName: \"kubernetes.io/projected/fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78-kube-api-access-cc9mg\") pod \"cinder-api-0\" (UID: \"fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78\") " pod="openstack/cinder-api-0" Oct 07 08:35:35 crc kubenswrapper[5025]: I1007 08:35:35.353583 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78\") " pod="openstack/cinder-api-0" Oct 07 08:35:35 crc kubenswrapper[5025]: I1007 08:35:35.353614 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78-config-data\") pod \"cinder-api-0\" (UID: \"fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78\") " pod="openstack/cinder-api-0" Oct 07 08:35:35 crc kubenswrapper[5025]: I1007 08:35:35.354476 5025 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78\") " pod="openstack/cinder-api-0" Oct 07 08:35:35 crc kubenswrapper[5025]: I1007 08:35:35.354951 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78-scripts\") pod \"cinder-api-0\" (UID: \"fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78\") " pod="openstack/cinder-api-0" Oct 07 08:35:35 crc kubenswrapper[5025]: I1007 08:35:35.355006 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78-public-tls-certs\") pod \"cinder-api-0\" (UID: \"fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78\") " pod="openstack/cinder-api-0" Oct 07 08:35:35 crc kubenswrapper[5025]: I1007 08:35:35.355085 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78-logs\") pod \"cinder-api-0\" (UID: \"fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78\") " pod="openstack/cinder-api-0" Oct 07 08:35:35 crc kubenswrapper[5025]: I1007 08:35:35.355114 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78-etc-machine-id\") pod \"cinder-api-0\" (UID: \"fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78\") " pod="openstack/cinder-api-0" Oct 07 08:35:35 crc kubenswrapper[5025]: I1007 08:35:35.355337 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78-config-data-custom\") pod \"cinder-api-0\" (UID: \"fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78\") " 
pod="openstack/cinder-api-0" Oct 07 08:35:35 crc kubenswrapper[5025]: I1007 08:35:35.355880 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78-logs\") pod \"cinder-api-0\" (UID: \"fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78\") " pod="openstack/cinder-api-0" Oct 07 08:35:35 crc kubenswrapper[5025]: I1007 08:35:35.356724 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78-etc-machine-id\") pod \"cinder-api-0\" (UID: \"fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78\") " pod="openstack/cinder-api-0" Oct 07 08:35:35 crc kubenswrapper[5025]: I1007 08:35:35.358208 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78\") " pod="openstack/cinder-api-0" Oct 07 08:35:35 crc kubenswrapper[5025]: I1007 08:35:35.358359 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78-scripts\") pod \"cinder-api-0\" (UID: \"fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78\") " pod="openstack/cinder-api-0" Oct 07 08:35:35 crc kubenswrapper[5025]: I1007 08:35:35.358800 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78-config-data-custom\") pod \"cinder-api-0\" (UID: \"fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78\") " pod="openstack/cinder-api-0" Oct 07 08:35:35 crc kubenswrapper[5025]: I1007 08:35:35.359856 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78-config-data\") pod 
\"cinder-api-0\" (UID: \"fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78\") " pod="openstack/cinder-api-0" Oct 07 08:35:35 crc kubenswrapper[5025]: I1007 08:35:35.360668 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78-public-tls-certs\") pod \"cinder-api-0\" (UID: \"fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78\") " pod="openstack/cinder-api-0" Oct 07 08:35:35 crc kubenswrapper[5025]: I1007 08:35:35.362457 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78\") " pod="openstack/cinder-api-0" Oct 07 08:35:35 crc kubenswrapper[5025]: I1007 08:35:35.371060 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc9mg\" (UniqueName: \"kubernetes.io/projected/fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78-kube-api-access-cc9mg\") pod \"cinder-api-0\" (UID: \"fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78\") " pod="openstack/cinder-api-0" Oct 07 08:35:35 crc kubenswrapper[5025]: I1007 08:35:35.463077 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 07 08:35:35 crc kubenswrapper[5025]: I1007 08:35:35.928997 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea5e2f4b-727b-494a-9cdb-97c0960e3e20" path="/var/lib/kubelet/pods/ea5e2f4b-727b-494a-9cdb-97c0960e3e20/volumes" Oct 07 08:35:35 crc kubenswrapper[5025]: I1007 08:35:35.939899 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 07 08:35:36 crc kubenswrapper[5025]: I1007 08:35:36.286564 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-6c4bb756cf-6b9fc"] Oct 07 08:35:36 crc kubenswrapper[5025]: I1007 08:35:36.288758 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6c4bb756cf-6b9fc" Oct 07 08:35:36 crc kubenswrapper[5025]: I1007 08:35:36.291797 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 07 08:35:36 crc kubenswrapper[5025]: I1007 08:35:36.293700 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Oct 07 08:35:36 crc kubenswrapper[5025]: I1007 08:35:36.293946 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Oct 07 08:35:36 crc kubenswrapper[5025]: I1007 08:35:36.316596 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6c4bb756cf-6b9fc"] Oct 07 08:35:36 crc kubenswrapper[5025]: I1007 08:35:36.400587 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a34d85b-3d52-4c22-bb89-ca22d9f45eeb-internal-tls-certs\") pod \"swift-proxy-6c4bb756cf-6b9fc\" (UID: \"5a34d85b-3d52-4c22-bb89-ca22d9f45eeb\") " pod="openstack/swift-proxy-6c4bb756cf-6b9fc" Oct 07 08:35:36 crc kubenswrapper[5025]: I1007 08:35:36.400635 5025 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a34d85b-3d52-4c22-bb89-ca22d9f45eeb-public-tls-certs\") pod \"swift-proxy-6c4bb756cf-6b9fc\" (UID: \"5a34d85b-3d52-4c22-bb89-ca22d9f45eeb\") " pod="openstack/swift-proxy-6c4bb756cf-6b9fc" Oct 07 08:35:36 crc kubenswrapper[5025]: I1007 08:35:36.400676 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a34d85b-3d52-4c22-bb89-ca22d9f45eeb-config-data\") pod \"swift-proxy-6c4bb756cf-6b9fc\" (UID: \"5a34d85b-3d52-4c22-bb89-ca22d9f45eeb\") " pod="openstack/swift-proxy-6c4bb756cf-6b9fc" Oct 07 08:35:36 crc kubenswrapper[5025]: I1007 08:35:36.400853 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a34d85b-3d52-4c22-bb89-ca22d9f45eeb-run-httpd\") pod \"swift-proxy-6c4bb756cf-6b9fc\" (UID: \"5a34d85b-3d52-4c22-bb89-ca22d9f45eeb\") " pod="openstack/swift-proxy-6c4bb756cf-6b9fc" Oct 07 08:35:36 crc kubenswrapper[5025]: I1007 08:35:36.400972 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a34d85b-3d52-4c22-bb89-ca22d9f45eeb-combined-ca-bundle\") pod \"swift-proxy-6c4bb756cf-6b9fc\" (UID: \"5a34d85b-3d52-4c22-bb89-ca22d9f45eeb\") " pod="openstack/swift-proxy-6c4bb756cf-6b9fc" Oct 07 08:35:36 crc kubenswrapper[5025]: I1007 08:35:36.401133 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2jcv\" (UniqueName: \"kubernetes.io/projected/5a34d85b-3d52-4c22-bb89-ca22d9f45eeb-kube-api-access-r2jcv\") pod \"swift-proxy-6c4bb756cf-6b9fc\" (UID: \"5a34d85b-3d52-4c22-bb89-ca22d9f45eeb\") " pod="openstack/swift-proxy-6c4bb756cf-6b9fc" Oct 07 08:35:36 crc kubenswrapper[5025]: I1007 
08:35:36.401237 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5a34d85b-3d52-4c22-bb89-ca22d9f45eeb-etc-swift\") pod \"swift-proxy-6c4bb756cf-6b9fc\" (UID: \"5a34d85b-3d52-4c22-bb89-ca22d9f45eeb\") " pod="openstack/swift-proxy-6c4bb756cf-6b9fc" Oct 07 08:35:36 crc kubenswrapper[5025]: I1007 08:35:36.401268 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a34d85b-3d52-4c22-bb89-ca22d9f45eeb-log-httpd\") pod \"swift-proxy-6c4bb756cf-6b9fc\" (UID: \"5a34d85b-3d52-4c22-bb89-ca22d9f45eeb\") " pod="openstack/swift-proxy-6c4bb756cf-6b9fc" Oct 07 08:35:36 crc kubenswrapper[5025]: I1007 08:35:36.504353 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5a34d85b-3d52-4c22-bb89-ca22d9f45eeb-etc-swift\") pod \"swift-proxy-6c4bb756cf-6b9fc\" (UID: \"5a34d85b-3d52-4c22-bb89-ca22d9f45eeb\") " pod="openstack/swift-proxy-6c4bb756cf-6b9fc" Oct 07 08:35:36 crc kubenswrapper[5025]: I1007 08:35:36.504404 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a34d85b-3d52-4c22-bb89-ca22d9f45eeb-log-httpd\") pod \"swift-proxy-6c4bb756cf-6b9fc\" (UID: \"5a34d85b-3d52-4c22-bb89-ca22d9f45eeb\") " pod="openstack/swift-proxy-6c4bb756cf-6b9fc" Oct 07 08:35:36 crc kubenswrapper[5025]: I1007 08:35:36.504452 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a34d85b-3d52-4c22-bb89-ca22d9f45eeb-internal-tls-certs\") pod \"swift-proxy-6c4bb756cf-6b9fc\" (UID: \"5a34d85b-3d52-4c22-bb89-ca22d9f45eeb\") " pod="openstack/swift-proxy-6c4bb756cf-6b9fc" Oct 07 08:35:36 crc kubenswrapper[5025]: I1007 08:35:36.504974 5025 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a34d85b-3d52-4c22-bb89-ca22d9f45eeb-public-tls-certs\") pod \"swift-proxy-6c4bb756cf-6b9fc\" (UID: \"5a34d85b-3d52-4c22-bb89-ca22d9f45eeb\") " pod="openstack/swift-proxy-6c4bb756cf-6b9fc" Oct 07 08:35:36 crc kubenswrapper[5025]: I1007 08:35:36.505055 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a34d85b-3d52-4c22-bb89-ca22d9f45eeb-config-data\") pod \"swift-proxy-6c4bb756cf-6b9fc\" (UID: \"5a34d85b-3d52-4c22-bb89-ca22d9f45eeb\") " pod="openstack/swift-proxy-6c4bb756cf-6b9fc" Oct 07 08:35:36 crc kubenswrapper[5025]: I1007 08:35:36.505366 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a34d85b-3d52-4c22-bb89-ca22d9f45eeb-run-httpd\") pod \"swift-proxy-6c4bb756cf-6b9fc\" (UID: \"5a34d85b-3d52-4c22-bb89-ca22d9f45eeb\") " pod="openstack/swift-proxy-6c4bb756cf-6b9fc" Oct 07 08:35:36 crc kubenswrapper[5025]: I1007 08:35:36.505668 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a34d85b-3d52-4c22-bb89-ca22d9f45eeb-combined-ca-bundle\") pod \"swift-proxy-6c4bb756cf-6b9fc\" (UID: \"5a34d85b-3d52-4c22-bb89-ca22d9f45eeb\") " pod="openstack/swift-proxy-6c4bb756cf-6b9fc" Oct 07 08:35:36 crc kubenswrapper[5025]: I1007 08:35:36.506045 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2jcv\" (UniqueName: \"kubernetes.io/projected/5a34d85b-3d52-4c22-bb89-ca22d9f45eeb-kube-api-access-r2jcv\") pod \"swift-proxy-6c4bb756cf-6b9fc\" (UID: \"5a34d85b-3d52-4c22-bb89-ca22d9f45eeb\") " pod="openstack/swift-proxy-6c4bb756cf-6b9fc" Oct 07 08:35:36 crc kubenswrapper[5025]: I1007 08:35:36.506831 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a34d85b-3d52-4c22-bb89-ca22d9f45eeb-run-httpd\") pod \"swift-proxy-6c4bb756cf-6b9fc\" (UID: \"5a34d85b-3d52-4c22-bb89-ca22d9f45eeb\") " pod="openstack/swift-proxy-6c4bb756cf-6b9fc" Oct 07 08:35:36 crc kubenswrapper[5025]: I1007 08:35:36.508879 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a34d85b-3d52-4c22-bb89-ca22d9f45eeb-log-httpd\") pod \"swift-proxy-6c4bb756cf-6b9fc\" (UID: \"5a34d85b-3d52-4c22-bb89-ca22d9f45eeb\") " pod="openstack/swift-proxy-6c4bb756cf-6b9fc" Oct 07 08:35:36 crc kubenswrapper[5025]: I1007 08:35:36.511975 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a34d85b-3d52-4c22-bb89-ca22d9f45eeb-public-tls-certs\") pod \"swift-proxy-6c4bb756cf-6b9fc\" (UID: \"5a34d85b-3d52-4c22-bb89-ca22d9f45eeb\") " pod="openstack/swift-proxy-6c4bb756cf-6b9fc" Oct 07 08:35:36 crc kubenswrapper[5025]: I1007 08:35:36.512406 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a34d85b-3d52-4c22-bb89-ca22d9f45eeb-internal-tls-certs\") pod \"swift-proxy-6c4bb756cf-6b9fc\" (UID: \"5a34d85b-3d52-4c22-bb89-ca22d9f45eeb\") " pod="openstack/swift-proxy-6c4bb756cf-6b9fc" Oct 07 08:35:36 crc kubenswrapper[5025]: I1007 08:35:36.513216 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5a34d85b-3d52-4c22-bb89-ca22d9f45eeb-etc-swift\") pod \"swift-proxy-6c4bb756cf-6b9fc\" (UID: \"5a34d85b-3d52-4c22-bb89-ca22d9f45eeb\") " pod="openstack/swift-proxy-6c4bb756cf-6b9fc" Oct 07 08:35:36 crc kubenswrapper[5025]: I1007 08:35:36.514326 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a34d85b-3d52-4c22-bb89-ca22d9f45eeb-config-data\") pod 
\"swift-proxy-6c4bb756cf-6b9fc\" (UID: \"5a34d85b-3d52-4c22-bb89-ca22d9f45eeb\") " pod="openstack/swift-proxy-6c4bb756cf-6b9fc" Oct 07 08:35:36 crc kubenswrapper[5025]: I1007 08:35:36.523373 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2jcv\" (UniqueName: \"kubernetes.io/projected/5a34d85b-3d52-4c22-bb89-ca22d9f45eeb-kube-api-access-r2jcv\") pod \"swift-proxy-6c4bb756cf-6b9fc\" (UID: \"5a34d85b-3d52-4c22-bb89-ca22d9f45eeb\") " pod="openstack/swift-proxy-6c4bb756cf-6b9fc" Oct 07 08:35:36 crc kubenswrapper[5025]: I1007 08:35:36.524073 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a34d85b-3d52-4c22-bb89-ca22d9f45eeb-combined-ca-bundle\") pod \"swift-proxy-6c4bb756cf-6b9fc\" (UID: \"5a34d85b-3d52-4c22-bb89-ca22d9f45eeb\") " pod="openstack/swift-proxy-6c4bb756cf-6b9fc" Oct 07 08:35:36 crc kubenswrapper[5025]: I1007 08:35:36.613363 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-6c4bb756cf-6b9fc" Oct 07 08:35:36 crc kubenswrapper[5025]: I1007 08:35:36.825811 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6db9c4d6b-sjqbc" event={"ID":"a66c793a-fadc-42c2-9a01-1b0cb017f773","Type":"ContainerDied","Data":"b6486aeb3f423dba47093e6e15d4101ef4367f2e384b1ea30754b55916ffe29d"} Oct 07 08:35:36 crc kubenswrapper[5025]: I1007 08:35:36.825815 5025 generic.go:334] "Generic (PLEG): container finished" podID="a66c793a-fadc-42c2-9a01-1b0cb017f773" containerID="b6486aeb3f423dba47093e6e15d4101ef4367f2e384b1ea30754b55916ffe29d" exitCode=0 Oct 07 08:35:36 crc kubenswrapper[5025]: I1007 08:35:36.829461 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78","Type":"ContainerStarted","Data":"f3589738c71b80a2b66f3651e465e35c090e223c8027cb49f147e9e055629164"} Oct 07 08:35:36 crc kubenswrapper[5025]: I1007 08:35:36.829506 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78","Type":"ContainerStarted","Data":"6fac0f52246aa5d51796f306c60f390b41f93a5103e192574855e1fc4e5665ce"} Oct 07 08:35:36 crc kubenswrapper[5025]: I1007 08:35:36.880858 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 08:35:36 crc kubenswrapper[5025]: I1007 08:35:36.881143 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7ed584b3-29d6-4ad5-8ac1-a870c230d19f" containerName="ceilometer-central-agent" containerID="cri-o://675452d23586fab1106526c4197028bb8b9d95a6d374c44a64f5fe305e6465d0" gracePeriod=30 Oct 07 08:35:36 crc kubenswrapper[5025]: I1007 08:35:36.881859 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7ed584b3-29d6-4ad5-8ac1-a870c230d19f" containerName="proxy-httpd" 
containerID="cri-o://bc39ac890f0e012ff86c3bd3cfa23ad2b8107d4a289f4f434c865674707cda9a" gracePeriod=30 Oct 07 08:35:36 crc kubenswrapper[5025]: I1007 08:35:36.881905 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7ed584b3-29d6-4ad5-8ac1-a870c230d19f" containerName="sg-core" containerID="cri-o://a9e6cadb76205a18b7832eedecec50ebcc07457f9d4d05f033cf5ed68dc769c5" gracePeriod=30 Oct 07 08:35:36 crc kubenswrapper[5025]: I1007 08:35:36.881945 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7ed584b3-29d6-4ad5-8ac1-a870c230d19f" containerName="ceilometer-notification-agent" containerID="cri-o://4e1aec86e23c5a9885dadfe312cc99b621465cc033fb0fec64d13f9a397e16a8" gracePeriod=30 Oct 07 08:35:36 crc kubenswrapper[5025]: I1007 08:35:36.966377 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6db9c4d6b-sjqbc" Oct 07 08:35:36 crc kubenswrapper[5025]: I1007 08:35:36.998876 5025 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="7ed584b3-29d6-4ad5-8ac1-a870c230d19f" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.161:3000/\": read tcp 10.217.0.2:53554->10.217.0.161:3000: read: connection reset by peer" Oct 07 08:35:37 crc kubenswrapper[5025]: I1007 08:35:37.019895 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a66c793a-fadc-42c2-9a01-1b0cb017f773-logs\") pod \"a66c793a-fadc-42c2-9a01-1b0cb017f773\" (UID: \"a66c793a-fadc-42c2-9a01-1b0cb017f773\") " Oct 07 08:35:37 crc kubenswrapper[5025]: I1007 08:35:37.019981 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjr7l\" (UniqueName: \"kubernetes.io/projected/a66c793a-fadc-42c2-9a01-1b0cb017f773-kube-api-access-zjr7l\") pod 
\"a66c793a-fadc-42c2-9a01-1b0cb017f773\" (UID: \"a66c793a-fadc-42c2-9a01-1b0cb017f773\") " Oct 07 08:35:37 crc kubenswrapper[5025]: I1007 08:35:37.020218 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a66c793a-fadc-42c2-9a01-1b0cb017f773-config-data\") pod \"a66c793a-fadc-42c2-9a01-1b0cb017f773\" (UID: \"a66c793a-fadc-42c2-9a01-1b0cb017f773\") " Oct 07 08:35:37 crc kubenswrapper[5025]: I1007 08:35:37.020288 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a66c793a-fadc-42c2-9a01-1b0cb017f773-combined-ca-bundle\") pod \"a66c793a-fadc-42c2-9a01-1b0cb017f773\" (UID: \"a66c793a-fadc-42c2-9a01-1b0cb017f773\") " Oct 07 08:35:37 crc kubenswrapper[5025]: I1007 08:35:37.020312 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a66c793a-fadc-42c2-9a01-1b0cb017f773-config-data-custom\") pod \"a66c793a-fadc-42c2-9a01-1b0cb017f773\" (UID: \"a66c793a-fadc-42c2-9a01-1b0cb017f773\") " Oct 07 08:35:37 crc kubenswrapper[5025]: I1007 08:35:37.022113 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a66c793a-fadc-42c2-9a01-1b0cb017f773-logs" (OuterVolumeSpecName: "logs") pod "a66c793a-fadc-42c2-9a01-1b0cb017f773" (UID: "a66c793a-fadc-42c2-9a01-1b0cb017f773"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:35:37 crc kubenswrapper[5025]: I1007 08:35:37.028747 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a66c793a-fadc-42c2-9a01-1b0cb017f773-kube-api-access-zjr7l" (OuterVolumeSpecName: "kube-api-access-zjr7l") pod "a66c793a-fadc-42c2-9a01-1b0cb017f773" (UID: "a66c793a-fadc-42c2-9a01-1b0cb017f773"). InnerVolumeSpecName "kube-api-access-zjr7l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:35:37 crc kubenswrapper[5025]: I1007 08:35:37.038749 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a66c793a-fadc-42c2-9a01-1b0cb017f773-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a66c793a-fadc-42c2-9a01-1b0cb017f773" (UID: "a66c793a-fadc-42c2-9a01-1b0cb017f773"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:35:37 crc kubenswrapper[5025]: I1007 08:35:37.069816 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a66c793a-fadc-42c2-9a01-1b0cb017f773-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a66c793a-fadc-42c2-9a01-1b0cb017f773" (UID: "a66c793a-fadc-42c2-9a01-1b0cb017f773"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:35:37 crc kubenswrapper[5025]: I1007 08:35:37.113298 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a66c793a-fadc-42c2-9a01-1b0cb017f773-config-data" (OuterVolumeSpecName: "config-data") pod "a66c793a-fadc-42c2-9a01-1b0cb017f773" (UID: "a66c793a-fadc-42c2-9a01-1b0cb017f773"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:35:37 crc kubenswrapper[5025]: I1007 08:35:37.122937 5025 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a66c793a-fadc-42c2-9a01-1b0cb017f773-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:37 crc kubenswrapper[5025]: I1007 08:35:37.122967 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a66c793a-fadc-42c2-9a01-1b0cb017f773-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:37 crc kubenswrapper[5025]: I1007 08:35:37.122979 5025 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a66c793a-fadc-42c2-9a01-1b0cb017f773-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:37 crc kubenswrapper[5025]: I1007 08:35:37.122988 5025 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a66c793a-fadc-42c2-9a01-1b0cb017f773-logs\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:37 crc kubenswrapper[5025]: I1007 08:35:37.122996 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjr7l\" (UniqueName: \"kubernetes.io/projected/a66c793a-fadc-42c2-9a01-1b0cb017f773-kube-api-access-zjr7l\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:37 crc kubenswrapper[5025]: I1007 08:35:37.372886 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6c4bb756cf-6b9fc"] Oct 07 08:35:37 crc kubenswrapper[5025]: I1007 08:35:37.857300 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6db9c4d6b-sjqbc" Oct 07 08:35:37 crc kubenswrapper[5025]: I1007 08:35:37.857291 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6db9c4d6b-sjqbc" event={"ID":"a66c793a-fadc-42c2-9a01-1b0cb017f773","Type":"ContainerDied","Data":"118fdc114dea4a6a3eff0a1f617a934c896d4cc814847f21c34c990c8bc46506"} Oct 07 08:35:37 crc kubenswrapper[5025]: I1007 08:35:37.857755 5025 scope.go:117] "RemoveContainer" containerID="b6486aeb3f423dba47093e6e15d4101ef4367f2e384b1ea30754b55916ffe29d" Oct 07 08:35:37 crc kubenswrapper[5025]: I1007 08:35:37.860651 5025 generic.go:334] "Generic (PLEG): container finished" podID="7ed584b3-29d6-4ad5-8ac1-a870c230d19f" containerID="bc39ac890f0e012ff86c3bd3cfa23ad2b8107d4a289f4f434c865674707cda9a" exitCode=0 Oct 07 08:35:37 crc kubenswrapper[5025]: I1007 08:35:37.860682 5025 generic.go:334] "Generic (PLEG): container finished" podID="7ed584b3-29d6-4ad5-8ac1-a870c230d19f" containerID="a9e6cadb76205a18b7832eedecec50ebcc07457f9d4d05f033cf5ed68dc769c5" exitCode=2 Oct 07 08:35:37 crc kubenswrapper[5025]: I1007 08:35:37.860690 5025 generic.go:334] "Generic (PLEG): container finished" podID="7ed584b3-29d6-4ad5-8ac1-a870c230d19f" containerID="675452d23586fab1106526c4197028bb8b9d95a6d374c44a64f5fe305e6465d0" exitCode=0 Oct 07 08:35:37 crc kubenswrapper[5025]: I1007 08:35:37.860692 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ed584b3-29d6-4ad5-8ac1-a870c230d19f","Type":"ContainerDied","Data":"bc39ac890f0e012ff86c3bd3cfa23ad2b8107d4a289f4f434c865674707cda9a"} Oct 07 08:35:37 crc kubenswrapper[5025]: I1007 08:35:37.860732 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ed584b3-29d6-4ad5-8ac1-a870c230d19f","Type":"ContainerDied","Data":"a9e6cadb76205a18b7832eedecec50ebcc07457f9d4d05f033cf5ed68dc769c5"} Oct 07 08:35:37 crc kubenswrapper[5025]: I1007 08:35:37.860743 5025 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ed584b3-29d6-4ad5-8ac1-a870c230d19f","Type":"ContainerDied","Data":"675452d23586fab1106526c4197028bb8b9d95a6d374c44a64f5fe305e6465d0"} Oct 07 08:35:37 crc kubenswrapper[5025]: I1007 08:35:37.862860 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78","Type":"ContainerStarted","Data":"50b0aff959782d5056be05dce7539fc1d1b6491f125fcc23557b5dc2b17bcb0a"} Oct 07 08:35:37 crc kubenswrapper[5025]: I1007 08:35:37.865318 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 07 08:35:37 crc kubenswrapper[5025]: I1007 08:35:37.867278 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6c4bb756cf-6b9fc" event={"ID":"5a34d85b-3d52-4c22-bb89-ca22d9f45eeb","Type":"ContainerStarted","Data":"fea511675388812860a58e922460e16718d212bb001ceaba429f1ddc5d7e5559"} Oct 07 08:35:37 crc kubenswrapper[5025]: I1007 08:35:37.867306 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6c4bb756cf-6b9fc" event={"ID":"5a34d85b-3d52-4c22-bb89-ca22d9f45eeb","Type":"ContainerStarted","Data":"e424505ab43fcd0b5722f275f78973479bdce125b1c35d3c687a07c264ff792e"} Oct 07 08:35:37 crc kubenswrapper[5025]: I1007 08:35:37.867317 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6c4bb756cf-6b9fc" event={"ID":"5a34d85b-3d52-4c22-bb89-ca22d9f45eeb","Type":"ContainerStarted","Data":"4e7e371efe7968cef3dd9d8ac0627d937b6998390fd725596e1416431a9c21d9"} Oct 07 08:35:37 crc kubenswrapper[5025]: I1007 08:35:37.867865 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6c4bb756cf-6b9fc" Oct 07 08:35:37 crc kubenswrapper[5025]: I1007 08:35:37.867888 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6c4bb756cf-6b9fc" Oct 07 
08:35:37 crc kubenswrapper[5025]: I1007 08:35:37.889907 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=2.889887464 podStartE2EDuration="2.889887464s" podCreationTimestamp="2025-10-07 08:35:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:35:37.882955856 +0000 UTC m=+1144.692270000" watchObservedRunningTime="2025-10-07 08:35:37.889887464 +0000 UTC m=+1144.699201608" Oct 07 08:35:37 crc kubenswrapper[5025]: I1007 08:35:37.890468 5025 scope.go:117] "RemoveContainer" containerID="b0fbb09cea78008273b57dcb42126acb3d363f88d2a67d1525a4e5d1ee10b874" Oct 07 08:35:37 crc kubenswrapper[5025]: I1007 08:35:37.909749 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6db9c4d6b-sjqbc"] Oct 07 08:35:37 crc kubenswrapper[5025]: I1007 08:35:37.923562 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-6c4bb756cf-6b9fc" podStartSLOduration=1.923534034 podStartE2EDuration="1.923534034s" podCreationTimestamp="2025-10-07 08:35:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:35:37.919177907 +0000 UTC m=+1144.728492051" watchObservedRunningTime="2025-10-07 08:35:37.923534034 +0000 UTC m=+1144.732848178" Oct 07 08:35:37 crc kubenswrapper[5025]: I1007 08:35:37.929750 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6db9c4d6b-sjqbc"] Oct 07 08:35:38 crc kubenswrapper[5025]: I1007 08:35:38.600950 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 08:35:38 crc kubenswrapper[5025]: I1007 08:35:38.653235 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ed584b3-29d6-4ad5-8ac1-a870c230d19f-log-httpd\") pod \"7ed584b3-29d6-4ad5-8ac1-a870c230d19f\" (UID: \"7ed584b3-29d6-4ad5-8ac1-a870c230d19f\") " Oct 07 08:35:38 crc kubenswrapper[5025]: I1007 08:35:38.653320 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ed584b3-29d6-4ad5-8ac1-a870c230d19f-scripts\") pod \"7ed584b3-29d6-4ad5-8ac1-a870c230d19f\" (UID: \"7ed584b3-29d6-4ad5-8ac1-a870c230d19f\") " Oct 07 08:35:38 crc kubenswrapper[5025]: I1007 08:35:38.653416 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjjbm\" (UniqueName: \"kubernetes.io/projected/7ed584b3-29d6-4ad5-8ac1-a870c230d19f-kube-api-access-mjjbm\") pod \"7ed584b3-29d6-4ad5-8ac1-a870c230d19f\" (UID: \"7ed584b3-29d6-4ad5-8ac1-a870c230d19f\") " Oct 07 08:35:38 crc kubenswrapper[5025]: I1007 08:35:38.653472 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7ed584b3-29d6-4ad5-8ac1-a870c230d19f-sg-core-conf-yaml\") pod \"7ed584b3-29d6-4ad5-8ac1-a870c230d19f\" (UID: \"7ed584b3-29d6-4ad5-8ac1-a870c230d19f\") " Oct 07 08:35:38 crc kubenswrapper[5025]: I1007 08:35:38.653501 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ed584b3-29d6-4ad5-8ac1-a870c230d19f-combined-ca-bundle\") pod \"7ed584b3-29d6-4ad5-8ac1-a870c230d19f\" (UID: \"7ed584b3-29d6-4ad5-8ac1-a870c230d19f\") " Oct 07 08:35:38 crc kubenswrapper[5025]: I1007 08:35:38.653584 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7ed584b3-29d6-4ad5-8ac1-a870c230d19f-config-data\") pod \"7ed584b3-29d6-4ad5-8ac1-a870c230d19f\" (UID: \"7ed584b3-29d6-4ad5-8ac1-a870c230d19f\") " Oct 07 08:35:38 crc kubenswrapper[5025]: I1007 08:35:38.653709 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ed584b3-29d6-4ad5-8ac1-a870c230d19f-run-httpd\") pod \"7ed584b3-29d6-4ad5-8ac1-a870c230d19f\" (UID: \"7ed584b3-29d6-4ad5-8ac1-a870c230d19f\") " Oct 07 08:35:38 crc kubenswrapper[5025]: I1007 08:35:38.653947 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ed584b3-29d6-4ad5-8ac1-a870c230d19f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7ed584b3-29d6-4ad5-8ac1-a870c230d19f" (UID: "7ed584b3-29d6-4ad5-8ac1-a870c230d19f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:35:38 crc kubenswrapper[5025]: I1007 08:35:38.654272 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ed584b3-29d6-4ad5-8ac1-a870c230d19f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7ed584b3-29d6-4ad5-8ac1-a870c230d19f" (UID: "7ed584b3-29d6-4ad5-8ac1-a870c230d19f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:35:38 crc kubenswrapper[5025]: I1007 08:35:38.654282 5025 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ed584b3-29d6-4ad5-8ac1-a870c230d19f-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:38 crc kubenswrapper[5025]: I1007 08:35:38.659805 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ed584b3-29d6-4ad5-8ac1-a870c230d19f-scripts" (OuterVolumeSpecName: "scripts") pod "7ed584b3-29d6-4ad5-8ac1-a870c230d19f" (UID: "7ed584b3-29d6-4ad5-8ac1-a870c230d19f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:35:38 crc kubenswrapper[5025]: I1007 08:35:38.660729 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ed584b3-29d6-4ad5-8ac1-a870c230d19f-kube-api-access-mjjbm" (OuterVolumeSpecName: "kube-api-access-mjjbm") pod "7ed584b3-29d6-4ad5-8ac1-a870c230d19f" (UID: "7ed584b3-29d6-4ad5-8ac1-a870c230d19f"). InnerVolumeSpecName "kube-api-access-mjjbm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:35:38 crc kubenswrapper[5025]: I1007 08:35:38.731025 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ed584b3-29d6-4ad5-8ac1-a870c230d19f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7ed584b3-29d6-4ad5-8ac1-a870c230d19f" (UID: "7ed584b3-29d6-4ad5-8ac1-a870c230d19f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:35:38 crc kubenswrapper[5025]: I1007 08:35:38.756752 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjjbm\" (UniqueName: \"kubernetes.io/projected/7ed584b3-29d6-4ad5-8ac1-a870c230d19f-kube-api-access-mjjbm\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:38 crc kubenswrapper[5025]: I1007 08:35:38.756788 5025 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7ed584b3-29d6-4ad5-8ac1-a870c230d19f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:38 crc kubenswrapper[5025]: I1007 08:35:38.756802 5025 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ed584b3-29d6-4ad5-8ac1-a870c230d19f-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:38 crc kubenswrapper[5025]: I1007 08:35:38.756813 5025 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ed584b3-29d6-4ad5-8ac1-a870c230d19f-scripts\") on node 
\"crc\" DevicePath \"\"" Oct 07 08:35:38 crc kubenswrapper[5025]: I1007 08:35:38.821535 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ed584b3-29d6-4ad5-8ac1-a870c230d19f-config-data" (OuterVolumeSpecName: "config-data") pod "7ed584b3-29d6-4ad5-8ac1-a870c230d19f" (UID: "7ed584b3-29d6-4ad5-8ac1-a870c230d19f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:35:38 crc kubenswrapper[5025]: I1007 08:35:38.823690 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ed584b3-29d6-4ad5-8ac1-a870c230d19f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7ed584b3-29d6-4ad5-8ac1-a870c230d19f" (UID: "7ed584b3-29d6-4ad5-8ac1-a870c230d19f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:35:38 crc kubenswrapper[5025]: I1007 08:35:38.858926 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ed584b3-29d6-4ad5-8ac1-a870c230d19f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:38 crc kubenswrapper[5025]: I1007 08:35:38.858966 5025 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ed584b3-29d6-4ad5-8ac1-a870c230d19f-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:38 crc kubenswrapper[5025]: I1007 08:35:38.895410 5025 generic.go:334] "Generic (PLEG): container finished" podID="7ed584b3-29d6-4ad5-8ac1-a870c230d19f" containerID="4e1aec86e23c5a9885dadfe312cc99b621465cc033fb0fec64d13f9a397e16a8" exitCode=0 Oct 07 08:35:38 crc kubenswrapper[5025]: I1007 08:35:38.896787 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 08:35:38 crc kubenswrapper[5025]: I1007 08:35:38.897019 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ed584b3-29d6-4ad5-8ac1-a870c230d19f","Type":"ContainerDied","Data":"4e1aec86e23c5a9885dadfe312cc99b621465cc033fb0fec64d13f9a397e16a8"} Oct 07 08:35:38 crc kubenswrapper[5025]: I1007 08:35:38.897119 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ed584b3-29d6-4ad5-8ac1-a870c230d19f","Type":"ContainerDied","Data":"f7191dc992a201c6b575d8f99d2f5cfa7e61cf27cad36128f74aebc9d1215095"} Oct 07 08:35:38 crc kubenswrapper[5025]: I1007 08:35:38.897142 5025 scope.go:117] "RemoveContainer" containerID="bc39ac890f0e012ff86c3bd3cfa23ad2b8107d4a289f4f434c865674707cda9a" Oct 07 08:35:38 crc kubenswrapper[5025]: I1007 08:35:38.954335 5025 scope.go:117] "RemoveContainer" containerID="a9e6cadb76205a18b7832eedecec50ebcc07457f9d4d05f033cf5ed68dc769c5" Oct 07 08:35:39 crc kubenswrapper[5025]: I1007 08:35:39.003674 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 08:35:39 crc kubenswrapper[5025]: I1007 08:35:39.019887 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 07 08:35:39 crc kubenswrapper[5025]: I1007 08:35:39.034974 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 07 08:35:39 crc kubenswrapper[5025]: E1007 08:35:39.035496 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ed584b3-29d6-4ad5-8ac1-a870c230d19f" containerName="sg-core" Oct 07 08:35:39 crc kubenswrapper[5025]: I1007 08:35:39.035519 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ed584b3-29d6-4ad5-8ac1-a870c230d19f" containerName="sg-core" Oct 07 08:35:39 crc kubenswrapper[5025]: E1007 08:35:39.065858 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ed584b3-29d6-4ad5-8ac1-a870c230d19f" 
containerName="proxy-httpd" Oct 07 08:35:39 crc kubenswrapper[5025]: I1007 08:35:39.065913 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ed584b3-29d6-4ad5-8ac1-a870c230d19f" containerName="proxy-httpd" Oct 07 08:35:39 crc kubenswrapper[5025]: E1007 08:35:39.065947 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a66c793a-fadc-42c2-9a01-1b0cb017f773" containerName="barbican-api-log" Oct 07 08:35:39 crc kubenswrapper[5025]: I1007 08:35:39.065955 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="a66c793a-fadc-42c2-9a01-1b0cb017f773" containerName="barbican-api-log" Oct 07 08:35:39 crc kubenswrapper[5025]: E1007 08:35:39.066001 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ed584b3-29d6-4ad5-8ac1-a870c230d19f" containerName="ceilometer-central-agent" Oct 07 08:35:39 crc kubenswrapper[5025]: I1007 08:35:39.066010 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ed584b3-29d6-4ad5-8ac1-a870c230d19f" containerName="ceilometer-central-agent" Oct 07 08:35:39 crc kubenswrapper[5025]: E1007 08:35:39.066027 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ed584b3-29d6-4ad5-8ac1-a870c230d19f" containerName="ceilometer-notification-agent" Oct 07 08:35:39 crc kubenswrapper[5025]: I1007 08:35:39.066034 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ed584b3-29d6-4ad5-8ac1-a870c230d19f" containerName="ceilometer-notification-agent" Oct 07 08:35:39 crc kubenswrapper[5025]: E1007 08:35:39.066051 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a66c793a-fadc-42c2-9a01-1b0cb017f773" containerName="barbican-api" Oct 07 08:35:39 crc kubenswrapper[5025]: I1007 08:35:39.066059 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="a66c793a-fadc-42c2-9a01-1b0cb017f773" containerName="barbican-api" Oct 07 08:35:39 crc kubenswrapper[5025]: I1007 08:35:39.066429 5025 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="7ed584b3-29d6-4ad5-8ac1-a870c230d19f" containerName="ceilometer-notification-agent" Oct 07 08:35:39 crc kubenswrapper[5025]: I1007 08:35:39.066472 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ed584b3-29d6-4ad5-8ac1-a870c230d19f" containerName="ceilometer-central-agent" Oct 07 08:35:39 crc kubenswrapper[5025]: I1007 08:35:39.066485 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ed584b3-29d6-4ad5-8ac1-a870c230d19f" containerName="sg-core" Oct 07 08:35:39 crc kubenswrapper[5025]: I1007 08:35:39.066509 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="a66c793a-fadc-42c2-9a01-1b0cb017f773" containerName="barbican-api-log" Oct 07 08:35:39 crc kubenswrapper[5025]: I1007 08:35:39.066520 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="a66c793a-fadc-42c2-9a01-1b0cb017f773" containerName="barbican-api" Oct 07 08:35:39 crc kubenswrapper[5025]: I1007 08:35:39.066533 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ed584b3-29d6-4ad5-8ac1-a870c230d19f" containerName="proxy-httpd" Oct 07 08:35:39 crc kubenswrapper[5025]: I1007 08:35:39.069079 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 08:35:39 crc kubenswrapper[5025]: I1007 08:35:39.069202 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 08:35:39 crc kubenswrapper[5025]: I1007 08:35:39.071917 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 07 08:35:39 crc kubenswrapper[5025]: I1007 08:35:39.086260 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 07 08:35:39 crc kubenswrapper[5025]: I1007 08:35:39.111287 5025 scope.go:117] "RemoveContainer" containerID="4e1aec86e23c5a9885dadfe312cc99b621465cc033fb0fec64d13f9a397e16a8" Oct 07 08:35:39 crc kubenswrapper[5025]: I1007 08:35:39.139185 5025 scope.go:117] "RemoveContainer" containerID="675452d23586fab1106526c4197028bb8b9d95a6d374c44a64f5fe305e6465d0" Oct 07 08:35:39 crc kubenswrapper[5025]: I1007 08:35:39.161073 5025 scope.go:117] "RemoveContainer" containerID="bc39ac890f0e012ff86c3bd3cfa23ad2b8107d4a289f4f434c865674707cda9a" Oct 07 08:35:39 crc kubenswrapper[5025]: E1007 08:35:39.161618 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc39ac890f0e012ff86c3bd3cfa23ad2b8107d4a289f4f434c865674707cda9a\": container with ID starting with bc39ac890f0e012ff86c3bd3cfa23ad2b8107d4a289f4f434c865674707cda9a not found: ID does not exist" containerID="bc39ac890f0e012ff86c3bd3cfa23ad2b8107d4a289f4f434c865674707cda9a" Oct 07 08:35:39 crc kubenswrapper[5025]: I1007 08:35:39.161658 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc39ac890f0e012ff86c3bd3cfa23ad2b8107d4a289f4f434c865674707cda9a"} err="failed to get container status \"bc39ac890f0e012ff86c3bd3cfa23ad2b8107d4a289f4f434c865674707cda9a\": rpc error: code = NotFound desc = could not find container \"bc39ac890f0e012ff86c3bd3cfa23ad2b8107d4a289f4f434c865674707cda9a\": container with ID starting with bc39ac890f0e012ff86c3bd3cfa23ad2b8107d4a289f4f434c865674707cda9a not found: ID does not exist" Oct 07 08:35:39 crc 
kubenswrapper[5025]: I1007 08:35:39.161686 5025 scope.go:117] "RemoveContainer" containerID="a9e6cadb76205a18b7832eedecec50ebcc07457f9d4d05f033cf5ed68dc769c5" Oct 07 08:35:39 crc kubenswrapper[5025]: E1007 08:35:39.163397 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9e6cadb76205a18b7832eedecec50ebcc07457f9d4d05f033cf5ed68dc769c5\": container with ID starting with a9e6cadb76205a18b7832eedecec50ebcc07457f9d4d05f033cf5ed68dc769c5 not found: ID does not exist" containerID="a9e6cadb76205a18b7832eedecec50ebcc07457f9d4d05f033cf5ed68dc769c5" Oct 07 08:35:39 crc kubenswrapper[5025]: I1007 08:35:39.163468 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9e6cadb76205a18b7832eedecec50ebcc07457f9d4d05f033cf5ed68dc769c5"} err="failed to get container status \"a9e6cadb76205a18b7832eedecec50ebcc07457f9d4d05f033cf5ed68dc769c5\": rpc error: code = NotFound desc = could not find container \"a9e6cadb76205a18b7832eedecec50ebcc07457f9d4d05f033cf5ed68dc769c5\": container with ID starting with a9e6cadb76205a18b7832eedecec50ebcc07457f9d4d05f033cf5ed68dc769c5 not found: ID does not exist" Oct 07 08:35:39 crc kubenswrapper[5025]: I1007 08:35:39.163503 5025 scope.go:117] "RemoveContainer" containerID="4e1aec86e23c5a9885dadfe312cc99b621465cc033fb0fec64d13f9a397e16a8" Oct 07 08:35:39 crc kubenswrapper[5025]: E1007 08:35:39.166850 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e1aec86e23c5a9885dadfe312cc99b621465cc033fb0fec64d13f9a397e16a8\": container with ID starting with 4e1aec86e23c5a9885dadfe312cc99b621465cc033fb0fec64d13f9a397e16a8 not found: ID does not exist" containerID="4e1aec86e23c5a9885dadfe312cc99b621465cc033fb0fec64d13f9a397e16a8" Oct 07 08:35:39 crc kubenswrapper[5025]: I1007 08:35:39.166883 5025 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4e1aec86e23c5a9885dadfe312cc99b621465cc033fb0fec64d13f9a397e16a8"} err="failed to get container status \"4e1aec86e23c5a9885dadfe312cc99b621465cc033fb0fec64d13f9a397e16a8\": rpc error: code = NotFound desc = could not find container \"4e1aec86e23c5a9885dadfe312cc99b621465cc033fb0fec64d13f9a397e16a8\": container with ID starting with 4e1aec86e23c5a9885dadfe312cc99b621465cc033fb0fec64d13f9a397e16a8 not found: ID does not exist" Oct 07 08:35:39 crc kubenswrapper[5025]: I1007 08:35:39.166902 5025 scope.go:117] "RemoveContainer" containerID="675452d23586fab1106526c4197028bb8b9d95a6d374c44a64f5fe305e6465d0" Oct 07 08:35:39 crc kubenswrapper[5025]: E1007 08:35:39.167321 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"675452d23586fab1106526c4197028bb8b9d95a6d374c44a64f5fe305e6465d0\": container with ID starting with 675452d23586fab1106526c4197028bb8b9d95a6d374c44a64f5fe305e6465d0 not found: ID does not exist" containerID="675452d23586fab1106526c4197028bb8b9d95a6d374c44a64f5fe305e6465d0" Oct 07 08:35:39 crc kubenswrapper[5025]: I1007 08:35:39.167396 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"675452d23586fab1106526c4197028bb8b9d95a6d374c44a64f5fe305e6465d0"} err="failed to get container status \"675452d23586fab1106526c4197028bb8b9d95a6d374c44a64f5fe305e6465d0\": rpc error: code = NotFound desc = could not find container \"675452d23586fab1106526c4197028bb8b9d95a6d374c44a64f5fe305e6465d0\": container with ID starting with 675452d23586fab1106526c4197028bb8b9d95a6d374c44a64f5fe305e6465d0 not found: ID does not exist" Oct 07 08:35:39 crc kubenswrapper[5025]: I1007 08:35:39.170781 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2227c9b-c473-4ec5-8a72-4b7ac8565343-combined-ca-bundle\") pod 
\"ceilometer-0\" (UID: \"d2227c9b-c473-4ec5-8a72-4b7ac8565343\") " pod="openstack/ceilometer-0" Oct 07 08:35:39 crc kubenswrapper[5025]: I1007 08:35:39.170891 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2227c9b-c473-4ec5-8a72-4b7ac8565343-scripts\") pod \"ceilometer-0\" (UID: \"d2227c9b-c473-4ec5-8a72-4b7ac8565343\") " pod="openstack/ceilometer-0" Oct 07 08:35:39 crc kubenswrapper[5025]: I1007 08:35:39.170962 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2227c9b-c473-4ec5-8a72-4b7ac8565343-config-data\") pod \"ceilometer-0\" (UID: \"d2227c9b-c473-4ec5-8a72-4b7ac8565343\") " pod="openstack/ceilometer-0" Oct 07 08:35:39 crc kubenswrapper[5025]: I1007 08:35:39.171000 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2227c9b-c473-4ec5-8a72-4b7ac8565343-log-httpd\") pod \"ceilometer-0\" (UID: \"d2227c9b-c473-4ec5-8a72-4b7ac8565343\") " pod="openstack/ceilometer-0" Oct 07 08:35:39 crc kubenswrapper[5025]: I1007 08:35:39.171028 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9x4g\" (UniqueName: \"kubernetes.io/projected/d2227c9b-c473-4ec5-8a72-4b7ac8565343-kube-api-access-c9x4g\") pod \"ceilometer-0\" (UID: \"d2227c9b-c473-4ec5-8a72-4b7ac8565343\") " pod="openstack/ceilometer-0" Oct 07 08:35:39 crc kubenswrapper[5025]: I1007 08:35:39.171053 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2227c9b-c473-4ec5-8a72-4b7ac8565343-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d2227c9b-c473-4ec5-8a72-4b7ac8565343\") " pod="openstack/ceilometer-0" Oct 07 08:35:39 crc kubenswrapper[5025]: 
I1007 08:35:39.171092 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2227c9b-c473-4ec5-8a72-4b7ac8565343-run-httpd\") pod \"ceilometer-0\" (UID: \"d2227c9b-c473-4ec5-8a72-4b7ac8565343\") " pod="openstack/ceilometer-0" Oct 07 08:35:39 crc kubenswrapper[5025]: I1007 08:35:39.272913 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2227c9b-c473-4ec5-8a72-4b7ac8565343-log-httpd\") pod \"ceilometer-0\" (UID: \"d2227c9b-c473-4ec5-8a72-4b7ac8565343\") " pod="openstack/ceilometer-0" Oct 07 08:35:39 crc kubenswrapper[5025]: I1007 08:35:39.272963 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9x4g\" (UniqueName: \"kubernetes.io/projected/d2227c9b-c473-4ec5-8a72-4b7ac8565343-kube-api-access-c9x4g\") pod \"ceilometer-0\" (UID: \"d2227c9b-c473-4ec5-8a72-4b7ac8565343\") " pod="openstack/ceilometer-0" Oct 07 08:35:39 crc kubenswrapper[5025]: I1007 08:35:39.272987 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2227c9b-c473-4ec5-8a72-4b7ac8565343-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d2227c9b-c473-4ec5-8a72-4b7ac8565343\") " pod="openstack/ceilometer-0" Oct 07 08:35:39 crc kubenswrapper[5025]: I1007 08:35:39.273022 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2227c9b-c473-4ec5-8a72-4b7ac8565343-run-httpd\") pod \"ceilometer-0\" (UID: \"d2227c9b-c473-4ec5-8a72-4b7ac8565343\") " pod="openstack/ceilometer-0" Oct 07 08:35:39 crc kubenswrapper[5025]: I1007 08:35:39.273162 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d2227c9b-c473-4ec5-8a72-4b7ac8565343-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d2227c9b-c473-4ec5-8a72-4b7ac8565343\") " pod="openstack/ceilometer-0" Oct 07 08:35:39 crc kubenswrapper[5025]: I1007 08:35:39.273225 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2227c9b-c473-4ec5-8a72-4b7ac8565343-scripts\") pod \"ceilometer-0\" (UID: \"d2227c9b-c473-4ec5-8a72-4b7ac8565343\") " pod="openstack/ceilometer-0" Oct 07 08:35:39 crc kubenswrapper[5025]: I1007 08:35:39.273467 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2227c9b-c473-4ec5-8a72-4b7ac8565343-log-httpd\") pod \"ceilometer-0\" (UID: \"d2227c9b-c473-4ec5-8a72-4b7ac8565343\") " pod="openstack/ceilometer-0" Oct 07 08:35:39 crc kubenswrapper[5025]: I1007 08:35:39.273560 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2227c9b-c473-4ec5-8a72-4b7ac8565343-run-httpd\") pod \"ceilometer-0\" (UID: \"d2227c9b-c473-4ec5-8a72-4b7ac8565343\") " pod="openstack/ceilometer-0" Oct 07 08:35:39 crc kubenswrapper[5025]: I1007 08:35:39.274012 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2227c9b-c473-4ec5-8a72-4b7ac8565343-config-data\") pod \"ceilometer-0\" (UID: \"d2227c9b-c473-4ec5-8a72-4b7ac8565343\") " pod="openstack/ceilometer-0" Oct 07 08:35:39 crc kubenswrapper[5025]: I1007 08:35:39.277814 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2227c9b-c473-4ec5-8a72-4b7ac8565343-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d2227c9b-c473-4ec5-8a72-4b7ac8565343\") " pod="openstack/ceilometer-0" Oct 07 08:35:39 crc kubenswrapper[5025]: I1007 08:35:39.277920 5025 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2227c9b-c473-4ec5-8a72-4b7ac8565343-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d2227c9b-c473-4ec5-8a72-4b7ac8565343\") " pod="openstack/ceilometer-0" Oct 07 08:35:39 crc kubenswrapper[5025]: I1007 08:35:39.278490 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2227c9b-c473-4ec5-8a72-4b7ac8565343-scripts\") pod \"ceilometer-0\" (UID: \"d2227c9b-c473-4ec5-8a72-4b7ac8565343\") " pod="openstack/ceilometer-0" Oct 07 08:35:39 crc kubenswrapper[5025]: I1007 08:35:39.279903 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2227c9b-c473-4ec5-8a72-4b7ac8565343-config-data\") pod \"ceilometer-0\" (UID: \"d2227c9b-c473-4ec5-8a72-4b7ac8565343\") " pod="openstack/ceilometer-0" Oct 07 08:35:39 crc kubenswrapper[5025]: I1007 08:35:39.289648 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9x4g\" (UniqueName: \"kubernetes.io/projected/d2227c9b-c473-4ec5-8a72-4b7ac8565343-kube-api-access-c9x4g\") pod \"ceilometer-0\" (UID: \"d2227c9b-c473-4ec5-8a72-4b7ac8565343\") " pod="openstack/ceilometer-0" Oct 07 08:35:39 crc kubenswrapper[5025]: I1007 08:35:39.401140 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 08:35:39 crc kubenswrapper[5025]: I1007 08:35:39.664559 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 08:35:39 crc kubenswrapper[5025]: I1007 08:35:39.927952 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ed584b3-29d6-4ad5-8ac1-a870c230d19f" path="/var/lib/kubelet/pods/7ed584b3-29d6-4ad5-8ac1-a870c230d19f/volumes" Oct 07 08:35:39 crc kubenswrapper[5025]: I1007 08:35:39.928705 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a66c793a-fadc-42c2-9a01-1b0cb017f773" path="/var/lib/kubelet/pods/a66c793a-fadc-42c2-9a01-1b0cb017f773/volumes" Oct 07 08:35:40 crc kubenswrapper[5025]: I1007 08:35:40.259874 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bb4fc677f-s92q7" Oct 07 08:35:40 crc kubenswrapper[5025]: I1007 08:35:40.323044 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-vb7rc"] Oct 07 08:35:40 crc kubenswrapper[5025]: I1007 08:35:40.323266 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-688c87cc99-vb7rc" podUID="fe96ba85-9487-44de-87f3-e202c1a30fa6" containerName="dnsmasq-dns" containerID="cri-o://90e3d502a44efb5858a41705ed907f08bca10191ff4aaedc314d9f26c985b201" gracePeriod=10 Oct 07 08:35:40 crc kubenswrapper[5025]: I1007 08:35:40.355669 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 07 08:35:40 crc kubenswrapper[5025]: I1007 08:35:40.395118 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 08:35:40 crc kubenswrapper[5025]: I1007 08:35:40.919125 5025 generic.go:334] "Generic (PLEG): container finished" podID="fe96ba85-9487-44de-87f3-e202c1a30fa6" containerID="90e3d502a44efb5858a41705ed907f08bca10191ff4aaedc314d9f26c985b201" exitCode=0 Oct 07 
08:35:40 crc kubenswrapper[5025]: I1007 08:35:40.919189 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-vb7rc" event={"ID":"fe96ba85-9487-44de-87f3-e202c1a30fa6","Type":"ContainerDied","Data":"90e3d502a44efb5858a41705ed907f08bca10191ff4aaedc314d9f26c985b201"} Oct 07 08:35:40 crc kubenswrapper[5025]: I1007 08:35:40.919441 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="92870d9c-dcf8-45ad-af2b-9e827d319b8d" containerName="cinder-scheduler" containerID="cri-o://d46efd3f4e0ed3fe4fc4e27ff2c2fa3b7d03ca894a418941fdca5bfe63161f31" gracePeriod=30 Oct 07 08:35:40 crc kubenswrapper[5025]: I1007 08:35:40.919533 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="92870d9c-dcf8-45ad-af2b-9e827d319b8d" containerName="probe" containerID="cri-o://71b89e941333f74d8472e6dcbb081fd3f0a629b202d131e64c477de5cc046c30" gracePeriod=30 Oct 07 08:35:41 crc kubenswrapper[5025]: I1007 08:35:41.685420 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6b6897fb44-5d68h" Oct 07 08:35:41 crc kubenswrapper[5025]: I1007 08:35:41.932874 5025 generic.go:334] "Generic (PLEG): container finished" podID="92870d9c-dcf8-45ad-af2b-9e827d319b8d" containerID="71b89e941333f74d8472e6dcbb081fd3f0a629b202d131e64c477de5cc046c30" exitCode=0 Oct 07 08:35:41 crc kubenswrapper[5025]: I1007 08:35:41.932942 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"92870d9c-dcf8-45ad-af2b-9e827d319b8d","Type":"ContainerDied","Data":"71b89e941333f74d8472e6dcbb081fd3f0a629b202d131e64c477de5cc046c30"} Oct 07 08:35:42 crc kubenswrapper[5025]: I1007 08:35:42.084489 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-dbbfh"] Oct 07 08:35:42 crc kubenswrapper[5025]: I1007 08:35:42.086094 5025 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/nova-api-db-create-dbbfh" Oct 07 08:35:42 crc kubenswrapper[5025]: I1007 08:35:42.098770 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-dbbfh"] Oct 07 08:35:42 crc kubenswrapper[5025]: I1007 08:35:42.124445 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgs4w\" (UniqueName: \"kubernetes.io/projected/fffe9def-a18d-4a98-bb4c-f47faeb2fae8-kube-api-access-qgs4w\") pod \"nova-api-db-create-dbbfh\" (UID: \"fffe9def-a18d-4a98-bb4c-f47faeb2fae8\") " pod="openstack/nova-api-db-create-dbbfh" Oct 07 08:35:42 crc kubenswrapper[5025]: I1007 08:35:42.156857 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-w5ggj"] Oct 07 08:35:42 crc kubenswrapper[5025]: I1007 08:35:42.157986 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-w5ggj" Oct 07 08:35:42 crc kubenswrapper[5025]: I1007 08:35:42.166012 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-w5ggj"] Oct 07 08:35:42 crc kubenswrapper[5025]: I1007 08:35:42.226461 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgs4w\" (UniqueName: \"kubernetes.io/projected/fffe9def-a18d-4a98-bb4c-f47faeb2fae8-kube-api-access-qgs4w\") pod \"nova-api-db-create-dbbfh\" (UID: \"fffe9def-a18d-4a98-bb4c-f47faeb2fae8\") " pod="openstack/nova-api-db-create-dbbfh" Oct 07 08:35:42 crc kubenswrapper[5025]: I1007 08:35:42.226571 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfzvc\" (UniqueName: \"kubernetes.io/projected/a72ad49a-646a-40cf-a5bb-7bdecdf2153c-kube-api-access-qfzvc\") pod \"nova-cell0-db-create-w5ggj\" (UID: \"a72ad49a-646a-40cf-a5bb-7bdecdf2153c\") " pod="openstack/nova-cell0-db-create-w5ggj" Oct 07 08:35:42 crc 
kubenswrapper[5025]: I1007 08:35:42.245920 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgs4w\" (UniqueName: \"kubernetes.io/projected/fffe9def-a18d-4a98-bb4c-f47faeb2fae8-kube-api-access-qgs4w\") pod \"nova-api-db-create-dbbfh\" (UID: \"fffe9def-a18d-4a98-bb4c-f47faeb2fae8\") " pod="openstack/nova-api-db-create-dbbfh" Oct 07 08:35:42 crc kubenswrapper[5025]: I1007 08:35:42.262368 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-pc6np"] Oct 07 08:35:42 crc kubenswrapper[5025]: I1007 08:35:42.263635 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-pc6np" Oct 07 08:35:42 crc kubenswrapper[5025]: I1007 08:35:42.276502 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-pc6np"] Oct 07 08:35:42 crc kubenswrapper[5025]: I1007 08:35:42.318779 5025 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-688c87cc99-vb7rc" podUID="fe96ba85-9487-44de-87f3-e202c1a30fa6" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.158:5353: connect: connection refused" Oct 07 08:35:42 crc kubenswrapper[5025]: I1007 08:35:42.328302 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p77nx\" (UniqueName: \"kubernetes.io/projected/bf250487-bcbd-4fc3-9d87-8e813ee7b39a-kube-api-access-p77nx\") pod \"nova-cell1-db-create-pc6np\" (UID: \"bf250487-bcbd-4fc3-9d87-8e813ee7b39a\") " pod="openstack/nova-cell1-db-create-pc6np" Oct 07 08:35:42 crc kubenswrapper[5025]: I1007 08:35:42.328385 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfzvc\" (UniqueName: \"kubernetes.io/projected/a72ad49a-646a-40cf-a5bb-7bdecdf2153c-kube-api-access-qfzvc\") pod \"nova-cell0-db-create-w5ggj\" (UID: \"a72ad49a-646a-40cf-a5bb-7bdecdf2153c\") " 
pod="openstack/nova-cell0-db-create-w5ggj" Oct 07 08:35:42 crc kubenswrapper[5025]: I1007 08:35:42.354166 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfzvc\" (UniqueName: \"kubernetes.io/projected/a72ad49a-646a-40cf-a5bb-7bdecdf2153c-kube-api-access-qfzvc\") pod \"nova-cell0-db-create-w5ggj\" (UID: \"a72ad49a-646a-40cf-a5bb-7bdecdf2153c\") " pod="openstack/nova-cell0-db-create-w5ggj" Oct 07 08:35:42 crc kubenswrapper[5025]: I1007 08:35:42.414559 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-dbbfh" Oct 07 08:35:42 crc kubenswrapper[5025]: I1007 08:35:42.430036 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p77nx\" (UniqueName: \"kubernetes.io/projected/bf250487-bcbd-4fc3-9d87-8e813ee7b39a-kube-api-access-p77nx\") pod \"nova-cell1-db-create-pc6np\" (UID: \"bf250487-bcbd-4fc3-9d87-8e813ee7b39a\") " pod="openstack/nova-cell1-db-create-pc6np" Oct 07 08:35:42 crc kubenswrapper[5025]: I1007 08:35:42.448568 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p77nx\" (UniqueName: \"kubernetes.io/projected/bf250487-bcbd-4fc3-9d87-8e813ee7b39a-kube-api-access-p77nx\") pod \"nova-cell1-db-create-pc6np\" (UID: \"bf250487-bcbd-4fc3-9d87-8e813ee7b39a\") " pod="openstack/nova-cell1-db-create-pc6np" Oct 07 08:35:42 crc kubenswrapper[5025]: I1007 08:35:42.478764 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-w5ggj" Oct 07 08:35:42 crc kubenswrapper[5025]: I1007 08:35:42.608931 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-pc6np" Oct 07 08:35:43 crc kubenswrapper[5025]: I1007 08:35:43.664902 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5899768569-tz2vj" Oct 07 08:35:43 crc kubenswrapper[5025]: I1007 08:35:43.737754 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6b6897fb44-5d68h"] Oct 07 08:35:43 crc kubenswrapper[5025]: I1007 08:35:43.737992 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6b6897fb44-5d68h" podUID="c0b57ea4-f1b6-47f3-adef-97c26b470477" containerName="neutron-api" containerID="cri-o://708768a2c94adfa8c4edde2831f01fcbd06ad80d55be39ff0b7183840ed0d8a8" gracePeriod=30 Oct 07 08:35:43 crc kubenswrapper[5025]: I1007 08:35:43.738409 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6b6897fb44-5d68h" podUID="c0b57ea4-f1b6-47f3-adef-97c26b470477" containerName="neutron-httpd" containerID="cri-o://51088a15c259b4db87cd12b21a346dd16dc9bec3d5bf784b38768e7c561043ba" gracePeriod=30 Oct 07 08:35:43 crc kubenswrapper[5025]: I1007 08:35:43.952507 5025 generic.go:334] "Generic (PLEG): container finished" podID="c0b57ea4-f1b6-47f3-adef-97c26b470477" containerID="51088a15c259b4db87cd12b21a346dd16dc9bec3d5bf784b38768e7c561043ba" exitCode=0 Oct 07 08:35:43 crc kubenswrapper[5025]: I1007 08:35:43.952562 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b6897fb44-5d68h" event={"ID":"c0b57ea4-f1b6-47f3-adef-97c26b470477","Type":"ContainerDied","Data":"51088a15c259b4db87cd12b21a346dd16dc9bec3d5bf784b38768e7c561043ba"} Oct 07 08:35:44 crc kubenswrapper[5025]: I1007 08:35:44.963802 5025 generic.go:334] "Generic (PLEG): container finished" podID="92870d9c-dcf8-45ad-af2b-9e827d319b8d" containerID="d46efd3f4e0ed3fe4fc4e27ff2c2fa3b7d03ca894a418941fdca5bfe63161f31" exitCode=0 Oct 07 08:35:44 crc kubenswrapper[5025]: I1007 
08:35:44.963876 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"92870d9c-dcf8-45ad-af2b-9e827d319b8d","Type":"ContainerDied","Data":"d46efd3f4e0ed3fe4fc4e27ff2c2fa3b7d03ca894a418941fdca5bfe63161f31"} Oct 07 08:35:45 crc kubenswrapper[5025]: I1007 08:35:45.700523 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-vb7rc" Oct 07 08:35:45 crc kubenswrapper[5025]: I1007 08:35:45.809608 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 07 08:35:45 crc kubenswrapper[5025]: I1007 08:35:45.833656 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe96ba85-9487-44de-87f3-e202c1a30fa6-ovsdbserver-sb\") pod \"fe96ba85-9487-44de-87f3-e202c1a30fa6\" (UID: \"fe96ba85-9487-44de-87f3-e202c1a30fa6\") " Oct 07 08:35:45 crc kubenswrapper[5025]: I1007 08:35:45.834070 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fe96ba85-9487-44de-87f3-e202c1a30fa6-dns-swift-storage-0\") pod \"fe96ba85-9487-44de-87f3-e202c1a30fa6\" (UID: \"fe96ba85-9487-44de-87f3-e202c1a30fa6\") " Oct 07 08:35:45 crc kubenswrapper[5025]: I1007 08:35:45.834101 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe96ba85-9487-44de-87f3-e202c1a30fa6-dns-svc\") pod \"fe96ba85-9487-44de-87f3-e202c1a30fa6\" (UID: \"fe96ba85-9487-44de-87f3-e202c1a30fa6\") " Oct 07 08:35:45 crc kubenswrapper[5025]: I1007 08:35:45.834181 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe96ba85-9487-44de-87f3-e202c1a30fa6-ovsdbserver-nb\") pod \"fe96ba85-9487-44de-87f3-e202c1a30fa6\" (UID: 
\"fe96ba85-9487-44de-87f3-e202c1a30fa6\") " Oct 07 08:35:45 crc kubenswrapper[5025]: I1007 08:35:45.834276 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgh8h\" (UniqueName: \"kubernetes.io/projected/fe96ba85-9487-44de-87f3-e202c1a30fa6-kube-api-access-qgh8h\") pod \"fe96ba85-9487-44de-87f3-e202c1a30fa6\" (UID: \"fe96ba85-9487-44de-87f3-e202c1a30fa6\") " Oct 07 08:35:45 crc kubenswrapper[5025]: I1007 08:35:45.834339 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe96ba85-9487-44de-87f3-e202c1a30fa6-config\") pod \"fe96ba85-9487-44de-87f3-e202c1a30fa6\" (UID: \"fe96ba85-9487-44de-87f3-e202c1a30fa6\") " Oct 07 08:35:45 crc kubenswrapper[5025]: I1007 08:35:45.846557 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe96ba85-9487-44de-87f3-e202c1a30fa6-kube-api-access-qgh8h" (OuterVolumeSpecName: "kube-api-access-qgh8h") pod "fe96ba85-9487-44de-87f3-e202c1a30fa6" (UID: "fe96ba85-9487-44de-87f3-e202c1a30fa6"). InnerVolumeSpecName "kube-api-access-qgh8h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:35:45 crc kubenswrapper[5025]: I1007 08:35:45.909254 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe96ba85-9487-44de-87f3-e202c1a30fa6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fe96ba85-9487-44de-87f3-e202c1a30fa6" (UID: "fe96ba85-9487-44de-87f3-e202c1a30fa6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:35:45 crc kubenswrapper[5025]: I1007 08:35:45.924721 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe96ba85-9487-44de-87f3-e202c1a30fa6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fe96ba85-9487-44de-87f3-e202c1a30fa6" (UID: "fe96ba85-9487-44de-87f3-e202c1a30fa6"). 
InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:35:45 crc kubenswrapper[5025]: I1007 08:35:45.935439 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/92870d9c-dcf8-45ad-af2b-9e827d319b8d-etc-machine-id\") pod \"92870d9c-dcf8-45ad-af2b-9e827d319b8d\" (UID: \"92870d9c-dcf8-45ad-af2b-9e827d319b8d\") " Oct 07 08:35:45 crc kubenswrapper[5025]: I1007 08:35:45.935519 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/92870d9c-dcf8-45ad-af2b-9e827d319b8d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "92870d9c-dcf8-45ad-af2b-9e827d319b8d" (UID: "92870d9c-dcf8-45ad-af2b-9e827d319b8d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 08:35:45 crc kubenswrapper[5025]: I1007 08:35:45.935544 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92870d9c-dcf8-45ad-af2b-9e827d319b8d-config-data\") pod \"92870d9c-dcf8-45ad-af2b-9e827d319b8d\" (UID: \"92870d9c-dcf8-45ad-af2b-9e827d319b8d\") " Oct 07 08:35:45 crc kubenswrapper[5025]: I1007 08:35:45.935652 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-288p9\" (UniqueName: \"kubernetes.io/projected/92870d9c-dcf8-45ad-af2b-9e827d319b8d-kube-api-access-288p9\") pod \"92870d9c-dcf8-45ad-af2b-9e827d319b8d\" (UID: \"92870d9c-dcf8-45ad-af2b-9e827d319b8d\") " Oct 07 08:35:45 crc kubenswrapper[5025]: I1007 08:35:45.935680 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92870d9c-dcf8-45ad-af2b-9e827d319b8d-combined-ca-bundle\") pod \"92870d9c-dcf8-45ad-af2b-9e827d319b8d\" (UID: \"92870d9c-dcf8-45ad-af2b-9e827d319b8d\") " Oct 07 08:35:45 crc 
kubenswrapper[5025]: I1007 08:35:45.935725 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92870d9c-dcf8-45ad-af2b-9e827d319b8d-config-data-custom\") pod \"92870d9c-dcf8-45ad-af2b-9e827d319b8d\" (UID: \"92870d9c-dcf8-45ad-af2b-9e827d319b8d\") " Oct 07 08:35:45 crc kubenswrapper[5025]: I1007 08:35:45.935895 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92870d9c-dcf8-45ad-af2b-9e827d319b8d-scripts\") pod \"92870d9c-dcf8-45ad-af2b-9e827d319b8d\" (UID: \"92870d9c-dcf8-45ad-af2b-9e827d319b8d\") " Oct 07 08:35:45 crc kubenswrapper[5025]: I1007 08:35:45.936622 5025 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fe96ba85-9487-44de-87f3-e202c1a30fa6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:45 crc kubenswrapper[5025]: I1007 08:35:45.936641 5025 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe96ba85-9487-44de-87f3-e202c1a30fa6-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:45 crc kubenswrapper[5025]: I1007 08:35:45.936651 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgh8h\" (UniqueName: \"kubernetes.io/projected/fe96ba85-9487-44de-87f3-e202c1a30fa6-kube-api-access-qgh8h\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:45 crc kubenswrapper[5025]: I1007 08:35:45.936661 5025 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/92870d9c-dcf8-45ad-af2b-9e827d319b8d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:45 crc kubenswrapper[5025]: I1007 08:35:45.941696 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe96ba85-9487-44de-87f3-e202c1a30fa6-config" (OuterVolumeSpecName: "config") 
pod "fe96ba85-9487-44de-87f3-e202c1a30fa6" (UID: "fe96ba85-9487-44de-87f3-e202c1a30fa6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:35:45 crc kubenswrapper[5025]: I1007 08:35:45.943410 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92870d9c-dcf8-45ad-af2b-9e827d319b8d-scripts" (OuterVolumeSpecName: "scripts") pod "92870d9c-dcf8-45ad-af2b-9e827d319b8d" (UID: "92870d9c-dcf8-45ad-af2b-9e827d319b8d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:35:45 crc kubenswrapper[5025]: I1007 08:35:45.958642 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92870d9c-dcf8-45ad-af2b-9e827d319b8d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "92870d9c-dcf8-45ad-af2b-9e827d319b8d" (UID: "92870d9c-dcf8-45ad-af2b-9e827d319b8d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:35:45 crc kubenswrapper[5025]: I1007 08:35:45.966233 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe96ba85-9487-44de-87f3-e202c1a30fa6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fe96ba85-9487-44de-87f3-e202c1a30fa6" (UID: "fe96ba85-9487-44de-87f3-e202c1a30fa6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:35:45 crc kubenswrapper[5025]: I1007 08:35:45.968239 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92870d9c-dcf8-45ad-af2b-9e827d319b8d-kube-api-access-288p9" (OuterVolumeSpecName: "kube-api-access-288p9") pod "92870d9c-dcf8-45ad-af2b-9e827d319b8d" (UID: "92870d9c-dcf8-45ad-af2b-9e827d319b8d"). InnerVolumeSpecName "kube-api-access-288p9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:35:45 crc kubenswrapper[5025]: I1007 08:35:45.978704 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"92870d9c-dcf8-45ad-af2b-9e827d319b8d","Type":"ContainerDied","Data":"3bec688b31cbd159c7dc06b969ad4f90f42ea5657ea7637f5cc7f3f535028d68"} Oct 07 08:35:45 crc kubenswrapper[5025]: I1007 08:35:45.978755 5025 scope.go:117] "RemoveContainer" containerID="71b89e941333f74d8472e6dcbb081fd3f0a629b202d131e64c477de5cc046c30" Oct 07 08:35:45 crc kubenswrapper[5025]: I1007 08:35:45.978888 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 07 08:35:45 crc kubenswrapper[5025]: I1007 08:35:45.985022 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe96ba85-9487-44de-87f3-e202c1a30fa6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fe96ba85-9487-44de-87f3-e202c1a30fa6" (UID: "fe96ba85-9487-44de-87f3-e202c1a30fa6"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:35:45 crc kubenswrapper[5025]: I1007 08:35:45.997091 5025 generic.go:334] "Generic (PLEG): container finished" podID="c0b57ea4-f1b6-47f3-adef-97c26b470477" containerID="708768a2c94adfa8c4edde2831f01fcbd06ad80d55be39ff0b7183840ed0d8a8" exitCode=0 Oct 07 08:35:45 crc kubenswrapper[5025]: I1007 08:35:45.997176 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b6897fb44-5d68h" event={"ID":"c0b57ea4-f1b6-47f3-adef-97c26b470477","Type":"ContainerDied","Data":"708768a2c94adfa8c4edde2831f01fcbd06ad80d55be39ff0b7183840ed0d8a8"} Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.003551 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"d13fcf27-6664-430e-b9ac-81ff65769a0c","Type":"ContainerStarted","Data":"39701f3461ec57f87ded44047ecffc35edf679b1b33f29a9dedea8b73829426d"} Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.004731 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6b6897fb44-5d68h" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.006133 5025 scope.go:117] "RemoveContainer" containerID="d46efd3f4e0ed3fe4fc4e27ff2c2fa3b7d03ca894a418941fdca5bfe63161f31" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.006570 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-vb7rc" event={"ID":"fe96ba85-9487-44de-87f3-e202c1a30fa6","Type":"ContainerDied","Data":"701e5689bcc444cef57fef16888a655ed4dc9c9b617cd1bbd260536f5799e2c5"} Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.006640 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-vb7rc" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.030118 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.351221838 podStartE2EDuration="17.030101743s" podCreationTimestamp="2025-10-07 08:35:29 +0000 UTC" firstStartedPulling="2025-10-07 08:35:30.728433387 +0000 UTC m=+1137.537747531" lastFinishedPulling="2025-10-07 08:35:45.407313292 +0000 UTC m=+1152.216627436" observedRunningTime="2025-10-07 08:35:46.028242935 +0000 UTC m=+1152.837557079" watchObservedRunningTime="2025-10-07 08:35:46.030101743 +0000 UTC m=+1152.839415887" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.047350 5025 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe96ba85-9487-44de-87f3-e202c1a30fa6-config\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.047376 5025 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe96ba85-9487-44de-87f3-e202c1a30fa6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.047386 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-288p9\" (UniqueName: \"kubernetes.io/projected/92870d9c-dcf8-45ad-af2b-9e827d319b8d-kube-api-access-288p9\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.047394 5025 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92870d9c-dcf8-45ad-af2b-9e827d319b8d-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.047403 5025 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe96ba85-9487-44de-87f3-e202c1a30fa6-ovsdbserver-nb\") on 
node \"crc\" DevicePath \"\"" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.047410 5025 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92870d9c-dcf8-45ad-af2b-9e827d319b8d-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.088046 5025 scope.go:117] "RemoveContainer" containerID="90e3d502a44efb5858a41705ed907f08bca10191ff4aaedc314d9f26c985b201" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.108048 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92870d9c-dcf8-45ad-af2b-9e827d319b8d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "92870d9c-dcf8-45ad-af2b-9e827d319b8d" (UID: "92870d9c-dcf8-45ad-af2b-9e827d319b8d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.122963 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-vb7rc"] Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.141123 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-vb7rc"] Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.152734 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-pc6np"] Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.153234 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c0b57ea4-f1b6-47f3-adef-97c26b470477-httpd-config\") pod \"c0b57ea4-f1b6-47f3-adef-97c26b470477\" (UID: \"c0b57ea4-f1b6-47f3-adef-97c26b470477\") " Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.153367 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c0b57ea4-f1b6-47f3-adef-97c26b470477-combined-ca-bundle\") pod \"c0b57ea4-f1b6-47f3-adef-97c26b470477\" (UID: \"c0b57ea4-f1b6-47f3-adef-97c26b470477\") " Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.153386 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0b57ea4-f1b6-47f3-adef-97c26b470477-ovndb-tls-certs\") pod \"c0b57ea4-f1b6-47f3-adef-97c26b470477\" (UID: \"c0b57ea4-f1b6-47f3-adef-97c26b470477\") " Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.153404 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5kfz\" (UniqueName: \"kubernetes.io/projected/c0b57ea4-f1b6-47f3-adef-97c26b470477-kube-api-access-h5kfz\") pod \"c0b57ea4-f1b6-47f3-adef-97c26b470477\" (UID: \"c0b57ea4-f1b6-47f3-adef-97c26b470477\") " Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.153466 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c0b57ea4-f1b6-47f3-adef-97c26b470477-config\") pod \"c0b57ea4-f1b6-47f3-adef-97c26b470477\" (UID: \"c0b57ea4-f1b6-47f3-adef-97c26b470477\") " Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.154022 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92870d9c-dcf8-45ad-af2b-9e827d319b8d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.159507 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0b57ea4-f1b6-47f3-adef-97c26b470477-kube-api-access-h5kfz" (OuterVolumeSpecName: "kube-api-access-h5kfz") pod "c0b57ea4-f1b6-47f3-adef-97c26b470477" (UID: "c0b57ea4-f1b6-47f3-adef-97c26b470477"). InnerVolumeSpecName "kube-api-access-h5kfz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.159633 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0b57ea4-f1b6-47f3-adef-97c26b470477-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "c0b57ea4-f1b6-47f3-adef-97c26b470477" (UID: "c0b57ea4-f1b6-47f3-adef-97c26b470477"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.175350 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92870d9c-dcf8-45ad-af2b-9e827d319b8d-config-data" (OuterVolumeSpecName: "config-data") pod "92870d9c-dcf8-45ad-af2b-9e827d319b8d" (UID: "92870d9c-dcf8-45ad-af2b-9e827d319b8d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:35:46 crc kubenswrapper[5025]: W1007 08:35:46.187582 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf250487_bcbd_4fc3_9d87_8e813ee7b39a.slice/crio-82295bf4b7b864c55d8e0ec57492c76296ad4048014fb9fa416c241e893879fd WatchSource:0}: Error finding container 82295bf4b7b864c55d8e0ec57492c76296ad4048014fb9fa416c241e893879fd: Status 404 returned error can't find the container with id 82295bf4b7b864c55d8e0ec57492c76296ad4048014fb9fa416c241e893879fd Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.197513 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.253753 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0b57ea4-f1b6-47f3-adef-97c26b470477-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c0b57ea4-f1b6-47f3-adef-97c26b470477" (UID: "c0b57ea4-f1b6-47f3-adef-97c26b470477"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.255763 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0b57ea4-f1b6-47f3-adef-97c26b470477-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.255789 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5kfz\" (UniqueName: \"kubernetes.io/projected/c0b57ea4-f1b6-47f3-adef-97c26b470477-kube-api-access-h5kfz\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.255800 5025 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c0b57ea4-f1b6-47f3-adef-97c26b470477-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.255811 5025 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92870d9c-dcf8-45ad-af2b-9e827d319b8d-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.268302 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0b57ea4-f1b6-47f3-adef-97c26b470477-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "c0b57ea4-f1b6-47f3-adef-97c26b470477" (UID: "c0b57ea4-f1b6-47f3-adef-97c26b470477"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.308904 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0b57ea4-f1b6-47f3-adef-97c26b470477-config" (OuterVolumeSpecName: "config") pod "c0b57ea4-f1b6-47f3-adef-97c26b470477" (UID: "c0b57ea4-f1b6-47f3-adef-97c26b470477"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.375418 5025 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0b57ea4-f1b6-47f3-adef-97c26b470477-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.375464 5025 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/c0b57ea4-f1b6-47f3-adef-97c26b470477-config\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.380944 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-dbbfh"] Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.398165 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-w5ggj"] Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.399220 5025 scope.go:117] "RemoveContainer" containerID="caa08dee4e8bf4e7254f9f1c798ce5cb8f708c653e45d6f2fc4d545702b7e504" Oct 07 08:35:46 crc kubenswrapper[5025]: W1007 08:35:46.405445 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda72ad49a_646a_40cf_a5bb_7bdecdf2153c.slice/crio-b87fea9711ded3ea6bfa512b178079fecaa9e5fcdcd3189e2381815734e97331 WatchSource:0}: Error finding container b87fea9711ded3ea6bfa512b178079fecaa9e5fcdcd3189e2381815734e97331: Status 404 returned error can't find the container with id b87fea9711ded3ea6bfa512b178079fecaa9e5fcdcd3189e2381815734e97331 Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.427088 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.449963 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.458268 5025 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 08:35:46 crc kubenswrapper[5025]: E1007 08:35:46.459191 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92870d9c-dcf8-45ad-af2b-9e827d319b8d" containerName="probe" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.459212 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="92870d9c-dcf8-45ad-af2b-9e827d319b8d" containerName="probe" Oct 07 08:35:46 crc kubenswrapper[5025]: E1007 08:35:46.459238 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0b57ea4-f1b6-47f3-adef-97c26b470477" containerName="neutron-httpd" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.459246 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0b57ea4-f1b6-47f3-adef-97c26b470477" containerName="neutron-httpd" Oct 07 08:35:46 crc kubenswrapper[5025]: E1007 08:35:46.459260 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe96ba85-9487-44de-87f3-e202c1a30fa6" containerName="init" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.459271 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe96ba85-9487-44de-87f3-e202c1a30fa6" containerName="init" Oct 07 08:35:46 crc kubenswrapper[5025]: E1007 08:35:46.459286 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe96ba85-9487-44de-87f3-e202c1a30fa6" containerName="dnsmasq-dns" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.459294 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe96ba85-9487-44de-87f3-e202c1a30fa6" containerName="dnsmasq-dns" Oct 07 08:35:46 crc kubenswrapper[5025]: E1007 08:35:46.459319 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0b57ea4-f1b6-47f3-adef-97c26b470477" containerName="neutron-api" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.459327 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0b57ea4-f1b6-47f3-adef-97c26b470477" 
containerName="neutron-api" Oct 07 08:35:46 crc kubenswrapper[5025]: E1007 08:35:46.459346 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92870d9c-dcf8-45ad-af2b-9e827d319b8d" containerName="cinder-scheduler" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.459353 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="92870d9c-dcf8-45ad-af2b-9e827d319b8d" containerName="cinder-scheduler" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.459601 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe96ba85-9487-44de-87f3-e202c1a30fa6" containerName="dnsmasq-dns" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.459626 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="92870d9c-dcf8-45ad-af2b-9e827d319b8d" containerName="cinder-scheduler" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.459645 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="92870d9c-dcf8-45ad-af2b-9e827d319b8d" containerName="probe" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.459658 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0b57ea4-f1b6-47f3-adef-97c26b470477" containerName="neutron-api" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.459673 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0b57ea4-f1b6-47f3-adef-97c26b470477" containerName="neutron-httpd" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.461109 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.471265 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.487718 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.578640 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmxnh\" (UniqueName: \"kubernetes.io/projected/9fc34125-4736-4467-bf0e-ec3b211e6d13-kube-api-access-cmxnh\") pod \"cinder-scheduler-0\" (UID: \"9fc34125-4736-4467-bf0e-ec3b211e6d13\") " pod="openstack/cinder-scheduler-0" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.578759 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9fc34125-4736-4467-bf0e-ec3b211e6d13-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9fc34125-4736-4467-bf0e-ec3b211e6d13\") " pod="openstack/cinder-scheduler-0" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.578779 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9fc34125-4736-4467-bf0e-ec3b211e6d13-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9fc34125-4736-4467-bf0e-ec3b211e6d13\") " pod="openstack/cinder-scheduler-0" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.578812 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fc34125-4736-4467-bf0e-ec3b211e6d13-config-data\") pod \"cinder-scheduler-0\" (UID: \"9fc34125-4736-4467-bf0e-ec3b211e6d13\") " pod="openstack/cinder-scheduler-0" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 
08:35:46.578853 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fc34125-4736-4467-bf0e-ec3b211e6d13-scripts\") pod \"cinder-scheduler-0\" (UID: \"9fc34125-4736-4467-bf0e-ec3b211e6d13\") " pod="openstack/cinder-scheduler-0" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.578871 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fc34125-4736-4467-bf0e-ec3b211e6d13-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9fc34125-4736-4467-bf0e-ec3b211e6d13\") " pod="openstack/cinder-scheduler-0" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.618715 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6c4bb756cf-6b9fc" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.622299 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6c4bb756cf-6b9fc" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.680476 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9fc34125-4736-4467-bf0e-ec3b211e6d13-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9fc34125-4736-4467-bf0e-ec3b211e6d13\") " pod="openstack/cinder-scheduler-0" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.680534 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9fc34125-4736-4467-bf0e-ec3b211e6d13-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9fc34125-4736-4467-bf0e-ec3b211e6d13\") " pod="openstack/cinder-scheduler-0" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.680592 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9fc34125-4736-4467-bf0e-ec3b211e6d13-config-data\") pod \"cinder-scheduler-0\" (UID: \"9fc34125-4736-4467-bf0e-ec3b211e6d13\") " pod="openstack/cinder-scheduler-0" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.680650 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fc34125-4736-4467-bf0e-ec3b211e6d13-scripts\") pod \"cinder-scheduler-0\" (UID: \"9fc34125-4736-4467-bf0e-ec3b211e6d13\") " pod="openstack/cinder-scheduler-0" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.680676 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fc34125-4736-4467-bf0e-ec3b211e6d13-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9fc34125-4736-4467-bf0e-ec3b211e6d13\") " pod="openstack/cinder-scheduler-0" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.680736 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmxnh\" (UniqueName: \"kubernetes.io/projected/9fc34125-4736-4467-bf0e-ec3b211e6d13-kube-api-access-cmxnh\") pod \"cinder-scheduler-0\" (UID: \"9fc34125-4736-4467-bf0e-ec3b211e6d13\") " pod="openstack/cinder-scheduler-0" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.681409 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9fc34125-4736-4467-bf0e-ec3b211e6d13-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9fc34125-4736-4467-bf0e-ec3b211e6d13\") " pod="openstack/cinder-scheduler-0" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.686167 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fc34125-4736-4467-bf0e-ec3b211e6d13-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9fc34125-4736-4467-bf0e-ec3b211e6d13\") " 
pod="openstack/cinder-scheduler-0" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.686680 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fc34125-4736-4467-bf0e-ec3b211e6d13-config-data\") pod \"cinder-scheduler-0\" (UID: \"9fc34125-4736-4467-bf0e-ec3b211e6d13\") " pod="openstack/cinder-scheduler-0" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.689867 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fc34125-4736-4467-bf0e-ec3b211e6d13-scripts\") pod \"cinder-scheduler-0\" (UID: \"9fc34125-4736-4467-bf0e-ec3b211e6d13\") " pod="openstack/cinder-scheduler-0" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.690249 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9fc34125-4736-4467-bf0e-ec3b211e6d13-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9fc34125-4736-4467-bf0e-ec3b211e6d13\") " pod="openstack/cinder-scheduler-0" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.700036 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmxnh\" (UniqueName: \"kubernetes.io/projected/9fc34125-4736-4467-bf0e-ec3b211e6d13-kube-api-access-cmxnh\") pod \"cinder-scheduler-0\" (UID: \"9fc34125-4736-4467-bf0e-ec3b211e6d13\") " pod="openstack/cinder-scheduler-0" Oct 07 08:35:46 crc kubenswrapper[5025]: I1007 08:35:46.804194 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 07 08:35:47 crc kubenswrapper[5025]: I1007 08:35:47.023821 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2227c9b-c473-4ec5-8a72-4b7ac8565343","Type":"ContainerStarted","Data":"161a262094678223fa581c27f35091528f0e19b75605ac841328f280aac7d7f7"} Oct 07 08:35:47 crc kubenswrapper[5025]: I1007 08:35:47.024194 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2227c9b-c473-4ec5-8a72-4b7ac8565343","Type":"ContainerStarted","Data":"31733b73281bbf4675d74d07b95693be7450096d64b9ea778238f077c12e5354"} Oct 07 08:35:47 crc kubenswrapper[5025]: I1007 08:35:47.027765 5025 generic.go:334] "Generic (PLEG): container finished" podID="bf250487-bcbd-4fc3-9d87-8e813ee7b39a" containerID="b203a6d15ef792c4fe41369bbb2b58b37c71bfbaf5b3774eecf764a02b611548" exitCode=0 Oct 07 08:35:47 crc kubenswrapper[5025]: I1007 08:35:47.027817 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-pc6np" event={"ID":"bf250487-bcbd-4fc3-9d87-8e813ee7b39a","Type":"ContainerDied","Data":"b203a6d15ef792c4fe41369bbb2b58b37c71bfbaf5b3774eecf764a02b611548"} Oct 07 08:35:47 crc kubenswrapper[5025]: I1007 08:35:47.027836 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-pc6np" event={"ID":"bf250487-bcbd-4fc3-9d87-8e813ee7b39a","Type":"ContainerStarted","Data":"82295bf4b7b864c55d8e0ec57492c76296ad4048014fb9fa416c241e893879fd"} Oct 07 08:35:47 crc kubenswrapper[5025]: I1007 08:35:47.071738 5025 generic.go:334] "Generic (PLEG): container finished" podID="a72ad49a-646a-40cf-a5bb-7bdecdf2153c" containerID="2935be7adab1bdb854893d962c2c032e6db498d9775a70e4571b34552bcc75c4" exitCode=0 Oct 07 08:35:47 crc kubenswrapper[5025]: I1007 08:35:47.072101 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-w5ggj" 
event={"ID":"a72ad49a-646a-40cf-a5bb-7bdecdf2153c","Type":"ContainerDied","Data":"2935be7adab1bdb854893d962c2c032e6db498d9775a70e4571b34552bcc75c4"} Oct 07 08:35:47 crc kubenswrapper[5025]: I1007 08:35:47.072130 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-w5ggj" event={"ID":"a72ad49a-646a-40cf-a5bb-7bdecdf2153c","Type":"ContainerStarted","Data":"b87fea9711ded3ea6bfa512b178079fecaa9e5fcdcd3189e2381815734e97331"} Oct 07 08:35:47 crc kubenswrapper[5025]: I1007 08:35:47.097809 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b6897fb44-5d68h" event={"ID":"c0b57ea4-f1b6-47f3-adef-97c26b470477","Type":"ContainerDied","Data":"00cd8b50dff96c95e0ac1606c3434063483969f0c13cf37e267b3f00f6219c63"} Oct 07 08:35:47 crc kubenswrapper[5025]: I1007 08:35:47.097862 5025 scope.go:117] "RemoveContainer" containerID="51088a15c259b4db87cd12b21a346dd16dc9bec3d5bf784b38768e7c561043ba" Oct 07 08:35:47 crc kubenswrapper[5025]: I1007 08:35:47.097960 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6b6897fb44-5d68h" Oct 07 08:35:47 crc kubenswrapper[5025]: I1007 08:35:47.130220 5025 generic.go:334] "Generic (PLEG): container finished" podID="fffe9def-a18d-4a98-bb4c-f47faeb2fae8" containerID="30c030347bd0a19a8d78643a4fcff89f0ca1bda4a5dfc49485c83c488909adad" exitCode=0 Oct 07 08:35:47 crc kubenswrapper[5025]: I1007 08:35:47.131519 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-dbbfh" event={"ID":"fffe9def-a18d-4a98-bb4c-f47faeb2fae8","Type":"ContainerDied","Data":"30c030347bd0a19a8d78643a4fcff89f0ca1bda4a5dfc49485c83c488909adad"} Oct 07 08:35:47 crc kubenswrapper[5025]: I1007 08:35:47.131567 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-dbbfh" event={"ID":"fffe9def-a18d-4a98-bb4c-f47faeb2fae8","Type":"ContainerStarted","Data":"f9be34a0f3e78d851fd73e2da4d1eaf27d6e93e8ce3b698cbc288734921a1b8e"} Oct 07 08:35:47 crc kubenswrapper[5025]: I1007 08:35:47.182722 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6b6897fb44-5d68h"] Oct 07 08:35:47 crc kubenswrapper[5025]: I1007 08:35:47.183746 5025 scope.go:117] "RemoveContainer" containerID="708768a2c94adfa8c4edde2831f01fcbd06ad80d55be39ff0b7183840ed0d8a8" Oct 07 08:35:47 crc kubenswrapper[5025]: I1007 08:35:47.189107 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6b6897fb44-5d68h"] Oct 07 08:35:47 crc kubenswrapper[5025]: I1007 08:35:47.294498 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 08:35:47 crc kubenswrapper[5025]: W1007 08:35:47.300724 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9fc34125_4736_4467_bf0e_ec3b211e6d13.slice/crio-1a42cfeb52c0ed9962d03e8cd19d8def09a7c1dd9afe1814618bde979fabfcef WatchSource:0}: Error finding container 
1a42cfeb52c0ed9962d03e8cd19d8def09a7c1dd9afe1814618bde979fabfcef: Status 404 returned error can't find the container with id 1a42cfeb52c0ed9962d03e8cd19d8def09a7c1dd9afe1814618bde979fabfcef Oct 07 08:35:47 crc kubenswrapper[5025]: I1007 08:35:47.923120 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92870d9c-dcf8-45ad-af2b-9e827d319b8d" path="/var/lib/kubelet/pods/92870d9c-dcf8-45ad-af2b-9e827d319b8d/volumes" Oct 07 08:35:47 crc kubenswrapper[5025]: I1007 08:35:47.924089 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0b57ea4-f1b6-47f3-adef-97c26b470477" path="/var/lib/kubelet/pods/c0b57ea4-f1b6-47f3-adef-97c26b470477/volumes" Oct 07 08:35:47 crc kubenswrapper[5025]: I1007 08:35:47.924722 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe96ba85-9487-44de-87f3-e202c1a30fa6" path="/var/lib/kubelet/pods/fe96ba85-9487-44de-87f3-e202c1a30fa6/volumes" Oct 07 08:35:48 crc kubenswrapper[5025]: I1007 08:35:48.000731 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 07 08:35:48 crc kubenswrapper[5025]: I1007 08:35:48.203139 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9fc34125-4736-4467-bf0e-ec3b211e6d13","Type":"ContainerStarted","Data":"37a99151a2993186ee570500792634aa31edd00f179d56fe2993e992303f092f"} Oct 07 08:35:48 crc kubenswrapper[5025]: I1007 08:35:48.203463 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9fc34125-4736-4467-bf0e-ec3b211e6d13","Type":"ContainerStarted","Data":"1a42cfeb52c0ed9962d03e8cd19d8def09a7c1dd9afe1814618bde979fabfcef"} Oct 07 08:35:48 crc kubenswrapper[5025]: I1007 08:35:48.206893 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"d2227c9b-c473-4ec5-8a72-4b7ac8565343","Type":"ContainerStarted","Data":"0a983a3769aa7b400cb8ac81f8c014ad5bf07fdee8e67c576f7f037d633006f5"} Oct 07 08:35:48 crc kubenswrapper[5025]: I1007 08:35:48.788062 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-w5ggj" Oct 07 08:35:48 crc kubenswrapper[5025]: I1007 08:35:48.840204 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfzvc\" (UniqueName: \"kubernetes.io/projected/a72ad49a-646a-40cf-a5bb-7bdecdf2153c-kube-api-access-qfzvc\") pod \"a72ad49a-646a-40cf-a5bb-7bdecdf2153c\" (UID: \"a72ad49a-646a-40cf-a5bb-7bdecdf2153c\") " Oct 07 08:35:48 crc kubenswrapper[5025]: I1007 08:35:48.853735 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-dbbfh" Oct 07 08:35:48 crc kubenswrapper[5025]: I1007 08:35:48.885784 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a72ad49a-646a-40cf-a5bb-7bdecdf2153c-kube-api-access-qfzvc" (OuterVolumeSpecName: "kube-api-access-qfzvc") pod "a72ad49a-646a-40cf-a5bb-7bdecdf2153c" (UID: "a72ad49a-646a-40cf-a5bb-7bdecdf2153c"). InnerVolumeSpecName "kube-api-access-qfzvc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:35:48 crc kubenswrapper[5025]: I1007 08:35:48.943162 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgs4w\" (UniqueName: \"kubernetes.io/projected/fffe9def-a18d-4a98-bb4c-f47faeb2fae8-kube-api-access-qgs4w\") pod \"fffe9def-a18d-4a98-bb4c-f47faeb2fae8\" (UID: \"fffe9def-a18d-4a98-bb4c-f47faeb2fae8\") " Oct 07 08:35:48 crc kubenswrapper[5025]: I1007 08:35:48.947317 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfzvc\" (UniqueName: \"kubernetes.io/projected/a72ad49a-646a-40cf-a5bb-7bdecdf2153c-kube-api-access-qfzvc\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:48 crc kubenswrapper[5025]: I1007 08:35:48.965778 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fffe9def-a18d-4a98-bb4c-f47faeb2fae8-kube-api-access-qgs4w" (OuterVolumeSpecName: "kube-api-access-qgs4w") pod "fffe9def-a18d-4a98-bb4c-f47faeb2fae8" (UID: "fffe9def-a18d-4a98-bb4c-f47faeb2fae8"). InnerVolumeSpecName "kube-api-access-qgs4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:35:48 crc kubenswrapper[5025]: I1007 08:35:48.973984 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-pc6np" Oct 07 08:35:49 crc kubenswrapper[5025]: I1007 08:35:49.048271 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p77nx\" (UniqueName: \"kubernetes.io/projected/bf250487-bcbd-4fc3-9d87-8e813ee7b39a-kube-api-access-p77nx\") pod \"bf250487-bcbd-4fc3-9d87-8e813ee7b39a\" (UID: \"bf250487-bcbd-4fc3-9d87-8e813ee7b39a\") " Oct 07 08:35:49 crc kubenswrapper[5025]: I1007 08:35:49.048667 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgs4w\" (UniqueName: \"kubernetes.io/projected/fffe9def-a18d-4a98-bb4c-f47faeb2fae8-kube-api-access-qgs4w\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:49 crc kubenswrapper[5025]: I1007 08:35:49.051516 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf250487-bcbd-4fc3-9d87-8e813ee7b39a-kube-api-access-p77nx" (OuterVolumeSpecName: "kube-api-access-p77nx") pod "bf250487-bcbd-4fc3-9d87-8e813ee7b39a" (UID: "bf250487-bcbd-4fc3-9d87-8e813ee7b39a"). InnerVolumeSpecName "kube-api-access-p77nx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:35:49 crc kubenswrapper[5025]: I1007 08:35:49.150953 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p77nx\" (UniqueName: \"kubernetes.io/projected/bf250487-bcbd-4fc3-9d87-8e813ee7b39a-kube-api-access-p77nx\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:49 crc kubenswrapper[5025]: I1007 08:35:49.217325 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-w5ggj" event={"ID":"a72ad49a-646a-40cf-a5bb-7bdecdf2153c","Type":"ContainerDied","Data":"b87fea9711ded3ea6bfa512b178079fecaa9e5fcdcd3189e2381815734e97331"} Oct 07 08:35:49 crc kubenswrapper[5025]: I1007 08:35:49.217367 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b87fea9711ded3ea6bfa512b178079fecaa9e5fcdcd3189e2381815734e97331" Oct 07 08:35:49 crc kubenswrapper[5025]: I1007 08:35:49.217432 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-w5ggj" Oct 07 08:35:49 crc kubenswrapper[5025]: I1007 08:35:49.223342 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9fc34125-4736-4467-bf0e-ec3b211e6d13","Type":"ContainerStarted","Data":"1e3e97ecbee1eb5fa2d4e0c99b1a8d7a3e6632904a37d6f0cbcf4e9c9f40fc26"} Oct 07 08:35:49 crc kubenswrapper[5025]: I1007 08:35:49.226822 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-dbbfh" event={"ID":"fffe9def-a18d-4a98-bb4c-f47faeb2fae8","Type":"ContainerDied","Data":"f9be34a0f3e78d851fd73e2da4d1eaf27d6e93e8ce3b698cbc288734921a1b8e"} Oct 07 08:35:49 crc kubenswrapper[5025]: I1007 08:35:49.227024 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9be34a0f3e78d851fd73e2da4d1eaf27d6e93e8ce3b698cbc288734921a1b8e" Oct 07 08:35:49 crc kubenswrapper[5025]: I1007 08:35:49.226895 5025 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-api-db-create-dbbfh" Oct 07 08:35:49 crc kubenswrapper[5025]: I1007 08:35:49.232226 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2227c9b-c473-4ec5-8a72-4b7ac8565343","Type":"ContainerStarted","Data":"f8dc9e42cafacab892f08ac4e5b5f4287158b736aac9a42b6c61698b776e4e82"} Oct 07 08:35:49 crc kubenswrapper[5025]: I1007 08:35:49.233850 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-pc6np" event={"ID":"bf250487-bcbd-4fc3-9d87-8e813ee7b39a","Type":"ContainerDied","Data":"82295bf4b7b864c55d8e0ec57492c76296ad4048014fb9fa416c241e893879fd"} Oct 07 08:35:49 crc kubenswrapper[5025]: I1007 08:35:49.233874 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82295bf4b7b864c55d8e0ec57492c76296ad4048014fb9fa416c241e893879fd" Oct 07 08:35:49 crc kubenswrapper[5025]: I1007 08:35:49.233929 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-pc6np" Oct 07 08:35:49 crc kubenswrapper[5025]: I1007 08:35:49.832154 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.83213393 podStartE2EDuration="3.83213393s" podCreationTimestamp="2025-10-07 08:35:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:35:49.252704274 +0000 UTC m=+1156.062018418" watchObservedRunningTime="2025-10-07 08:35:49.83213393 +0000 UTC m=+1156.641448074" Oct 07 08:35:50 crc kubenswrapper[5025]: I1007 08:35:50.248810 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2227c9b-c473-4ec5-8a72-4b7ac8565343","Type":"ContainerStarted","Data":"592099704ca6c91efa50a4015aacc69ad2d39665607049cb093598ff11692aba"} Oct 07 08:35:50 crc kubenswrapper[5025]: I1007 08:35:50.249068 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d2227c9b-c473-4ec5-8a72-4b7ac8565343" containerName="ceilometer-central-agent" containerID="cri-o://161a262094678223fa581c27f35091528f0e19b75605ac841328f280aac7d7f7" gracePeriod=30 Oct 07 08:35:50 crc kubenswrapper[5025]: I1007 08:35:50.249116 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d2227c9b-c473-4ec5-8a72-4b7ac8565343" containerName="sg-core" containerID="cri-o://f8dc9e42cafacab892f08ac4e5b5f4287158b736aac9a42b6c61698b776e4e82" gracePeriod=30 Oct 07 08:35:50 crc kubenswrapper[5025]: I1007 08:35:50.249094 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d2227c9b-c473-4ec5-8a72-4b7ac8565343" containerName="proxy-httpd" containerID="cri-o://592099704ca6c91efa50a4015aacc69ad2d39665607049cb093598ff11692aba" gracePeriod=30 Oct 07 08:35:50 crc 
kubenswrapper[5025]: I1007 08:35:50.249158 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d2227c9b-c473-4ec5-8a72-4b7ac8565343" containerName="ceilometer-notification-agent" containerID="cri-o://0a983a3769aa7b400cb8ac81f8c014ad5bf07fdee8e67c576f7f037d633006f5" gracePeriod=30 Oct 07 08:35:50 crc kubenswrapper[5025]: I1007 08:35:50.280919 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=8.703211302 podStartE2EDuration="12.280902139s" podCreationTimestamp="2025-10-07 08:35:38 +0000 UTC" firstStartedPulling="2025-10-07 08:35:46.212087248 +0000 UTC m=+1153.021401392" lastFinishedPulling="2025-10-07 08:35:49.789778085 +0000 UTC m=+1156.599092229" observedRunningTime="2025-10-07 08:35:50.275697545 +0000 UTC m=+1157.085011699" watchObservedRunningTime="2025-10-07 08:35:50.280902139 +0000 UTC m=+1157.090216283" Oct 07 08:35:51 crc kubenswrapper[5025]: I1007 08:35:51.262093 5025 generic.go:334] "Generic (PLEG): container finished" podID="d2227c9b-c473-4ec5-8a72-4b7ac8565343" containerID="592099704ca6c91efa50a4015aacc69ad2d39665607049cb093598ff11692aba" exitCode=0 Oct 07 08:35:51 crc kubenswrapper[5025]: I1007 08:35:51.262135 5025 generic.go:334] "Generic (PLEG): container finished" podID="d2227c9b-c473-4ec5-8a72-4b7ac8565343" containerID="f8dc9e42cafacab892f08ac4e5b5f4287158b736aac9a42b6c61698b776e4e82" exitCode=2 Oct 07 08:35:51 crc kubenswrapper[5025]: I1007 08:35:51.262146 5025 generic.go:334] "Generic (PLEG): container finished" podID="d2227c9b-c473-4ec5-8a72-4b7ac8565343" containerID="0a983a3769aa7b400cb8ac81f8c014ad5bf07fdee8e67c576f7f037d633006f5" exitCode=0 Oct 07 08:35:51 crc kubenswrapper[5025]: I1007 08:35:51.262154 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"d2227c9b-c473-4ec5-8a72-4b7ac8565343","Type":"ContainerDied","Data":"592099704ca6c91efa50a4015aacc69ad2d39665607049cb093598ff11692aba"} Oct 07 08:35:51 crc kubenswrapper[5025]: I1007 08:35:51.262185 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2227c9b-c473-4ec5-8a72-4b7ac8565343","Type":"ContainerDied","Data":"f8dc9e42cafacab892f08ac4e5b5f4287158b736aac9a42b6c61698b776e4e82"} Oct 07 08:35:51 crc kubenswrapper[5025]: I1007 08:35:51.262196 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2227c9b-c473-4ec5-8a72-4b7ac8565343","Type":"ContainerDied","Data":"0a983a3769aa7b400cb8ac81f8c014ad5bf07fdee8e67c576f7f037d633006f5"} Oct 07 08:35:51 crc kubenswrapper[5025]: I1007 08:35:51.804754 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 07 08:35:52 crc kubenswrapper[5025]: I1007 08:35:52.242207 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-696a-account-create-ht4mf"] Oct 07 08:35:52 crc kubenswrapper[5025]: E1007 08:35:52.242857 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fffe9def-a18d-4a98-bb4c-f47faeb2fae8" containerName="mariadb-database-create" Oct 07 08:35:52 crc kubenswrapper[5025]: I1007 08:35:52.242932 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="fffe9def-a18d-4a98-bb4c-f47faeb2fae8" containerName="mariadb-database-create" Oct 07 08:35:52 crc kubenswrapper[5025]: E1007 08:35:52.243016 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a72ad49a-646a-40cf-a5bb-7bdecdf2153c" containerName="mariadb-database-create" Oct 07 08:35:52 crc kubenswrapper[5025]: I1007 08:35:52.243071 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="a72ad49a-646a-40cf-a5bb-7bdecdf2153c" containerName="mariadb-database-create" Oct 07 08:35:52 crc kubenswrapper[5025]: E1007 08:35:52.243147 5025 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="bf250487-bcbd-4fc3-9d87-8e813ee7b39a" containerName="mariadb-database-create" Oct 07 08:35:52 crc kubenswrapper[5025]: I1007 08:35:52.243211 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf250487-bcbd-4fc3-9d87-8e813ee7b39a" containerName="mariadb-database-create" Oct 07 08:35:52 crc kubenswrapper[5025]: I1007 08:35:52.243425 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="a72ad49a-646a-40cf-a5bb-7bdecdf2153c" containerName="mariadb-database-create" Oct 07 08:35:52 crc kubenswrapper[5025]: I1007 08:35:52.243502 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf250487-bcbd-4fc3-9d87-8e813ee7b39a" containerName="mariadb-database-create" Oct 07 08:35:52 crc kubenswrapper[5025]: I1007 08:35:52.243593 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="fffe9def-a18d-4a98-bb4c-f47faeb2fae8" containerName="mariadb-database-create" Oct 07 08:35:52 crc kubenswrapper[5025]: I1007 08:35:52.244198 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-696a-account-create-ht4mf" Oct 07 08:35:52 crc kubenswrapper[5025]: I1007 08:35:52.246438 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 07 08:35:52 crc kubenswrapper[5025]: I1007 08:35:52.250083 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-696a-account-create-ht4mf"] Oct 07 08:35:52 crc kubenswrapper[5025]: I1007 08:35:52.341599 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjk6t\" (UniqueName: \"kubernetes.io/projected/25e6f419-f7bd-44d0-a377-488d155d7adc-kube-api-access-xjk6t\") pod \"nova-api-696a-account-create-ht4mf\" (UID: \"25e6f419-f7bd-44d0-a377-488d155d7adc\") " pod="openstack/nova-api-696a-account-create-ht4mf" Oct 07 08:35:52 crc kubenswrapper[5025]: I1007 08:35:52.442949 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjk6t\" (UniqueName: \"kubernetes.io/projected/25e6f419-f7bd-44d0-a377-488d155d7adc-kube-api-access-xjk6t\") pod \"nova-api-696a-account-create-ht4mf\" (UID: \"25e6f419-f7bd-44d0-a377-488d155d7adc\") " pod="openstack/nova-api-696a-account-create-ht4mf" Oct 07 08:35:52 crc kubenswrapper[5025]: I1007 08:35:52.447294 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-7d54-account-create-5kmps"] Oct 07 08:35:52 crc kubenswrapper[5025]: I1007 08:35:52.448451 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-7d54-account-create-5kmps" Oct 07 08:35:52 crc kubenswrapper[5025]: I1007 08:35:52.451398 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 07 08:35:52 crc kubenswrapper[5025]: I1007 08:35:52.468372 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjk6t\" (UniqueName: \"kubernetes.io/projected/25e6f419-f7bd-44d0-a377-488d155d7adc-kube-api-access-xjk6t\") pod \"nova-api-696a-account-create-ht4mf\" (UID: \"25e6f419-f7bd-44d0-a377-488d155d7adc\") " pod="openstack/nova-api-696a-account-create-ht4mf" Oct 07 08:35:52 crc kubenswrapper[5025]: I1007 08:35:52.501605 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-7d54-account-create-5kmps"] Oct 07 08:35:52 crc kubenswrapper[5025]: I1007 08:35:52.561705 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-696a-account-create-ht4mf" Oct 07 08:35:52 crc kubenswrapper[5025]: I1007 08:35:52.648734 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4jzp\" (UniqueName: \"kubernetes.io/projected/c4e34060-f0fe-4d2e-8892-4530baf597b7-kube-api-access-r4jzp\") pod \"nova-cell0-7d54-account-create-5kmps\" (UID: \"c4e34060-f0fe-4d2e-8892-4530baf597b7\") " pod="openstack/nova-cell0-7d54-account-create-5kmps" Oct 07 08:35:52 crc kubenswrapper[5025]: I1007 08:35:52.658273 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-50ed-account-create-dhzfq"] Oct 07 08:35:52 crc kubenswrapper[5025]: I1007 08:35:52.659434 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-50ed-account-create-dhzfq" Oct 07 08:35:52 crc kubenswrapper[5025]: I1007 08:35:52.669973 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-50ed-account-create-dhzfq"] Oct 07 08:35:52 crc kubenswrapper[5025]: I1007 08:35:52.689403 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 07 08:35:52 crc kubenswrapper[5025]: I1007 08:35:52.751230 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4jzp\" (UniqueName: \"kubernetes.io/projected/c4e34060-f0fe-4d2e-8892-4530baf597b7-kube-api-access-r4jzp\") pod \"nova-cell0-7d54-account-create-5kmps\" (UID: \"c4e34060-f0fe-4d2e-8892-4530baf597b7\") " pod="openstack/nova-cell0-7d54-account-create-5kmps" Oct 07 08:35:52 crc kubenswrapper[5025]: I1007 08:35:52.793779 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4jzp\" (UniqueName: \"kubernetes.io/projected/c4e34060-f0fe-4d2e-8892-4530baf597b7-kube-api-access-r4jzp\") pod \"nova-cell0-7d54-account-create-5kmps\" (UID: \"c4e34060-f0fe-4d2e-8892-4530baf597b7\") " pod="openstack/nova-cell0-7d54-account-create-5kmps" Oct 07 08:35:52 crc kubenswrapper[5025]: I1007 08:35:52.853849 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmg4c\" (UniqueName: \"kubernetes.io/projected/e8aff795-d588-41a3-8921-44f29b96c5f1-kube-api-access-mmg4c\") pod \"nova-cell1-50ed-account-create-dhzfq\" (UID: \"e8aff795-d588-41a3-8921-44f29b96c5f1\") " pod="openstack/nova-cell1-50ed-account-create-dhzfq" Oct 07 08:35:52 crc kubenswrapper[5025]: I1007 08:35:52.938470 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-696a-account-create-ht4mf"] Oct 07 08:35:52 crc kubenswrapper[5025]: I1007 08:35:52.955320 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-mmg4c\" (UniqueName: \"kubernetes.io/projected/e8aff795-d588-41a3-8921-44f29b96c5f1-kube-api-access-mmg4c\") pod \"nova-cell1-50ed-account-create-dhzfq\" (UID: \"e8aff795-d588-41a3-8921-44f29b96c5f1\") " pod="openstack/nova-cell1-50ed-account-create-dhzfq" Oct 07 08:35:52 crc kubenswrapper[5025]: I1007 08:35:52.976696 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmg4c\" (UniqueName: \"kubernetes.io/projected/e8aff795-d588-41a3-8921-44f29b96c5f1-kube-api-access-mmg4c\") pod \"nova-cell1-50ed-account-create-dhzfq\" (UID: \"e8aff795-d588-41a3-8921-44f29b96c5f1\") " pod="openstack/nova-cell1-50ed-account-create-dhzfq" Oct 07 08:35:53 crc kubenswrapper[5025]: I1007 08:35:53.002471 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-50ed-account-create-dhzfq" Oct 07 08:35:53 crc kubenswrapper[5025]: I1007 08:35:53.082295 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-7d54-account-create-5kmps" Oct 07 08:35:53 crc kubenswrapper[5025]: I1007 08:35:53.295348 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-696a-account-create-ht4mf" event={"ID":"25e6f419-f7bd-44d0-a377-488d155d7adc","Type":"ContainerStarted","Data":"987fed2e2d33359b88022139a0f3601e1c2c261fcf53d6da4352c998b1235c43"} Oct 07 08:35:53 crc kubenswrapper[5025]: I1007 08:35:53.475858 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-50ed-account-create-dhzfq"] Oct 07 08:35:53 crc kubenswrapper[5025]: I1007 08:35:53.579466 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-7d54-account-create-5kmps"] Oct 07 08:35:53 crc kubenswrapper[5025]: W1007 08:35:53.584895 5025 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4e34060_f0fe_4d2e_8892_4530baf597b7.slice/crio-78bcfd7a034f0846f273ad095046ce1ede5d1a809ca7bf699b05e15ff1163f95 WatchSource:0}: Error finding container 78bcfd7a034f0846f273ad095046ce1ede5d1a809ca7bf699b05e15ff1163f95: Status 404 returned error can't find the container with id 78bcfd7a034f0846f273ad095046ce1ede5d1a809ca7bf699b05e15ff1163f95 Oct 07 08:35:54 crc kubenswrapper[5025]: I1007 08:35:54.306950 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7d54-account-create-5kmps" event={"ID":"c4e34060-f0fe-4d2e-8892-4530baf597b7","Type":"ContainerStarted","Data":"78bcfd7a034f0846f273ad095046ce1ede5d1a809ca7bf699b05e15ff1163f95"} Oct 07 08:35:54 crc kubenswrapper[5025]: I1007 08:35:54.308107 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-50ed-account-create-dhzfq" event={"ID":"e8aff795-d588-41a3-8921-44f29b96c5f1","Type":"ContainerStarted","Data":"9a5de0bc6f1527bba892309864058fd1c43817b1d9562c43b767b6c6cb7eb258"} Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.122435 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.297801 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2227c9b-c473-4ec5-8a72-4b7ac8565343-scripts\") pod \"d2227c9b-c473-4ec5-8a72-4b7ac8565343\" (UID: \"d2227c9b-c473-4ec5-8a72-4b7ac8565343\") " Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.297919 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2227c9b-c473-4ec5-8a72-4b7ac8565343-run-httpd\") pod \"d2227c9b-c473-4ec5-8a72-4b7ac8565343\" (UID: \"d2227c9b-c473-4ec5-8a72-4b7ac8565343\") " Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.297949 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2227c9b-c473-4ec5-8a72-4b7ac8565343-log-httpd\") pod \"d2227c9b-c473-4ec5-8a72-4b7ac8565343\" (UID: \"d2227c9b-c473-4ec5-8a72-4b7ac8565343\") " Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.297995 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9x4g\" (UniqueName: \"kubernetes.io/projected/d2227c9b-c473-4ec5-8a72-4b7ac8565343-kube-api-access-c9x4g\") pod \"d2227c9b-c473-4ec5-8a72-4b7ac8565343\" (UID: \"d2227c9b-c473-4ec5-8a72-4b7ac8565343\") " Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.298057 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2227c9b-c473-4ec5-8a72-4b7ac8565343-sg-core-conf-yaml\") pod \"d2227c9b-c473-4ec5-8a72-4b7ac8565343\" (UID: \"d2227c9b-c473-4ec5-8a72-4b7ac8565343\") " Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.298163 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d2227c9b-c473-4ec5-8a72-4b7ac8565343-config-data\") pod \"d2227c9b-c473-4ec5-8a72-4b7ac8565343\" (UID: \"d2227c9b-c473-4ec5-8a72-4b7ac8565343\") " Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.298218 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2227c9b-c473-4ec5-8a72-4b7ac8565343-combined-ca-bundle\") pod \"d2227c9b-c473-4ec5-8a72-4b7ac8565343\" (UID: \"d2227c9b-c473-4ec5-8a72-4b7ac8565343\") " Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.298445 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2227c9b-c473-4ec5-8a72-4b7ac8565343-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d2227c9b-c473-4ec5-8a72-4b7ac8565343" (UID: "d2227c9b-c473-4ec5-8a72-4b7ac8565343"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.298597 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2227c9b-c473-4ec5-8a72-4b7ac8565343-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d2227c9b-c473-4ec5-8a72-4b7ac8565343" (UID: "d2227c9b-c473-4ec5-8a72-4b7ac8565343"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.298941 5025 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2227c9b-c473-4ec5-8a72-4b7ac8565343-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.298997 5025 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2227c9b-c473-4ec5-8a72-4b7ac8565343-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.314144 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2227c9b-c473-4ec5-8a72-4b7ac8565343-kube-api-access-c9x4g" (OuterVolumeSpecName: "kube-api-access-c9x4g") pod "d2227c9b-c473-4ec5-8a72-4b7ac8565343" (UID: "d2227c9b-c473-4ec5-8a72-4b7ac8565343"). InnerVolumeSpecName "kube-api-access-c9x4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.316865 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2227c9b-c473-4ec5-8a72-4b7ac8565343-scripts" (OuterVolumeSpecName: "scripts") pod "d2227c9b-c473-4ec5-8a72-4b7ac8565343" (UID: "d2227c9b-c473-4ec5-8a72-4b7ac8565343"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.319418 5025 generic.go:334] "Generic (PLEG): container finished" podID="c4e34060-f0fe-4d2e-8892-4530baf597b7" containerID="c59ccdf0da5c1ef53c5b930c654f443ffdf7143d032306c99a4d08861433c863" exitCode=0 Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.319489 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7d54-account-create-5kmps" event={"ID":"c4e34060-f0fe-4d2e-8892-4530baf597b7","Type":"ContainerDied","Data":"c59ccdf0da5c1ef53c5b930c654f443ffdf7143d032306c99a4d08861433c863"} Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.323040 5025 generic.go:334] "Generic (PLEG): container finished" podID="d2227c9b-c473-4ec5-8a72-4b7ac8565343" containerID="161a262094678223fa581c27f35091528f0e19b75605ac841328f280aac7d7f7" exitCode=0 Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.323101 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2227c9b-c473-4ec5-8a72-4b7ac8565343","Type":"ContainerDied","Data":"161a262094678223fa581c27f35091528f0e19b75605ac841328f280aac7d7f7"} Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.323125 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2227c9b-c473-4ec5-8a72-4b7ac8565343","Type":"ContainerDied","Data":"31733b73281bbf4675d74d07b95693be7450096d64b9ea778238f077c12e5354"} Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.323141 5025 scope.go:117] "RemoveContainer" containerID="592099704ca6c91efa50a4015aacc69ad2d39665607049cb093598ff11692aba" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.323296 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.325235 5025 generic.go:334] "Generic (PLEG): container finished" podID="e8aff795-d588-41a3-8921-44f29b96c5f1" containerID="da198bf061ed5643789e78bdd43ca4f1d884cca12b92765cf7109daf8ef661fa" exitCode=0 Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.325281 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-50ed-account-create-dhzfq" event={"ID":"e8aff795-d588-41a3-8921-44f29b96c5f1","Type":"ContainerDied","Data":"da198bf061ed5643789e78bdd43ca4f1d884cca12b92765cf7109daf8ef661fa"} Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.326695 5025 generic.go:334] "Generic (PLEG): container finished" podID="25e6f419-f7bd-44d0-a377-488d155d7adc" containerID="6ad308ba6fec9a024cb9472b49f339f44bc5aac28f293abde8c8577bb792d9aa" exitCode=0 Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.326828 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-696a-account-create-ht4mf" event={"ID":"25e6f419-f7bd-44d0-a377-488d155d7adc","Type":"ContainerDied","Data":"6ad308ba6fec9a024cb9472b49f339f44bc5aac28f293abde8c8577bb792d9aa"} Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.352180 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2227c9b-c473-4ec5-8a72-4b7ac8565343-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d2227c9b-c473-4ec5-8a72-4b7ac8565343" (UID: "d2227c9b-c473-4ec5-8a72-4b7ac8565343"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.398456 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2227c9b-c473-4ec5-8a72-4b7ac8565343-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d2227c9b-c473-4ec5-8a72-4b7ac8565343" (UID: "d2227c9b-c473-4ec5-8a72-4b7ac8565343"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.400446 5025 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2227c9b-c473-4ec5-8a72-4b7ac8565343-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.400470 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9x4g\" (UniqueName: \"kubernetes.io/projected/d2227c9b-c473-4ec5-8a72-4b7ac8565343-kube-api-access-c9x4g\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.400480 5025 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2227c9b-c473-4ec5-8a72-4b7ac8565343-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.400489 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2227c9b-c473-4ec5-8a72-4b7ac8565343-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.418124 5025 scope.go:117] "RemoveContainer" containerID="f8dc9e42cafacab892f08ac4e5b5f4287158b736aac9a42b6c61698b776e4e82" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.437641 5025 scope.go:117] "RemoveContainer" containerID="0a983a3769aa7b400cb8ac81f8c014ad5bf07fdee8e67c576f7f037d633006f5" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.445953 
5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2227c9b-c473-4ec5-8a72-4b7ac8565343-config-data" (OuterVolumeSpecName: "config-data") pod "d2227c9b-c473-4ec5-8a72-4b7ac8565343" (UID: "d2227c9b-c473-4ec5-8a72-4b7ac8565343"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.459435 5025 scope.go:117] "RemoveContainer" containerID="161a262094678223fa581c27f35091528f0e19b75605ac841328f280aac7d7f7" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.480117 5025 scope.go:117] "RemoveContainer" containerID="592099704ca6c91efa50a4015aacc69ad2d39665607049cb093598ff11692aba" Oct 07 08:35:55 crc kubenswrapper[5025]: E1007 08:35:55.480680 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"592099704ca6c91efa50a4015aacc69ad2d39665607049cb093598ff11692aba\": container with ID starting with 592099704ca6c91efa50a4015aacc69ad2d39665607049cb093598ff11692aba not found: ID does not exist" containerID="592099704ca6c91efa50a4015aacc69ad2d39665607049cb093598ff11692aba" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.480723 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"592099704ca6c91efa50a4015aacc69ad2d39665607049cb093598ff11692aba"} err="failed to get container status \"592099704ca6c91efa50a4015aacc69ad2d39665607049cb093598ff11692aba\": rpc error: code = NotFound desc = could not find container \"592099704ca6c91efa50a4015aacc69ad2d39665607049cb093598ff11692aba\": container with ID starting with 592099704ca6c91efa50a4015aacc69ad2d39665607049cb093598ff11692aba not found: ID does not exist" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.480751 5025 scope.go:117] "RemoveContainer" containerID="f8dc9e42cafacab892f08ac4e5b5f4287158b736aac9a42b6c61698b776e4e82" Oct 07 08:35:55 crc kubenswrapper[5025]: 
E1007 08:35:55.481075 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8dc9e42cafacab892f08ac4e5b5f4287158b736aac9a42b6c61698b776e4e82\": container with ID starting with f8dc9e42cafacab892f08ac4e5b5f4287158b736aac9a42b6c61698b776e4e82 not found: ID does not exist" containerID="f8dc9e42cafacab892f08ac4e5b5f4287158b736aac9a42b6c61698b776e4e82" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.481095 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8dc9e42cafacab892f08ac4e5b5f4287158b736aac9a42b6c61698b776e4e82"} err="failed to get container status \"f8dc9e42cafacab892f08ac4e5b5f4287158b736aac9a42b6c61698b776e4e82\": rpc error: code = NotFound desc = could not find container \"f8dc9e42cafacab892f08ac4e5b5f4287158b736aac9a42b6c61698b776e4e82\": container with ID starting with f8dc9e42cafacab892f08ac4e5b5f4287158b736aac9a42b6c61698b776e4e82 not found: ID does not exist" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.481108 5025 scope.go:117] "RemoveContainer" containerID="0a983a3769aa7b400cb8ac81f8c014ad5bf07fdee8e67c576f7f037d633006f5" Oct 07 08:35:55 crc kubenswrapper[5025]: E1007 08:35:55.481348 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a983a3769aa7b400cb8ac81f8c014ad5bf07fdee8e67c576f7f037d633006f5\": container with ID starting with 0a983a3769aa7b400cb8ac81f8c014ad5bf07fdee8e67c576f7f037d633006f5 not found: ID does not exist" containerID="0a983a3769aa7b400cb8ac81f8c014ad5bf07fdee8e67c576f7f037d633006f5" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.481361 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a983a3769aa7b400cb8ac81f8c014ad5bf07fdee8e67c576f7f037d633006f5"} err="failed to get container status \"0a983a3769aa7b400cb8ac81f8c014ad5bf07fdee8e67c576f7f037d633006f5\": 
rpc error: code = NotFound desc = could not find container \"0a983a3769aa7b400cb8ac81f8c014ad5bf07fdee8e67c576f7f037d633006f5\": container with ID starting with 0a983a3769aa7b400cb8ac81f8c014ad5bf07fdee8e67c576f7f037d633006f5 not found: ID does not exist" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.481375 5025 scope.go:117] "RemoveContainer" containerID="161a262094678223fa581c27f35091528f0e19b75605ac841328f280aac7d7f7" Oct 07 08:35:55 crc kubenswrapper[5025]: E1007 08:35:55.481594 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"161a262094678223fa581c27f35091528f0e19b75605ac841328f280aac7d7f7\": container with ID starting with 161a262094678223fa581c27f35091528f0e19b75605ac841328f280aac7d7f7 not found: ID does not exist" containerID="161a262094678223fa581c27f35091528f0e19b75605ac841328f280aac7d7f7" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.481613 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"161a262094678223fa581c27f35091528f0e19b75605ac841328f280aac7d7f7"} err="failed to get container status \"161a262094678223fa581c27f35091528f0e19b75605ac841328f280aac7d7f7\": rpc error: code = NotFound desc = could not find container \"161a262094678223fa581c27f35091528f0e19b75605ac841328f280aac7d7f7\": container with ID starting with 161a262094678223fa581c27f35091528f0e19b75605ac841328f280aac7d7f7 not found: ID does not exist" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.502460 5025 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2227c9b-c473-4ec5-8a72-4b7ac8565343-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.657148 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.670955 5025 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.678823 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 07 08:35:55 crc kubenswrapper[5025]: E1007 08:35:55.679514 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2227c9b-c473-4ec5-8a72-4b7ac8565343" containerName="proxy-httpd" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.679532 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2227c9b-c473-4ec5-8a72-4b7ac8565343" containerName="proxy-httpd" Oct 07 08:35:55 crc kubenswrapper[5025]: E1007 08:35:55.679564 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2227c9b-c473-4ec5-8a72-4b7ac8565343" containerName="sg-core" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.679570 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2227c9b-c473-4ec5-8a72-4b7ac8565343" containerName="sg-core" Oct 07 08:35:55 crc kubenswrapper[5025]: E1007 08:35:55.679586 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2227c9b-c473-4ec5-8a72-4b7ac8565343" containerName="ceilometer-notification-agent" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.679592 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2227c9b-c473-4ec5-8a72-4b7ac8565343" containerName="ceilometer-notification-agent" Oct 07 08:35:55 crc kubenswrapper[5025]: E1007 08:35:55.679601 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2227c9b-c473-4ec5-8a72-4b7ac8565343" containerName="ceilometer-central-agent" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.679606 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2227c9b-c473-4ec5-8a72-4b7ac8565343" containerName="ceilometer-central-agent" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.679816 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2227c9b-c473-4ec5-8a72-4b7ac8565343" 
containerName="sg-core" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.679830 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2227c9b-c473-4ec5-8a72-4b7ac8565343" containerName="ceilometer-notification-agent" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.679837 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2227c9b-c473-4ec5-8a72-4b7ac8565343" containerName="proxy-httpd" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.679845 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2227c9b-c473-4ec5-8a72-4b7ac8565343" containerName="ceilometer-central-agent" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.681415 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.684142 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.684398 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.696732 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.809583 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ef936a4e-e7ee-453f-b43c-0809b4e889d8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ef936a4e-e7ee-453f-b43c-0809b4e889d8\") " pod="openstack/ceilometer-0" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.809659 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef936a4e-e7ee-453f-b43c-0809b4e889d8-scripts\") pod \"ceilometer-0\" (UID: 
\"ef936a4e-e7ee-453f-b43c-0809b4e889d8\") " pod="openstack/ceilometer-0" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.809699 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef936a4e-e7ee-453f-b43c-0809b4e889d8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ef936a4e-e7ee-453f-b43c-0809b4e889d8\") " pod="openstack/ceilometer-0" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.809857 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef936a4e-e7ee-453f-b43c-0809b4e889d8-run-httpd\") pod \"ceilometer-0\" (UID: \"ef936a4e-e7ee-453f-b43c-0809b4e889d8\") " pod="openstack/ceilometer-0" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.809928 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef936a4e-e7ee-453f-b43c-0809b4e889d8-log-httpd\") pod \"ceilometer-0\" (UID: \"ef936a4e-e7ee-453f-b43c-0809b4e889d8\") " pod="openstack/ceilometer-0" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.810016 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef936a4e-e7ee-453f-b43c-0809b4e889d8-config-data\") pod \"ceilometer-0\" (UID: \"ef936a4e-e7ee-453f-b43c-0809b4e889d8\") " pod="openstack/ceilometer-0" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.810071 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9rmh\" (UniqueName: \"kubernetes.io/projected/ef936a4e-e7ee-453f-b43c-0809b4e889d8-kube-api-access-h9rmh\") pod \"ceilometer-0\" (UID: \"ef936a4e-e7ee-453f-b43c-0809b4e889d8\") " pod="openstack/ceilometer-0" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 
08:35:55.912096 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef936a4e-e7ee-453f-b43c-0809b4e889d8-scripts\") pod \"ceilometer-0\" (UID: \"ef936a4e-e7ee-453f-b43c-0809b4e889d8\") " pod="openstack/ceilometer-0" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.912443 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef936a4e-e7ee-453f-b43c-0809b4e889d8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ef936a4e-e7ee-453f-b43c-0809b4e889d8\") " pod="openstack/ceilometer-0" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.912570 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef936a4e-e7ee-453f-b43c-0809b4e889d8-run-httpd\") pod \"ceilometer-0\" (UID: \"ef936a4e-e7ee-453f-b43c-0809b4e889d8\") " pod="openstack/ceilometer-0" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.912699 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef936a4e-e7ee-453f-b43c-0809b4e889d8-log-httpd\") pod \"ceilometer-0\" (UID: \"ef936a4e-e7ee-453f-b43c-0809b4e889d8\") " pod="openstack/ceilometer-0" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.912831 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef936a4e-e7ee-453f-b43c-0809b4e889d8-config-data\") pod \"ceilometer-0\" (UID: \"ef936a4e-e7ee-453f-b43c-0809b4e889d8\") " pod="openstack/ceilometer-0" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.912994 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9rmh\" (UniqueName: \"kubernetes.io/projected/ef936a4e-e7ee-453f-b43c-0809b4e889d8-kube-api-access-h9rmh\") pod \"ceilometer-0\" (UID: 
\"ef936a4e-e7ee-453f-b43c-0809b4e889d8\") " pod="openstack/ceilometer-0" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.913174 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ef936a4e-e7ee-453f-b43c-0809b4e889d8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ef936a4e-e7ee-453f-b43c-0809b4e889d8\") " pod="openstack/ceilometer-0" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.913310 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef936a4e-e7ee-453f-b43c-0809b4e889d8-run-httpd\") pod \"ceilometer-0\" (UID: \"ef936a4e-e7ee-453f-b43c-0809b4e889d8\") " pod="openstack/ceilometer-0" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.913816 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef936a4e-e7ee-453f-b43c-0809b4e889d8-log-httpd\") pod \"ceilometer-0\" (UID: \"ef936a4e-e7ee-453f-b43c-0809b4e889d8\") " pod="openstack/ceilometer-0" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.915999 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef936a4e-e7ee-453f-b43c-0809b4e889d8-scripts\") pod \"ceilometer-0\" (UID: \"ef936a4e-e7ee-453f-b43c-0809b4e889d8\") " pod="openstack/ceilometer-0" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.916389 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ef936a4e-e7ee-453f-b43c-0809b4e889d8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ef936a4e-e7ee-453f-b43c-0809b4e889d8\") " pod="openstack/ceilometer-0" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.923088 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ef936a4e-e7ee-453f-b43c-0809b4e889d8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ef936a4e-e7ee-453f-b43c-0809b4e889d8\") " pod="openstack/ceilometer-0" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.923735 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef936a4e-e7ee-453f-b43c-0809b4e889d8-config-data\") pod \"ceilometer-0\" (UID: \"ef936a4e-e7ee-453f-b43c-0809b4e889d8\") " pod="openstack/ceilometer-0" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.931727 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2227c9b-c473-4ec5-8a72-4b7ac8565343" path="/var/lib/kubelet/pods/d2227c9b-c473-4ec5-8a72-4b7ac8565343/volumes" Oct 07 08:35:55 crc kubenswrapper[5025]: I1007 08:35:55.934936 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9rmh\" (UniqueName: \"kubernetes.io/projected/ef936a4e-e7ee-453f-b43c-0809b4e889d8-kube-api-access-h9rmh\") pod \"ceilometer-0\" (UID: \"ef936a4e-e7ee-453f-b43c-0809b4e889d8\") " pod="openstack/ceilometer-0" Oct 07 08:35:56 crc kubenswrapper[5025]: I1007 08:35:56.002199 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 08:35:56 crc kubenswrapper[5025]: I1007 08:35:56.549720 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 08:35:56 crc kubenswrapper[5025]: I1007 08:35:56.758482 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-50ed-account-create-dhzfq" Oct 07 08:35:56 crc kubenswrapper[5025]: I1007 08:35:56.931659 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmg4c\" (UniqueName: \"kubernetes.io/projected/e8aff795-d588-41a3-8921-44f29b96c5f1-kube-api-access-mmg4c\") pod \"e8aff795-d588-41a3-8921-44f29b96c5f1\" (UID: \"e8aff795-d588-41a3-8921-44f29b96c5f1\") " Oct 07 08:35:56 crc kubenswrapper[5025]: I1007 08:35:56.937618 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8aff795-d588-41a3-8921-44f29b96c5f1-kube-api-access-mmg4c" (OuterVolumeSpecName: "kube-api-access-mmg4c") pod "e8aff795-d588-41a3-8921-44f29b96c5f1" (UID: "e8aff795-d588-41a3-8921-44f29b96c5f1"). InnerVolumeSpecName "kube-api-access-mmg4c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:35:57 crc kubenswrapper[5025]: I1007 08:35:57.031152 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-7d54-account-create-5kmps" Oct 07 08:35:57 crc kubenswrapper[5025]: I1007 08:35:57.034419 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmg4c\" (UniqueName: \"kubernetes.io/projected/e8aff795-d588-41a3-8921-44f29b96c5f1-kube-api-access-mmg4c\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:57 crc kubenswrapper[5025]: I1007 08:35:57.040017 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-696a-account-create-ht4mf" Oct 07 08:35:57 crc kubenswrapper[5025]: I1007 08:35:57.081325 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 07 08:35:57 crc kubenswrapper[5025]: I1007 08:35:57.135471 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4jzp\" (UniqueName: \"kubernetes.io/projected/c4e34060-f0fe-4d2e-8892-4530baf597b7-kube-api-access-r4jzp\") pod \"c4e34060-f0fe-4d2e-8892-4530baf597b7\" (UID: \"c4e34060-f0fe-4d2e-8892-4530baf597b7\") " Oct 07 08:35:57 crc kubenswrapper[5025]: I1007 08:35:57.135757 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjk6t\" (UniqueName: \"kubernetes.io/projected/25e6f419-f7bd-44d0-a377-488d155d7adc-kube-api-access-xjk6t\") pod \"25e6f419-f7bd-44d0-a377-488d155d7adc\" (UID: \"25e6f419-f7bd-44d0-a377-488d155d7adc\") " Oct 07 08:35:57 crc kubenswrapper[5025]: I1007 08:35:57.140463 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e6f419-f7bd-44d0-a377-488d155d7adc-kube-api-access-xjk6t" (OuterVolumeSpecName: "kube-api-access-xjk6t") pod "25e6f419-f7bd-44d0-a377-488d155d7adc" (UID: "25e6f419-f7bd-44d0-a377-488d155d7adc"). InnerVolumeSpecName "kube-api-access-xjk6t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:35:57 crc kubenswrapper[5025]: I1007 08:35:57.140753 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4e34060-f0fe-4d2e-8892-4530baf597b7-kube-api-access-r4jzp" (OuterVolumeSpecName: "kube-api-access-r4jzp") pod "c4e34060-f0fe-4d2e-8892-4530baf597b7" (UID: "c4e34060-f0fe-4d2e-8892-4530baf597b7"). InnerVolumeSpecName "kube-api-access-r4jzp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:35:57 crc kubenswrapper[5025]: I1007 08:35:57.238073 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjk6t\" (UniqueName: \"kubernetes.io/projected/25e6f419-f7bd-44d0-a377-488d155d7adc-kube-api-access-xjk6t\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:57 crc kubenswrapper[5025]: I1007 08:35:57.238120 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4jzp\" (UniqueName: \"kubernetes.io/projected/c4e34060-f0fe-4d2e-8892-4530baf597b7-kube-api-access-r4jzp\") on node \"crc\" DevicePath \"\"" Oct 07 08:35:57 crc kubenswrapper[5025]: I1007 08:35:57.346104 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-696a-account-create-ht4mf" event={"ID":"25e6f419-f7bd-44d0-a377-488d155d7adc","Type":"ContainerDied","Data":"987fed2e2d33359b88022139a0f3601e1c2c261fcf53d6da4352c998b1235c43"} Oct 07 08:35:57 crc kubenswrapper[5025]: I1007 08:35:57.346128 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-696a-account-create-ht4mf" Oct 07 08:35:57 crc kubenswrapper[5025]: I1007 08:35:57.346143 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="987fed2e2d33359b88022139a0f3601e1c2c261fcf53d6da4352c998b1235c43" Oct 07 08:35:57 crc kubenswrapper[5025]: I1007 08:35:57.347864 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7d54-account-create-5kmps" event={"ID":"c4e34060-f0fe-4d2e-8892-4530baf597b7","Type":"ContainerDied","Data":"78bcfd7a034f0846f273ad095046ce1ede5d1a809ca7bf699b05e15ff1163f95"} Oct 07 08:35:57 crc kubenswrapper[5025]: I1007 08:35:57.347960 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78bcfd7a034f0846f273ad095046ce1ede5d1a809ca7bf699b05e15ff1163f95" Oct 07 08:35:57 crc kubenswrapper[5025]: I1007 08:35:57.348065 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-7d54-account-create-5kmps" Oct 07 08:35:57 crc kubenswrapper[5025]: I1007 08:35:57.358087 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef936a4e-e7ee-453f-b43c-0809b4e889d8","Type":"ContainerStarted","Data":"aa8b92792d06d854e5ad3e17a61b2acebee4a79fe84fa24ddeb3e08e4942fe2d"} Oct 07 08:35:57 crc kubenswrapper[5025]: I1007 08:35:57.360038 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-50ed-account-create-dhzfq" event={"ID":"e8aff795-d588-41a3-8921-44f29b96c5f1","Type":"ContainerDied","Data":"9a5de0bc6f1527bba892309864058fd1c43817b1d9562c43b767b6c6cb7eb258"} Oct 07 08:35:57 crc kubenswrapper[5025]: I1007 08:35:57.360145 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a5de0bc6f1527bba892309864058fd1c43817b1d9562c43b767b6c6cb7eb258" Oct 07 08:35:57 crc kubenswrapper[5025]: I1007 08:35:57.360099 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-50ed-account-create-dhzfq" Oct 07 08:35:58 crc kubenswrapper[5025]: I1007 08:35:58.405798 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef936a4e-e7ee-453f-b43c-0809b4e889d8","Type":"ContainerStarted","Data":"5d885794cd353d25817c153ec224528afacb93bec436c9aebd5976c6a919b5c4"} Oct 07 08:35:59 crc kubenswrapper[5025]: I1007 08:35:59.424033 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef936a4e-e7ee-453f-b43c-0809b4e889d8","Type":"ContainerStarted","Data":"8d03535c72626f533466d86a3013928ce253532be1772838636f7cd97a82038c"} Oct 07 08:36:00 crc kubenswrapper[5025]: I1007 08:36:00.433568 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef936a4e-e7ee-453f-b43c-0809b4e889d8","Type":"ContainerStarted","Data":"c00cdd17244bcb271bcc03db5c452c6fc1f7a1f1ac3a08a0a7e990f3fa6a8b55"} Oct 07 08:36:01 crc kubenswrapper[5025]: I1007 08:36:01.433637 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 08:36:01 crc kubenswrapper[5025]: I1007 08:36:01.445985 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef936a4e-e7ee-453f-b43c-0809b4e889d8","Type":"ContainerStarted","Data":"8f07a70f91537e7c942bb5620dcd056cc533260d0d215e66f45458c7b3be084e"} Oct 07 08:36:01 crc kubenswrapper[5025]: I1007 08:36:01.446334 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 07 08:36:01 crc kubenswrapper[5025]: I1007 08:36:01.473171 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.051395749 podStartE2EDuration="6.473146531s" podCreationTimestamp="2025-10-07 08:35:55 +0000 UTC" firstStartedPulling="2025-10-07 08:35:56.572380407 +0000 UTC m=+1163.381694551" lastFinishedPulling="2025-10-07 08:36:00.994131189 
+0000 UTC m=+1167.803445333" observedRunningTime="2025-10-07 08:36:01.465765899 +0000 UTC m=+1168.275080043" watchObservedRunningTime="2025-10-07 08:36:01.473146531 +0000 UTC m=+1168.282460675" Oct 07 08:36:02 crc kubenswrapper[5025]: I1007 08:36:02.475119 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ef936a4e-e7ee-453f-b43c-0809b4e889d8" containerName="ceilometer-central-agent" containerID="cri-o://5d885794cd353d25817c153ec224528afacb93bec436c9aebd5976c6a919b5c4" gracePeriod=30 Oct 07 08:36:02 crc kubenswrapper[5025]: I1007 08:36:02.475591 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ef936a4e-e7ee-453f-b43c-0809b4e889d8" containerName="proxy-httpd" containerID="cri-o://8f07a70f91537e7c942bb5620dcd056cc533260d0d215e66f45458c7b3be084e" gracePeriod=30 Oct 07 08:36:02 crc kubenswrapper[5025]: I1007 08:36:02.475639 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ef936a4e-e7ee-453f-b43c-0809b4e889d8" containerName="sg-core" containerID="cri-o://c00cdd17244bcb271bcc03db5c452c6fc1f7a1f1ac3a08a0a7e990f3fa6a8b55" gracePeriod=30 Oct 07 08:36:02 crc kubenswrapper[5025]: I1007 08:36:02.475674 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ef936a4e-e7ee-453f-b43c-0809b4e889d8" containerName="ceilometer-notification-agent" containerID="cri-o://8d03535c72626f533466d86a3013928ce253532be1772838636f7cd97a82038c" gracePeriod=30 Oct 07 08:36:02 crc kubenswrapper[5025]: I1007 08:36:02.655494 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-pw82v"] Oct 07 08:36:02 crc kubenswrapper[5025]: E1007 08:36:02.656139 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4e34060-f0fe-4d2e-8892-4530baf597b7" containerName="mariadb-account-create" Oct 07 08:36:02 crc 
kubenswrapper[5025]: I1007 08:36:02.656157 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4e34060-f0fe-4d2e-8892-4530baf597b7" containerName="mariadb-account-create" Oct 07 08:36:02 crc kubenswrapper[5025]: E1007 08:36:02.656171 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25e6f419-f7bd-44d0-a377-488d155d7adc" containerName="mariadb-account-create" Oct 07 08:36:02 crc kubenswrapper[5025]: I1007 08:36:02.656179 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="25e6f419-f7bd-44d0-a377-488d155d7adc" containerName="mariadb-account-create" Oct 07 08:36:02 crc kubenswrapper[5025]: E1007 08:36:02.656198 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8aff795-d588-41a3-8921-44f29b96c5f1" containerName="mariadb-account-create" Oct 07 08:36:02 crc kubenswrapper[5025]: I1007 08:36:02.656203 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8aff795-d588-41a3-8921-44f29b96c5f1" containerName="mariadb-account-create" Oct 07 08:36:02 crc kubenswrapper[5025]: I1007 08:36:02.656496 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4e34060-f0fe-4d2e-8892-4530baf597b7" containerName="mariadb-account-create" Oct 07 08:36:02 crc kubenswrapper[5025]: I1007 08:36:02.656513 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="25e6f419-f7bd-44d0-a377-488d155d7adc" containerName="mariadb-account-create" Oct 07 08:36:02 crc kubenswrapper[5025]: I1007 08:36:02.656529 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8aff795-d588-41a3-8921-44f29b96c5f1" containerName="mariadb-account-create" Oct 07 08:36:02 crc kubenswrapper[5025]: I1007 08:36:02.657136 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-pw82v" Oct 07 08:36:02 crc kubenswrapper[5025]: I1007 08:36:02.660136 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 07 08:36:02 crc kubenswrapper[5025]: I1007 08:36:02.660240 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 07 08:36:02 crc kubenswrapper[5025]: I1007 08:36:02.660369 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-v44f9" Oct 07 08:36:02 crc kubenswrapper[5025]: I1007 08:36:02.668524 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-pw82v"] Oct 07 08:36:02 crc kubenswrapper[5025]: I1007 08:36:02.732731 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a910c5fd-a686-4b9f-b1bd-6cdcf4641013-config-data\") pod \"nova-cell0-conductor-db-sync-pw82v\" (UID: \"a910c5fd-a686-4b9f-b1bd-6cdcf4641013\") " pod="openstack/nova-cell0-conductor-db-sync-pw82v" Oct 07 08:36:02 crc kubenswrapper[5025]: I1007 08:36:02.732855 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkl5l\" (UniqueName: \"kubernetes.io/projected/a910c5fd-a686-4b9f-b1bd-6cdcf4641013-kube-api-access-pkl5l\") pod \"nova-cell0-conductor-db-sync-pw82v\" (UID: \"a910c5fd-a686-4b9f-b1bd-6cdcf4641013\") " pod="openstack/nova-cell0-conductor-db-sync-pw82v" Oct 07 08:36:02 crc kubenswrapper[5025]: I1007 08:36:02.733028 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a910c5fd-a686-4b9f-b1bd-6cdcf4641013-scripts\") pod \"nova-cell0-conductor-db-sync-pw82v\" (UID: \"a910c5fd-a686-4b9f-b1bd-6cdcf4641013\") " 
pod="openstack/nova-cell0-conductor-db-sync-pw82v" Oct 07 08:36:02 crc kubenswrapper[5025]: I1007 08:36:02.733104 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a910c5fd-a686-4b9f-b1bd-6cdcf4641013-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-pw82v\" (UID: \"a910c5fd-a686-4b9f-b1bd-6cdcf4641013\") " pod="openstack/nova-cell0-conductor-db-sync-pw82v" Oct 07 08:36:02 crc kubenswrapper[5025]: I1007 08:36:02.835196 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a910c5fd-a686-4b9f-b1bd-6cdcf4641013-scripts\") pod \"nova-cell0-conductor-db-sync-pw82v\" (UID: \"a910c5fd-a686-4b9f-b1bd-6cdcf4641013\") " pod="openstack/nova-cell0-conductor-db-sync-pw82v" Oct 07 08:36:02 crc kubenswrapper[5025]: I1007 08:36:02.835257 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a910c5fd-a686-4b9f-b1bd-6cdcf4641013-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-pw82v\" (UID: \"a910c5fd-a686-4b9f-b1bd-6cdcf4641013\") " pod="openstack/nova-cell0-conductor-db-sync-pw82v" Oct 07 08:36:02 crc kubenswrapper[5025]: I1007 08:36:02.835345 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a910c5fd-a686-4b9f-b1bd-6cdcf4641013-config-data\") pod \"nova-cell0-conductor-db-sync-pw82v\" (UID: \"a910c5fd-a686-4b9f-b1bd-6cdcf4641013\") " pod="openstack/nova-cell0-conductor-db-sync-pw82v" Oct 07 08:36:02 crc kubenswrapper[5025]: I1007 08:36:02.835366 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkl5l\" (UniqueName: \"kubernetes.io/projected/a910c5fd-a686-4b9f-b1bd-6cdcf4641013-kube-api-access-pkl5l\") pod \"nova-cell0-conductor-db-sync-pw82v\" (UID: 
\"a910c5fd-a686-4b9f-b1bd-6cdcf4641013\") " pod="openstack/nova-cell0-conductor-db-sync-pw82v" Oct 07 08:36:02 crc kubenswrapper[5025]: I1007 08:36:02.841663 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a910c5fd-a686-4b9f-b1bd-6cdcf4641013-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-pw82v\" (UID: \"a910c5fd-a686-4b9f-b1bd-6cdcf4641013\") " pod="openstack/nova-cell0-conductor-db-sync-pw82v" Oct 07 08:36:02 crc kubenswrapper[5025]: I1007 08:36:02.841833 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a910c5fd-a686-4b9f-b1bd-6cdcf4641013-scripts\") pod \"nova-cell0-conductor-db-sync-pw82v\" (UID: \"a910c5fd-a686-4b9f-b1bd-6cdcf4641013\") " pod="openstack/nova-cell0-conductor-db-sync-pw82v" Oct 07 08:36:02 crc kubenswrapper[5025]: I1007 08:36:02.846101 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a910c5fd-a686-4b9f-b1bd-6cdcf4641013-config-data\") pod \"nova-cell0-conductor-db-sync-pw82v\" (UID: \"a910c5fd-a686-4b9f-b1bd-6cdcf4641013\") " pod="openstack/nova-cell0-conductor-db-sync-pw82v" Oct 07 08:36:02 crc kubenswrapper[5025]: I1007 08:36:02.852795 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkl5l\" (UniqueName: \"kubernetes.io/projected/a910c5fd-a686-4b9f-b1bd-6cdcf4641013-kube-api-access-pkl5l\") pod \"nova-cell0-conductor-db-sync-pw82v\" (UID: \"a910c5fd-a686-4b9f-b1bd-6cdcf4641013\") " pod="openstack/nova-cell0-conductor-db-sync-pw82v" Oct 07 08:36:02 crc kubenswrapper[5025]: I1007 08:36:02.978515 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-pw82v" Oct 07 08:36:03 crc kubenswrapper[5025]: I1007 08:36:03.480449 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-pw82v"] Oct 07 08:36:03 crc kubenswrapper[5025]: I1007 08:36:03.489423 5025 generic.go:334] "Generic (PLEG): container finished" podID="ef936a4e-e7ee-453f-b43c-0809b4e889d8" containerID="8f07a70f91537e7c942bb5620dcd056cc533260d0d215e66f45458c7b3be084e" exitCode=0 Oct 07 08:36:03 crc kubenswrapper[5025]: I1007 08:36:03.489448 5025 generic.go:334] "Generic (PLEG): container finished" podID="ef936a4e-e7ee-453f-b43c-0809b4e889d8" containerID="c00cdd17244bcb271bcc03db5c452c6fc1f7a1f1ac3a08a0a7e990f3fa6a8b55" exitCode=2 Oct 07 08:36:03 crc kubenswrapper[5025]: I1007 08:36:03.489459 5025 generic.go:334] "Generic (PLEG): container finished" podID="ef936a4e-e7ee-453f-b43c-0809b4e889d8" containerID="8d03535c72626f533466d86a3013928ce253532be1772838636f7cd97a82038c" exitCode=0 Oct 07 08:36:03 crc kubenswrapper[5025]: I1007 08:36:03.489468 5025 generic.go:334] "Generic (PLEG): container finished" podID="ef936a4e-e7ee-453f-b43c-0809b4e889d8" containerID="5d885794cd353d25817c153ec224528afacb93bec436c9aebd5976c6a919b5c4" exitCode=0 Oct 07 08:36:03 crc kubenswrapper[5025]: I1007 08:36:03.489491 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef936a4e-e7ee-453f-b43c-0809b4e889d8","Type":"ContainerDied","Data":"8f07a70f91537e7c942bb5620dcd056cc533260d0d215e66f45458c7b3be084e"} Oct 07 08:36:03 crc kubenswrapper[5025]: I1007 08:36:03.489520 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef936a4e-e7ee-453f-b43c-0809b4e889d8","Type":"ContainerDied","Data":"c00cdd17244bcb271bcc03db5c452c6fc1f7a1f1ac3a08a0a7e990f3fa6a8b55"} Oct 07 08:36:03 crc kubenswrapper[5025]: I1007 08:36:03.489532 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"ef936a4e-e7ee-453f-b43c-0809b4e889d8","Type":"ContainerDied","Data":"8d03535c72626f533466d86a3013928ce253532be1772838636f7cd97a82038c"} Oct 07 08:36:03 crc kubenswrapper[5025]: I1007 08:36:03.489558 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef936a4e-e7ee-453f-b43c-0809b4e889d8","Type":"ContainerDied","Data":"5d885794cd353d25817c153ec224528afacb93bec436c9aebd5976c6a919b5c4"} Oct 07 08:36:03 crc kubenswrapper[5025]: I1007 08:36:03.498018 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 08:36:03 crc kubenswrapper[5025]: I1007 08:36:03.650143 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9rmh\" (UniqueName: \"kubernetes.io/projected/ef936a4e-e7ee-453f-b43c-0809b4e889d8-kube-api-access-h9rmh\") pod \"ef936a4e-e7ee-453f-b43c-0809b4e889d8\" (UID: \"ef936a4e-e7ee-453f-b43c-0809b4e889d8\") " Oct 07 08:36:03 crc kubenswrapper[5025]: I1007 08:36:03.650200 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef936a4e-e7ee-453f-b43c-0809b4e889d8-config-data\") pod \"ef936a4e-e7ee-453f-b43c-0809b4e889d8\" (UID: \"ef936a4e-e7ee-453f-b43c-0809b4e889d8\") " Oct 07 08:36:03 crc kubenswrapper[5025]: I1007 08:36:03.650266 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef936a4e-e7ee-453f-b43c-0809b4e889d8-combined-ca-bundle\") pod \"ef936a4e-e7ee-453f-b43c-0809b4e889d8\" (UID: \"ef936a4e-e7ee-453f-b43c-0809b4e889d8\") " Oct 07 08:36:03 crc kubenswrapper[5025]: I1007 08:36:03.650351 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef936a4e-e7ee-453f-b43c-0809b4e889d8-log-httpd\") pod 
\"ef936a4e-e7ee-453f-b43c-0809b4e889d8\" (UID: \"ef936a4e-e7ee-453f-b43c-0809b4e889d8\") " Oct 07 08:36:03 crc kubenswrapper[5025]: I1007 08:36:03.650383 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef936a4e-e7ee-453f-b43c-0809b4e889d8-scripts\") pod \"ef936a4e-e7ee-453f-b43c-0809b4e889d8\" (UID: \"ef936a4e-e7ee-453f-b43c-0809b4e889d8\") " Oct 07 08:36:03 crc kubenswrapper[5025]: I1007 08:36:03.650409 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef936a4e-e7ee-453f-b43c-0809b4e889d8-run-httpd\") pod \"ef936a4e-e7ee-453f-b43c-0809b4e889d8\" (UID: \"ef936a4e-e7ee-453f-b43c-0809b4e889d8\") " Oct 07 08:36:03 crc kubenswrapper[5025]: I1007 08:36:03.650496 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ef936a4e-e7ee-453f-b43c-0809b4e889d8-sg-core-conf-yaml\") pod \"ef936a4e-e7ee-453f-b43c-0809b4e889d8\" (UID: \"ef936a4e-e7ee-453f-b43c-0809b4e889d8\") " Oct 07 08:36:03 crc kubenswrapper[5025]: I1007 08:36:03.651053 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef936a4e-e7ee-453f-b43c-0809b4e889d8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ef936a4e-e7ee-453f-b43c-0809b4e889d8" (UID: "ef936a4e-e7ee-453f-b43c-0809b4e889d8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:36:03 crc kubenswrapper[5025]: I1007 08:36:03.651164 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef936a4e-e7ee-453f-b43c-0809b4e889d8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ef936a4e-e7ee-453f-b43c-0809b4e889d8" (UID: "ef936a4e-e7ee-453f-b43c-0809b4e889d8"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:36:03 crc kubenswrapper[5025]: I1007 08:36:03.651610 5025 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef936a4e-e7ee-453f-b43c-0809b4e889d8-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:03 crc kubenswrapper[5025]: I1007 08:36:03.651630 5025 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef936a4e-e7ee-453f-b43c-0809b4e889d8-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:03 crc kubenswrapper[5025]: I1007 08:36:03.659807 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef936a4e-e7ee-453f-b43c-0809b4e889d8-kube-api-access-h9rmh" (OuterVolumeSpecName: "kube-api-access-h9rmh") pod "ef936a4e-e7ee-453f-b43c-0809b4e889d8" (UID: "ef936a4e-e7ee-453f-b43c-0809b4e889d8"). InnerVolumeSpecName "kube-api-access-h9rmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:36:03 crc kubenswrapper[5025]: I1007 08:36:03.671982 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef936a4e-e7ee-453f-b43c-0809b4e889d8-scripts" (OuterVolumeSpecName: "scripts") pod "ef936a4e-e7ee-453f-b43c-0809b4e889d8" (UID: "ef936a4e-e7ee-453f-b43c-0809b4e889d8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:36:03 crc kubenswrapper[5025]: I1007 08:36:03.689690 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef936a4e-e7ee-453f-b43c-0809b4e889d8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ef936a4e-e7ee-453f-b43c-0809b4e889d8" (UID: "ef936a4e-e7ee-453f-b43c-0809b4e889d8"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:36:03 crc kubenswrapper[5025]: I1007 08:36:03.726696 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef936a4e-e7ee-453f-b43c-0809b4e889d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ef936a4e-e7ee-453f-b43c-0809b4e889d8" (UID: "ef936a4e-e7ee-453f-b43c-0809b4e889d8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:36:03 crc kubenswrapper[5025]: I1007 08:36:03.757671 5025 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ef936a4e-e7ee-453f-b43c-0809b4e889d8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:03 crc kubenswrapper[5025]: I1007 08:36:03.757896 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9rmh\" (UniqueName: \"kubernetes.io/projected/ef936a4e-e7ee-453f-b43c-0809b4e889d8-kube-api-access-h9rmh\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:03 crc kubenswrapper[5025]: I1007 08:36:03.757970 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef936a4e-e7ee-453f-b43c-0809b4e889d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:03 crc kubenswrapper[5025]: I1007 08:36:03.758038 5025 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef936a4e-e7ee-453f-b43c-0809b4e889d8-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:03 crc kubenswrapper[5025]: I1007 08:36:03.822665 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef936a4e-e7ee-453f-b43c-0809b4e889d8-config-data" (OuterVolumeSpecName: "config-data") pod "ef936a4e-e7ee-453f-b43c-0809b4e889d8" (UID: "ef936a4e-e7ee-453f-b43c-0809b4e889d8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:36:03 crc kubenswrapper[5025]: I1007 08:36:03.860627 5025 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef936a4e-e7ee-453f-b43c-0809b4e889d8-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:04 crc kubenswrapper[5025]: I1007 08:36:04.064888 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 08:36:04 crc kubenswrapper[5025]: I1007 08:36:04.065133 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="3de05e1a-7b03-4e41-9c47-4a2d135b3879" containerName="glance-log" containerID="cri-o://381e8f7be4247a42fa5605b246ada82812656fd0901e3a380d230417aae73fd3" gracePeriod=30 Oct 07 08:36:04 crc kubenswrapper[5025]: I1007 08:36:04.065181 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="3de05e1a-7b03-4e41-9c47-4a2d135b3879" containerName="glance-httpd" containerID="cri-o://ac9190177a046f25b19b1e2b7c4d077ec8a723e5d8034bb239d27ef5b5cdbc8f" gracePeriod=30 Oct 07 08:36:04 crc kubenswrapper[5025]: I1007 08:36:04.504686 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-pw82v" event={"ID":"a910c5fd-a686-4b9f-b1bd-6cdcf4641013","Type":"ContainerStarted","Data":"419ac94e7f7340b9288d5527e55ba06bcbf1cb3d500a77026ce5a9f1bd807878"} Oct 07 08:36:04 crc kubenswrapper[5025]: I1007 08:36:04.508231 5025 generic.go:334] "Generic (PLEG): container finished" podID="3de05e1a-7b03-4e41-9c47-4a2d135b3879" containerID="381e8f7be4247a42fa5605b246ada82812656fd0901e3a380d230417aae73fd3" exitCode=143 Oct 07 08:36:04 crc kubenswrapper[5025]: I1007 08:36:04.508315 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"3de05e1a-7b03-4e41-9c47-4a2d135b3879","Type":"ContainerDied","Data":"381e8f7be4247a42fa5605b246ada82812656fd0901e3a380d230417aae73fd3"} Oct 07 08:36:04 crc kubenswrapper[5025]: I1007 08:36:04.515311 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef936a4e-e7ee-453f-b43c-0809b4e889d8","Type":"ContainerDied","Data":"aa8b92792d06d854e5ad3e17a61b2acebee4a79fe84fa24ddeb3e08e4942fe2d"} Oct 07 08:36:04 crc kubenswrapper[5025]: I1007 08:36:04.515369 5025 scope.go:117] "RemoveContainer" containerID="8f07a70f91537e7c942bb5620dcd056cc533260d0d215e66f45458c7b3be084e" Oct 07 08:36:04 crc kubenswrapper[5025]: I1007 08:36:04.515497 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 08:36:04 crc kubenswrapper[5025]: I1007 08:36:04.548466 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 08:36:04 crc kubenswrapper[5025]: I1007 08:36:04.563332 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 07 08:36:04 crc kubenswrapper[5025]: I1007 08:36:04.565086 5025 scope.go:117] "RemoveContainer" containerID="c00cdd17244bcb271bcc03db5c452c6fc1f7a1f1ac3a08a0a7e990f3fa6a8b55" Oct 07 08:36:04 crc kubenswrapper[5025]: I1007 08:36:04.578270 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 07 08:36:04 crc kubenswrapper[5025]: E1007 08:36:04.578746 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef936a4e-e7ee-453f-b43c-0809b4e889d8" containerName="ceilometer-notification-agent" Oct 07 08:36:04 crc kubenswrapper[5025]: I1007 08:36:04.578761 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef936a4e-e7ee-453f-b43c-0809b4e889d8" containerName="ceilometer-notification-agent" Oct 07 08:36:04 crc kubenswrapper[5025]: E1007 08:36:04.578777 5025 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ef936a4e-e7ee-453f-b43c-0809b4e889d8" containerName="proxy-httpd" Oct 07 08:36:04 crc kubenswrapper[5025]: I1007 08:36:04.578784 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef936a4e-e7ee-453f-b43c-0809b4e889d8" containerName="proxy-httpd" Oct 07 08:36:04 crc kubenswrapper[5025]: E1007 08:36:04.578813 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef936a4e-e7ee-453f-b43c-0809b4e889d8" containerName="sg-core" Oct 07 08:36:04 crc kubenswrapper[5025]: I1007 08:36:04.578821 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef936a4e-e7ee-453f-b43c-0809b4e889d8" containerName="sg-core" Oct 07 08:36:04 crc kubenswrapper[5025]: E1007 08:36:04.578839 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef936a4e-e7ee-453f-b43c-0809b4e889d8" containerName="ceilometer-central-agent" Oct 07 08:36:04 crc kubenswrapper[5025]: I1007 08:36:04.578846 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef936a4e-e7ee-453f-b43c-0809b4e889d8" containerName="ceilometer-central-agent" Oct 07 08:36:04 crc kubenswrapper[5025]: I1007 08:36:04.579146 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef936a4e-e7ee-453f-b43c-0809b4e889d8" containerName="ceilometer-notification-agent" Oct 07 08:36:04 crc kubenswrapper[5025]: I1007 08:36:04.579168 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef936a4e-e7ee-453f-b43c-0809b4e889d8" containerName="sg-core" Oct 07 08:36:04 crc kubenswrapper[5025]: I1007 08:36:04.579189 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef936a4e-e7ee-453f-b43c-0809b4e889d8" containerName="ceilometer-central-agent" Oct 07 08:36:04 crc kubenswrapper[5025]: I1007 08:36:04.579205 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef936a4e-e7ee-453f-b43c-0809b4e889d8" containerName="proxy-httpd" Oct 07 08:36:04 crc kubenswrapper[5025]: I1007 08:36:04.581202 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 08:36:04 crc kubenswrapper[5025]: I1007 08:36:04.583373 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 07 08:36:04 crc kubenswrapper[5025]: I1007 08:36:04.583735 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 07 08:36:04 crc kubenswrapper[5025]: I1007 08:36:04.590709 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 08:36:04 crc kubenswrapper[5025]: I1007 08:36:04.595231 5025 scope.go:117] "RemoveContainer" containerID="8d03535c72626f533466d86a3013928ce253532be1772838636f7cd97a82038c" Oct 07 08:36:04 crc kubenswrapper[5025]: I1007 08:36:04.665983 5025 scope.go:117] "RemoveContainer" containerID="5d885794cd353d25817c153ec224528afacb93bec436c9aebd5976c6a919b5c4" Oct 07 08:36:04 crc kubenswrapper[5025]: I1007 08:36:04.672832 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2290c61-7eaf-4296-9b09-6ba10e01e7b3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d2290c61-7eaf-4296-9b09-6ba10e01e7b3\") " pod="openstack/ceilometer-0" Oct 07 08:36:04 crc kubenswrapper[5025]: I1007 08:36:04.672884 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2290c61-7eaf-4296-9b09-6ba10e01e7b3-log-httpd\") pod \"ceilometer-0\" (UID: \"d2290c61-7eaf-4296-9b09-6ba10e01e7b3\") " pod="openstack/ceilometer-0" Oct 07 08:36:04 crc kubenswrapper[5025]: I1007 08:36:04.672911 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2290c61-7eaf-4296-9b09-6ba10e01e7b3-run-httpd\") pod \"ceilometer-0\" (UID: \"d2290c61-7eaf-4296-9b09-6ba10e01e7b3\") " 
pod="openstack/ceilometer-0" Oct 07 08:36:04 crc kubenswrapper[5025]: I1007 08:36:04.672951 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvtbj\" (UniqueName: \"kubernetes.io/projected/d2290c61-7eaf-4296-9b09-6ba10e01e7b3-kube-api-access-nvtbj\") pod \"ceilometer-0\" (UID: \"d2290c61-7eaf-4296-9b09-6ba10e01e7b3\") " pod="openstack/ceilometer-0" Oct 07 08:36:04 crc kubenswrapper[5025]: I1007 08:36:04.673338 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2290c61-7eaf-4296-9b09-6ba10e01e7b3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d2290c61-7eaf-4296-9b09-6ba10e01e7b3\") " pod="openstack/ceilometer-0" Oct 07 08:36:04 crc kubenswrapper[5025]: I1007 08:36:04.673402 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2290c61-7eaf-4296-9b09-6ba10e01e7b3-scripts\") pod \"ceilometer-0\" (UID: \"d2290c61-7eaf-4296-9b09-6ba10e01e7b3\") " pod="openstack/ceilometer-0" Oct 07 08:36:04 crc kubenswrapper[5025]: I1007 08:36:04.673431 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2290c61-7eaf-4296-9b09-6ba10e01e7b3-config-data\") pod \"ceilometer-0\" (UID: \"d2290c61-7eaf-4296-9b09-6ba10e01e7b3\") " pod="openstack/ceilometer-0" Oct 07 08:36:04 crc kubenswrapper[5025]: I1007 08:36:04.776697 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2290c61-7eaf-4296-9b09-6ba10e01e7b3-scripts\") pod \"ceilometer-0\" (UID: \"d2290c61-7eaf-4296-9b09-6ba10e01e7b3\") " pod="openstack/ceilometer-0" Oct 07 08:36:04 crc kubenswrapper[5025]: I1007 08:36:04.776741 5025 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2290c61-7eaf-4296-9b09-6ba10e01e7b3-config-data\") pod \"ceilometer-0\" (UID: \"d2290c61-7eaf-4296-9b09-6ba10e01e7b3\") " pod="openstack/ceilometer-0" Oct 07 08:36:04 crc kubenswrapper[5025]: I1007 08:36:04.776810 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2290c61-7eaf-4296-9b09-6ba10e01e7b3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d2290c61-7eaf-4296-9b09-6ba10e01e7b3\") " pod="openstack/ceilometer-0" Oct 07 08:36:04 crc kubenswrapper[5025]: I1007 08:36:04.776831 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2290c61-7eaf-4296-9b09-6ba10e01e7b3-log-httpd\") pod \"ceilometer-0\" (UID: \"d2290c61-7eaf-4296-9b09-6ba10e01e7b3\") " pod="openstack/ceilometer-0" Oct 07 08:36:04 crc kubenswrapper[5025]: I1007 08:36:04.776852 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2290c61-7eaf-4296-9b09-6ba10e01e7b3-run-httpd\") pod \"ceilometer-0\" (UID: \"d2290c61-7eaf-4296-9b09-6ba10e01e7b3\") " pod="openstack/ceilometer-0" Oct 07 08:36:04 crc kubenswrapper[5025]: I1007 08:36:04.776891 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvtbj\" (UniqueName: \"kubernetes.io/projected/d2290c61-7eaf-4296-9b09-6ba10e01e7b3-kube-api-access-nvtbj\") pod \"ceilometer-0\" (UID: \"d2290c61-7eaf-4296-9b09-6ba10e01e7b3\") " pod="openstack/ceilometer-0" Oct 07 08:36:04 crc kubenswrapper[5025]: I1007 08:36:04.777001 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2290c61-7eaf-4296-9b09-6ba10e01e7b3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d2290c61-7eaf-4296-9b09-6ba10e01e7b3\") " 
pod="openstack/ceilometer-0" Oct 07 08:36:04 crc kubenswrapper[5025]: I1007 08:36:04.778955 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2290c61-7eaf-4296-9b09-6ba10e01e7b3-run-httpd\") pod \"ceilometer-0\" (UID: \"d2290c61-7eaf-4296-9b09-6ba10e01e7b3\") " pod="openstack/ceilometer-0" Oct 07 08:36:04 crc kubenswrapper[5025]: I1007 08:36:04.779155 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2290c61-7eaf-4296-9b09-6ba10e01e7b3-log-httpd\") pod \"ceilometer-0\" (UID: \"d2290c61-7eaf-4296-9b09-6ba10e01e7b3\") " pod="openstack/ceilometer-0" Oct 07 08:36:04 crc kubenswrapper[5025]: I1007 08:36:04.785195 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2290c61-7eaf-4296-9b09-6ba10e01e7b3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d2290c61-7eaf-4296-9b09-6ba10e01e7b3\") " pod="openstack/ceilometer-0" Oct 07 08:36:04 crc kubenswrapper[5025]: I1007 08:36:04.786977 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2290c61-7eaf-4296-9b09-6ba10e01e7b3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d2290c61-7eaf-4296-9b09-6ba10e01e7b3\") " pod="openstack/ceilometer-0" Oct 07 08:36:04 crc kubenswrapper[5025]: I1007 08:36:04.787328 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2290c61-7eaf-4296-9b09-6ba10e01e7b3-config-data\") pod \"ceilometer-0\" (UID: \"d2290c61-7eaf-4296-9b09-6ba10e01e7b3\") " pod="openstack/ceilometer-0" Oct 07 08:36:04 crc kubenswrapper[5025]: I1007 08:36:04.787386 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2290c61-7eaf-4296-9b09-6ba10e01e7b3-scripts\") pod 
\"ceilometer-0\" (UID: \"d2290c61-7eaf-4296-9b09-6ba10e01e7b3\") " pod="openstack/ceilometer-0" Oct 07 08:36:04 crc kubenswrapper[5025]: I1007 08:36:04.799893 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvtbj\" (UniqueName: \"kubernetes.io/projected/d2290c61-7eaf-4296-9b09-6ba10e01e7b3-kube-api-access-nvtbj\") pod \"ceilometer-0\" (UID: \"d2290c61-7eaf-4296-9b09-6ba10e01e7b3\") " pod="openstack/ceilometer-0" Oct 07 08:36:04 crc kubenswrapper[5025]: I1007 08:36:04.816695 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 08:36:04 crc kubenswrapper[5025]: I1007 08:36:04.816969 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="674fb52e-75c9-4ee6-962b-7eec21242d05" containerName="glance-log" containerID="cri-o://7b804e5864cae65ced24516a277cd7996e7bedb1914c6e4624a7ffbd3aedf8ad" gracePeriod=30 Oct 07 08:36:04 crc kubenswrapper[5025]: I1007 08:36:04.817130 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="674fb52e-75c9-4ee6-962b-7eec21242d05" containerName="glance-httpd" containerID="cri-o://bbb4e293d104b5e24464db46510950d8cbda476949190331ff3bb796a28a78b7" gracePeriod=30 Oct 07 08:36:04 crc kubenswrapper[5025]: I1007 08:36:04.943673 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 08:36:05 crc kubenswrapper[5025]: I1007 08:36:05.510849 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 08:36:05 crc kubenswrapper[5025]: I1007 08:36:05.531650 5025 generic.go:334] "Generic (PLEG): container finished" podID="674fb52e-75c9-4ee6-962b-7eec21242d05" containerID="7b804e5864cae65ced24516a277cd7996e7bedb1914c6e4624a7ffbd3aedf8ad" exitCode=143 Oct 07 08:36:05 crc kubenswrapper[5025]: I1007 08:36:05.531749 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"674fb52e-75c9-4ee6-962b-7eec21242d05","Type":"ContainerDied","Data":"7b804e5864cae65ced24516a277cd7996e7bedb1914c6e4624a7ffbd3aedf8ad"} Oct 07 08:36:05 crc kubenswrapper[5025]: I1007 08:36:05.553196 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 08:36:05 crc kubenswrapper[5025]: I1007 08:36:05.928880 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef936a4e-e7ee-453f-b43c-0809b4e889d8" path="/var/lib/kubelet/pods/ef936a4e-e7ee-453f-b43c-0809b4e889d8/volumes" Oct 07 08:36:06 crc kubenswrapper[5025]: I1007 08:36:06.558553 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2290c61-7eaf-4296-9b09-6ba10e01e7b3","Type":"ContainerStarted","Data":"1da7664930428da5d0724156885a637303be14cdbdcc7fa35cb048675dc5f3d9"} Oct 07 08:36:06 crc kubenswrapper[5025]: I1007 08:36:06.558896 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2290c61-7eaf-4296-9b09-6ba10e01e7b3","Type":"ContainerStarted","Data":"ab6f76d9b3c4be04de2979427cc2130c28aeeecaac4063790b5f6ffb9838ebdb"} Oct 07 08:36:07 crc kubenswrapper[5025]: I1007 08:36:07.571940 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"d2290c61-7eaf-4296-9b09-6ba10e01e7b3","Type":"ContainerStarted","Data":"b0c7ec64fd15b0457197cc369238cfb8391fea24b78c80bf4e4245d1f9b26ffa"} Oct 07 08:36:07 crc kubenswrapper[5025]: I1007 08:36:07.574193 5025 generic.go:334] "Generic (PLEG): container finished" podID="3de05e1a-7b03-4e41-9c47-4a2d135b3879" containerID="ac9190177a046f25b19b1e2b7c4d077ec8a723e5d8034bb239d27ef5b5cdbc8f" exitCode=0 Oct 07 08:36:07 crc kubenswrapper[5025]: I1007 08:36:07.574240 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3de05e1a-7b03-4e41-9c47-4a2d135b3879","Type":"ContainerDied","Data":"ac9190177a046f25b19b1e2b7c4d077ec8a723e5d8034bb239d27ef5b5cdbc8f"} Oct 07 08:36:08 crc kubenswrapper[5025]: I1007 08:36:08.591220 5025 generic.go:334] "Generic (PLEG): container finished" podID="674fb52e-75c9-4ee6-962b-7eec21242d05" containerID="bbb4e293d104b5e24464db46510950d8cbda476949190331ff3bb796a28a78b7" exitCode=0 Oct 07 08:36:08 crc kubenswrapper[5025]: I1007 08:36:08.591270 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"674fb52e-75c9-4ee6-962b-7eec21242d05","Type":"ContainerDied","Data":"bbb4e293d104b5e24464db46510950d8cbda476949190331ff3bb796a28a78b7"} Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.252746 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.331304 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3de05e1a-7b03-4e41-9c47-4a2d135b3879-public-tls-certs\") pod \"3de05e1a-7b03-4e41-9c47-4a2d135b3879\" (UID: \"3de05e1a-7b03-4e41-9c47-4a2d135b3879\") " Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.332101 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3de05e1a-7b03-4e41-9c47-4a2d135b3879-logs\") pod \"3de05e1a-7b03-4e41-9c47-4a2d135b3879\" (UID: \"3de05e1a-7b03-4e41-9c47-4a2d135b3879\") " Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.332137 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3de05e1a-7b03-4e41-9c47-4a2d135b3879-scripts\") pod \"3de05e1a-7b03-4e41-9c47-4a2d135b3879\" (UID: \"3de05e1a-7b03-4e41-9c47-4a2d135b3879\") " Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.332213 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"3de05e1a-7b03-4e41-9c47-4a2d135b3879\" (UID: \"3de05e1a-7b03-4e41-9c47-4a2d135b3879\") " Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.332681 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3de05e1a-7b03-4e41-9c47-4a2d135b3879-logs" (OuterVolumeSpecName: "logs") pod "3de05e1a-7b03-4e41-9c47-4a2d135b3879" (UID: "3de05e1a-7b03-4e41-9c47-4a2d135b3879"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.333351 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3de05e1a-7b03-4e41-9c47-4a2d135b3879-httpd-run\") pod \"3de05e1a-7b03-4e41-9c47-4a2d135b3879\" (UID: \"3de05e1a-7b03-4e41-9c47-4a2d135b3879\") " Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.333884 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3de05e1a-7b03-4e41-9c47-4a2d135b3879-combined-ca-bundle\") pod \"3de05e1a-7b03-4e41-9c47-4a2d135b3879\" (UID: \"3de05e1a-7b03-4e41-9c47-4a2d135b3879\") " Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.333944 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sgzp\" (UniqueName: \"kubernetes.io/projected/3de05e1a-7b03-4e41-9c47-4a2d135b3879-kube-api-access-9sgzp\") pod \"3de05e1a-7b03-4e41-9c47-4a2d135b3879\" (UID: \"3de05e1a-7b03-4e41-9c47-4a2d135b3879\") " Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.334371 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3de05e1a-7b03-4e41-9c47-4a2d135b3879-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3de05e1a-7b03-4e41-9c47-4a2d135b3879" (UID: "3de05e1a-7b03-4e41-9c47-4a2d135b3879"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.334441 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3de05e1a-7b03-4e41-9c47-4a2d135b3879-config-data\") pod \"3de05e1a-7b03-4e41-9c47-4a2d135b3879\" (UID: \"3de05e1a-7b03-4e41-9c47-4a2d135b3879\") " Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.339944 5025 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3de05e1a-7b03-4e41-9c47-4a2d135b3879-logs\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.339969 5025 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3de05e1a-7b03-4e41-9c47-4a2d135b3879-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.342924 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3de05e1a-7b03-4e41-9c47-4a2d135b3879-kube-api-access-9sgzp" (OuterVolumeSpecName: "kube-api-access-9sgzp") pod "3de05e1a-7b03-4e41-9c47-4a2d135b3879" (UID: "3de05e1a-7b03-4e41-9c47-4a2d135b3879"). InnerVolumeSpecName "kube-api-access-9sgzp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.343458 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3de05e1a-7b03-4e41-9c47-4a2d135b3879-scripts" (OuterVolumeSpecName: "scripts") pod "3de05e1a-7b03-4e41-9c47-4a2d135b3879" (UID: "3de05e1a-7b03-4e41-9c47-4a2d135b3879"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.344761 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "3de05e1a-7b03-4e41-9c47-4a2d135b3879" (UID: "3de05e1a-7b03-4e41-9c47-4a2d135b3879"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.382926 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3de05e1a-7b03-4e41-9c47-4a2d135b3879-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3de05e1a-7b03-4e41-9c47-4a2d135b3879" (UID: "3de05e1a-7b03-4e41-9c47-4a2d135b3879"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.429553 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3de05e1a-7b03-4e41-9c47-4a2d135b3879-config-data" (OuterVolumeSpecName: "config-data") pod "3de05e1a-7b03-4e41-9c47-4a2d135b3879" (UID: "3de05e1a-7b03-4e41-9c47-4a2d135b3879"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.433518 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3de05e1a-7b03-4e41-9c47-4a2d135b3879-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3de05e1a-7b03-4e41-9c47-4a2d135b3879" (UID: "3de05e1a-7b03-4e41-9c47-4a2d135b3879"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.441460 5025 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3de05e1a-7b03-4e41-9c47-4a2d135b3879-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.441484 5025 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3de05e1a-7b03-4e41-9c47-4a2d135b3879-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.441496 5025 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3de05e1a-7b03-4e41-9c47-4a2d135b3879-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.441517 5025 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.441527 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3de05e1a-7b03-4e41-9c47-4a2d135b3879-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.441537 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sgzp\" (UniqueName: \"kubernetes.io/projected/3de05e1a-7b03-4e41-9c47-4a2d135b3879-kube-api-access-9sgzp\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.481712 5025 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.542817 5025 reconciler_common.go:293] "Volume detached for volume 
\"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.573114 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.632003 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2290c61-7eaf-4296-9b09-6ba10e01e7b3","Type":"ContainerStarted","Data":"0521661c8b58c51672306a801782a528dce06023fbe3cb33d42e948bf14a41e1"} Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.633921 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-pw82v" event={"ID":"a910c5fd-a686-4b9f-b1bd-6cdcf4641013","Type":"ContainerStarted","Data":"1658af060071457045658f1a0669689a5e3ecc8ca296630d6d86cf21a5155bc2"} Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.637156 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3de05e1a-7b03-4e41-9c47-4a2d135b3879","Type":"ContainerDied","Data":"42c59bf8b6aa1fce84b828935f1a920b58f292ab778f28b80de66f8efb513604"} Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.637200 5025 scope.go:117] "RemoveContainer" containerID="ac9190177a046f25b19b1e2b7c4d077ec8a723e5d8034bb239d27ef5b5cdbc8f" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.637303 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.641963 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"674fb52e-75c9-4ee6-962b-7eec21242d05","Type":"ContainerDied","Data":"c2b1daced2f6f119257331765141a3eafb847d7915d6d3fe42d89ac0e7215a84"} Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.642042 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.643849 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/674fb52e-75c9-4ee6-962b-7eec21242d05-config-data\") pod \"674fb52e-75c9-4ee6-962b-7eec21242d05\" (UID: \"674fb52e-75c9-4ee6-962b-7eec21242d05\") " Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.643887 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/674fb52e-75c9-4ee6-962b-7eec21242d05-scripts\") pod \"674fb52e-75c9-4ee6-962b-7eec21242d05\" (UID: \"674fb52e-75c9-4ee6-962b-7eec21242d05\") " Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.643964 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/674fb52e-75c9-4ee6-962b-7eec21242d05-httpd-run\") pod \"674fb52e-75c9-4ee6-962b-7eec21242d05\" (UID: \"674fb52e-75c9-4ee6-962b-7eec21242d05\") " Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.644037 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"674fb52e-75c9-4ee6-962b-7eec21242d05\" (UID: \"674fb52e-75c9-4ee6-962b-7eec21242d05\") " Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.644059 5025 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/674fb52e-75c9-4ee6-962b-7eec21242d05-logs\") pod \"674fb52e-75c9-4ee6-962b-7eec21242d05\" (UID: \"674fb52e-75c9-4ee6-962b-7eec21242d05\") " Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.644091 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/674fb52e-75c9-4ee6-962b-7eec21242d05-combined-ca-bundle\") pod \"674fb52e-75c9-4ee6-962b-7eec21242d05\" (UID: \"674fb52e-75c9-4ee6-962b-7eec21242d05\") " Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.644153 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/674fb52e-75c9-4ee6-962b-7eec21242d05-internal-tls-certs\") pod \"674fb52e-75c9-4ee6-962b-7eec21242d05\" (UID: \"674fb52e-75c9-4ee6-962b-7eec21242d05\") " Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.644187 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgk2g\" (UniqueName: \"kubernetes.io/projected/674fb52e-75c9-4ee6-962b-7eec21242d05-kube-api-access-rgk2g\") pod \"674fb52e-75c9-4ee6-962b-7eec21242d05\" (UID: \"674fb52e-75c9-4ee6-962b-7eec21242d05\") " Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.644618 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/674fb52e-75c9-4ee6-962b-7eec21242d05-logs" (OuterVolumeSpecName: "logs") pod "674fb52e-75c9-4ee6-962b-7eec21242d05" (UID: "674fb52e-75c9-4ee6-962b-7eec21242d05"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.644813 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/674fb52e-75c9-4ee6-962b-7eec21242d05-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "674fb52e-75c9-4ee6-962b-7eec21242d05" (UID: "674fb52e-75c9-4ee6-962b-7eec21242d05"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.650175 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/674fb52e-75c9-4ee6-962b-7eec21242d05-scripts" (OuterVolumeSpecName: "scripts") pod "674fb52e-75c9-4ee6-962b-7eec21242d05" (UID: "674fb52e-75c9-4ee6-962b-7eec21242d05"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.651415 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/674fb52e-75c9-4ee6-962b-7eec21242d05-kube-api-access-rgk2g" (OuterVolumeSpecName: "kube-api-access-rgk2g") pod "674fb52e-75c9-4ee6-962b-7eec21242d05" (UID: "674fb52e-75c9-4ee6-962b-7eec21242d05"). InnerVolumeSpecName "kube-api-access-rgk2g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.656764 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "674fb52e-75c9-4ee6-962b-7eec21242d05" (UID: "674fb52e-75c9-4ee6-962b-7eec21242d05"). InnerVolumeSpecName "local-storage01-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.678756 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-pw82v" podStartSLOduration=1.860912729 podStartE2EDuration="10.678737933s" podCreationTimestamp="2025-10-07 08:36:02 +0000 UTC" firstStartedPulling="2025-10-07 08:36:03.485531513 +0000 UTC m=+1170.294845657" lastFinishedPulling="2025-10-07 08:36:12.303356717 +0000 UTC m=+1179.112670861" observedRunningTime="2025-10-07 08:36:12.661229821 +0000 UTC m=+1179.470543985" watchObservedRunningTime="2025-10-07 08:36:12.678737933 +0000 UTC m=+1179.488052077" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.686057 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/674fb52e-75c9-4ee6-962b-7eec21242d05-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "674fb52e-75c9-4ee6-962b-7eec21242d05" (UID: "674fb52e-75c9-4ee6-962b-7eec21242d05"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.705656 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/674fb52e-75c9-4ee6-962b-7eec21242d05-config-data" (OuterVolumeSpecName: "config-data") pod "674fb52e-75c9-4ee6-962b-7eec21242d05" (UID: "674fb52e-75c9-4ee6-962b-7eec21242d05"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.735264 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/674fb52e-75c9-4ee6-962b-7eec21242d05-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "674fb52e-75c9-4ee6-962b-7eec21242d05" (UID: "674fb52e-75c9-4ee6-962b-7eec21242d05"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.745964 5025 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/674fb52e-75c9-4ee6-962b-7eec21242d05-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.746241 5025 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.746386 5025 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/674fb52e-75c9-4ee6-962b-7eec21242d05-logs\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.746474 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/674fb52e-75c9-4ee6-962b-7eec21242d05-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.746576 5025 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/674fb52e-75c9-4ee6-962b-7eec21242d05-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.746667 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgk2g\" (UniqueName: \"kubernetes.io/projected/674fb52e-75c9-4ee6-962b-7eec21242d05-kube-api-access-rgk2g\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.746759 5025 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/674fb52e-75c9-4ee6-962b-7eec21242d05-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.746836 5025 reconciler_common.go:293] 
"Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/674fb52e-75c9-4ee6-962b-7eec21242d05-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.772311 5025 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.835433 5025 scope.go:117] "RemoveContainer" containerID="381e8f7be4247a42fa5605b246ada82812656fd0901e3a380d230417aae73fd3" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.845698 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.851330 5025 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.859236 5025 scope.go:117] "RemoveContainer" containerID="bbb4e293d104b5e24464db46510950d8cbda476949190331ff3bb796a28a78b7" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.859842 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.882936 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 08:36:12 crc kubenswrapper[5025]: E1007 08:36:12.883890 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="674fb52e-75c9-4ee6-962b-7eec21242d05" containerName="glance-httpd" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.883914 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="674fb52e-75c9-4ee6-962b-7eec21242d05" containerName="glance-httpd" Oct 07 08:36:12 crc kubenswrapper[5025]: E1007 08:36:12.883965 5025 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="674fb52e-75c9-4ee6-962b-7eec21242d05" containerName="glance-log" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.883975 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="674fb52e-75c9-4ee6-962b-7eec21242d05" containerName="glance-log" Oct 07 08:36:12 crc kubenswrapper[5025]: E1007 08:36:12.884007 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3de05e1a-7b03-4e41-9c47-4a2d135b3879" containerName="glance-log" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.884016 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="3de05e1a-7b03-4e41-9c47-4a2d135b3879" containerName="glance-log" Oct 07 08:36:12 crc kubenswrapper[5025]: E1007 08:36:12.884071 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3de05e1a-7b03-4e41-9c47-4a2d135b3879" containerName="glance-httpd" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.884081 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="3de05e1a-7b03-4e41-9c47-4a2d135b3879" containerName="glance-httpd" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.884602 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="3de05e1a-7b03-4e41-9c47-4a2d135b3879" containerName="glance-log" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.884666 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="674fb52e-75c9-4ee6-962b-7eec21242d05" containerName="glance-log" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.884693 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="674fb52e-75c9-4ee6-962b-7eec21242d05" containerName="glance-httpd" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.884708 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="3de05e1a-7b03-4e41-9c47-4a2d135b3879" containerName="glance-httpd" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.886904 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.890580 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.892778 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.896133 5025 scope.go:117] "RemoveContainer" containerID="7b804e5864cae65ced24516a277cd7996e7bedb1914c6e4624a7ffbd3aedf8ad" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.897019 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.952988 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39a4256e-db59-4623-aefb-bad0c1412bf1-logs\") pod \"glance-default-external-api-0\" (UID: \"39a4256e-db59-4623-aefb-bad0c1412bf1\") " pod="openstack/glance-default-external-api-0" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.953136 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dfpm\" (UniqueName: \"kubernetes.io/projected/39a4256e-db59-4623-aefb-bad0c1412bf1-kube-api-access-9dfpm\") pod \"glance-default-external-api-0\" (UID: \"39a4256e-db59-4623-aefb-bad0c1412bf1\") " pod="openstack/glance-default-external-api-0" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.953162 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/39a4256e-db59-4623-aefb-bad0c1412bf1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"39a4256e-db59-4623-aefb-bad0c1412bf1\") " pod="openstack/glance-default-external-api-0" Oct 07 08:36:12 
crc kubenswrapper[5025]: I1007 08:36:12.953198 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39a4256e-db59-4623-aefb-bad0c1412bf1-scripts\") pod \"glance-default-external-api-0\" (UID: \"39a4256e-db59-4623-aefb-bad0c1412bf1\") " pod="openstack/glance-default-external-api-0" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.953432 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"39a4256e-db59-4623-aefb-bad0c1412bf1\") " pod="openstack/glance-default-external-api-0" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.953527 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/39a4256e-db59-4623-aefb-bad0c1412bf1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"39a4256e-db59-4623-aefb-bad0c1412bf1\") " pod="openstack/glance-default-external-api-0" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.953647 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39a4256e-db59-4623-aefb-bad0c1412bf1-config-data\") pod \"glance-default-external-api-0\" (UID: \"39a4256e-db59-4623-aefb-bad0c1412bf1\") " pod="openstack/glance-default-external-api-0" Oct 07 08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.953742 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39a4256e-db59-4623-aefb-bad0c1412bf1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"39a4256e-db59-4623-aefb-bad0c1412bf1\") " pod="openstack/glance-default-external-api-0" Oct 07 
08:36:12 crc kubenswrapper[5025]: I1007 08:36:12.998674 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 08:36:13 crc kubenswrapper[5025]: I1007 08:36:13.004506 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 08:36:13 crc kubenswrapper[5025]: I1007 08:36:13.016070 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 08:36:13 crc kubenswrapper[5025]: I1007 08:36:13.022009 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 08:36:13 crc kubenswrapper[5025]: I1007 08:36:13.027690 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 07 08:36:13 crc kubenswrapper[5025]: I1007 08:36:13.028344 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 07 08:36:13 crc kubenswrapper[5025]: I1007 08:36:13.028506 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 08:36:13 crc kubenswrapper[5025]: I1007 08:36:13.056880 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"39a4256e-db59-4623-aefb-bad0c1412bf1\") " pod="openstack/glance-default-external-api-0" Oct 07 08:36:13 crc kubenswrapper[5025]: I1007 08:36:13.056953 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/39a4256e-db59-4623-aefb-bad0c1412bf1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"39a4256e-db59-4623-aefb-bad0c1412bf1\") " pod="openstack/glance-default-external-api-0" Oct 07 08:36:13 crc kubenswrapper[5025]: I1007 
08:36:13.057020 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39a4256e-db59-4623-aefb-bad0c1412bf1-config-data\") pod \"glance-default-external-api-0\" (UID: \"39a4256e-db59-4623-aefb-bad0c1412bf1\") " pod="openstack/glance-default-external-api-0" Oct 07 08:36:13 crc kubenswrapper[5025]: I1007 08:36:13.057075 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39a4256e-db59-4623-aefb-bad0c1412bf1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"39a4256e-db59-4623-aefb-bad0c1412bf1\") " pod="openstack/glance-default-external-api-0" Oct 07 08:36:13 crc kubenswrapper[5025]: I1007 08:36:13.057093 5025 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"39a4256e-db59-4623-aefb-bad0c1412bf1\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Oct 07 08:36:13 crc kubenswrapper[5025]: I1007 08:36:13.058037 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39a4256e-db59-4623-aefb-bad0c1412bf1-logs\") pod \"glance-default-external-api-0\" (UID: \"39a4256e-db59-4623-aefb-bad0c1412bf1\") " pod="openstack/glance-default-external-api-0" Oct 07 08:36:13 crc kubenswrapper[5025]: I1007 08:36:13.058494 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dfpm\" (UniqueName: \"kubernetes.io/projected/39a4256e-db59-4623-aefb-bad0c1412bf1-kube-api-access-9dfpm\") pod \"glance-default-external-api-0\" (UID: \"39a4256e-db59-4623-aefb-bad0c1412bf1\") " pod="openstack/glance-default-external-api-0" Oct 07 08:36:13 crc kubenswrapper[5025]: I1007 08:36:13.058606 5025 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/39a4256e-db59-4623-aefb-bad0c1412bf1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"39a4256e-db59-4623-aefb-bad0c1412bf1\") " pod="openstack/glance-default-external-api-0" Oct 07 08:36:13 crc kubenswrapper[5025]: I1007 08:36:13.058657 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39a4256e-db59-4623-aefb-bad0c1412bf1-scripts\") pod \"glance-default-external-api-0\" (UID: \"39a4256e-db59-4623-aefb-bad0c1412bf1\") " pod="openstack/glance-default-external-api-0" Oct 07 08:36:13 crc kubenswrapper[5025]: I1007 08:36:13.061855 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/39a4256e-db59-4623-aefb-bad0c1412bf1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"39a4256e-db59-4623-aefb-bad0c1412bf1\") " pod="openstack/glance-default-external-api-0" Oct 07 08:36:13 crc kubenswrapper[5025]: I1007 08:36:13.062326 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/39a4256e-db59-4623-aefb-bad0c1412bf1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"39a4256e-db59-4623-aefb-bad0c1412bf1\") " pod="openstack/glance-default-external-api-0" Oct 07 08:36:13 crc kubenswrapper[5025]: I1007 08:36:13.062530 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39a4256e-db59-4623-aefb-bad0c1412bf1-logs\") pod \"glance-default-external-api-0\" (UID: \"39a4256e-db59-4623-aefb-bad0c1412bf1\") " pod="openstack/glance-default-external-api-0" Oct 07 08:36:13 crc kubenswrapper[5025]: I1007 08:36:13.065479 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/39a4256e-db59-4623-aefb-bad0c1412bf1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"39a4256e-db59-4623-aefb-bad0c1412bf1\") " pod="openstack/glance-default-external-api-0" Oct 07 08:36:13 crc kubenswrapper[5025]: I1007 08:36:13.066557 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39a4256e-db59-4623-aefb-bad0c1412bf1-config-data\") pod \"glance-default-external-api-0\" (UID: \"39a4256e-db59-4623-aefb-bad0c1412bf1\") " pod="openstack/glance-default-external-api-0" Oct 07 08:36:13 crc kubenswrapper[5025]: I1007 08:36:13.073257 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39a4256e-db59-4623-aefb-bad0c1412bf1-scripts\") pod \"glance-default-external-api-0\" (UID: \"39a4256e-db59-4623-aefb-bad0c1412bf1\") " pod="openstack/glance-default-external-api-0" Oct 07 08:36:13 crc kubenswrapper[5025]: I1007 08:36:13.077777 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dfpm\" (UniqueName: \"kubernetes.io/projected/39a4256e-db59-4623-aefb-bad0c1412bf1-kube-api-access-9dfpm\") pod \"glance-default-external-api-0\" (UID: \"39a4256e-db59-4623-aefb-bad0c1412bf1\") " pod="openstack/glance-default-external-api-0" Oct 07 08:36:13 crc kubenswrapper[5025]: I1007 08:36:13.095510 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"39a4256e-db59-4623-aefb-bad0c1412bf1\") " pod="openstack/glance-default-external-api-0" Oct 07 08:36:13 crc kubenswrapper[5025]: I1007 08:36:13.161008 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/988f0f87-7a01-4aac-a874-8e885b2f83cb-internal-tls-certs\") pod 
\"glance-default-internal-api-0\" (UID: \"988f0f87-7a01-4aac-a874-8e885b2f83cb\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:36:13 crc kubenswrapper[5025]: I1007 08:36:13.161077 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"988f0f87-7a01-4aac-a874-8e885b2f83cb\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:36:13 crc kubenswrapper[5025]: I1007 08:36:13.161135 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/988f0f87-7a01-4aac-a874-8e885b2f83cb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"988f0f87-7a01-4aac-a874-8e885b2f83cb\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:36:13 crc kubenswrapper[5025]: I1007 08:36:13.161171 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/988f0f87-7a01-4aac-a874-8e885b2f83cb-logs\") pod \"glance-default-internal-api-0\" (UID: \"988f0f87-7a01-4aac-a874-8e885b2f83cb\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:36:13 crc kubenswrapper[5025]: I1007 08:36:13.161191 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/988f0f87-7a01-4aac-a874-8e885b2f83cb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"988f0f87-7a01-4aac-a874-8e885b2f83cb\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:36:13 crc kubenswrapper[5025]: I1007 08:36:13.161218 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/988f0f87-7a01-4aac-a874-8e885b2f83cb-scripts\") pod 
\"glance-default-internal-api-0\" (UID: \"988f0f87-7a01-4aac-a874-8e885b2f83cb\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:36:13 crc kubenswrapper[5025]: I1007 08:36:13.161243 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prpb9\" (UniqueName: \"kubernetes.io/projected/988f0f87-7a01-4aac-a874-8e885b2f83cb-kube-api-access-prpb9\") pod \"glance-default-internal-api-0\" (UID: \"988f0f87-7a01-4aac-a874-8e885b2f83cb\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:36:13 crc kubenswrapper[5025]: I1007 08:36:13.161270 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/988f0f87-7a01-4aac-a874-8e885b2f83cb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"988f0f87-7a01-4aac-a874-8e885b2f83cb\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:36:13 crc kubenswrapper[5025]: I1007 08:36:13.209136 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 08:36:13 crc kubenswrapper[5025]: I1007 08:36:13.263908 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/988f0f87-7a01-4aac-a874-8e885b2f83cb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"988f0f87-7a01-4aac-a874-8e885b2f83cb\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:36:13 crc kubenswrapper[5025]: I1007 08:36:13.264047 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/988f0f87-7a01-4aac-a874-8e885b2f83cb-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"988f0f87-7a01-4aac-a874-8e885b2f83cb\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:36:13 crc kubenswrapper[5025]: I1007 08:36:13.264914 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"988f0f87-7a01-4aac-a874-8e885b2f83cb\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:36:13 crc kubenswrapper[5025]: I1007 08:36:13.265031 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/988f0f87-7a01-4aac-a874-8e885b2f83cb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"988f0f87-7a01-4aac-a874-8e885b2f83cb\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:36:13 crc kubenswrapper[5025]: I1007 08:36:13.265085 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/988f0f87-7a01-4aac-a874-8e885b2f83cb-logs\") pod \"glance-default-internal-api-0\" (UID: \"988f0f87-7a01-4aac-a874-8e885b2f83cb\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:36:13 
crc kubenswrapper[5025]: I1007 08:36:13.265181 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/988f0f87-7a01-4aac-a874-8e885b2f83cb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"988f0f87-7a01-4aac-a874-8e885b2f83cb\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:36:13 crc kubenswrapper[5025]: I1007 08:36:13.265222 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/988f0f87-7a01-4aac-a874-8e885b2f83cb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"988f0f87-7a01-4aac-a874-8e885b2f83cb\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:36:13 crc kubenswrapper[5025]: I1007 08:36:13.265263 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prpb9\" (UniqueName: \"kubernetes.io/projected/988f0f87-7a01-4aac-a874-8e885b2f83cb-kube-api-access-prpb9\") pod \"glance-default-internal-api-0\" (UID: \"988f0f87-7a01-4aac-a874-8e885b2f83cb\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:36:13 crc kubenswrapper[5025]: I1007 08:36:13.266185 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/988f0f87-7a01-4aac-a874-8e885b2f83cb-logs\") pod \"glance-default-internal-api-0\" (UID: \"988f0f87-7a01-4aac-a874-8e885b2f83cb\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:36:13 crc kubenswrapper[5025]: I1007 08:36:13.266326 5025 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"988f0f87-7a01-4aac-a874-8e885b2f83cb\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Oct 07 08:36:13 crc kubenswrapper[5025]: I1007 08:36:13.266508 5025 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/988f0f87-7a01-4aac-a874-8e885b2f83cb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"988f0f87-7a01-4aac-a874-8e885b2f83cb\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:36:13 crc kubenswrapper[5025]: I1007 08:36:13.269684 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/988f0f87-7a01-4aac-a874-8e885b2f83cb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"988f0f87-7a01-4aac-a874-8e885b2f83cb\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:36:13 crc kubenswrapper[5025]: I1007 08:36:13.270472 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/988f0f87-7a01-4aac-a874-8e885b2f83cb-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"988f0f87-7a01-4aac-a874-8e885b2f83cb\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:36:13 crc kubenswrapper[5025]: I1007 08:36:13.272790 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/988f0f87-7a01-4aac-a874-8e885b2f83cb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"988f0f87-7a01-4aac-a874-8e885b2f83cb\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:36:13 crc kubenswrapper[5025]: I1007 08:36:13.278615 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/988f0f87-7a01-4aac-a874-8e885b2f83cb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"988f0f87-7a01-4aac-a874-8e885b2f83cb\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:36:13 crc kubenswrapper[5025]: I1007 08:36:13.289807 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prpb9\" (UniqueName: 
\"kubernetes.io/projected/988f0f87-7a01-4aac-a874-8e885b2f83cb-kube-api-access-prpb9\") pod \"glance-default-internal-api-0\" (UID: \"988f0f87-7a01-4aac-a874-8e885b2f83cb\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:36:13 crc kubenswrapper[5025]: I1007 08:36:13.328207 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"988f0f87-7a01-4aac-a874-8e885b2f83cb\") " pod="openstack/glance-default-internal-api-0" Oct 07 08:36:13 crc kubenswrapper[5025]: I1007 08:36:13.350299 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 08:36:13 crc kubenswrapper[5025]: I1007 08:36:13.792417 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 08:36:13 crc kubenswrapper[5025]: W1007 08:36:13.805290 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39a4256e_db59_4623_aefb_bad0c1412bf1.slice/crio-9278e0622273f09878e7b42f19da50e6c31c60d114fabc474824e31ce4834620 WatchSource:0}: Error finding container 9278e0622273f09878e7b42f19da50e6c31c60d114fabc474824e31ce4834620: Status 404 returned error can't find the container with id 9278e0622273f09878e7b42f19da50e6c31c60d114fabc474824e31ce4834620 Oct 07 08:36:13 crc kubenswrapper[5025]: I1007 08:36:13.912634 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 08:36:13 crc kubenswrapper[5025]: I1007 08:36:13.930902 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3de05e1a-7b03-4e41-9c47-4a2d135b3879" path="/var/lib/kubelet/pods/3de05e1a-7b03-4e41-9c47-4a2d135b3879/volumes" Oct 07 08:36:13 crc kubenswrapper[5025]: I1007 08:36:13.931526 5025 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="674fb52e-75c9-4ee6-962b-7eec21242d05" path="/var/lib/kubelet/pods/674fb52e-75c9-4ee6-962b-7eec21242d05/volumes" Oct 07 08:36:14 crc kubenswrapper[5025]: I1007 08:36:14.682514 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2290c61-7eaf-4296-9b09-6ba10e01e7b3","Type":"ContainerStarted","Data":"cc2dafceb9e5b3f1801d94afe60e385996fc19566090d1fe48c4092c27123869"} Oct 07 08:36:14 crc kubenswrapper[5025]: I1007 08:36:14.682630 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d2290c61-7eaf-4296-9b09-6ba10e01e7b3" containerName="ceilometer-central-agent" containerID="cri-o://1da7664930428da5d0724156885a637303be14cdbdcc7fa35cb048675dc5f3d9" gracePeriod=30 Oct 07 08:36:14 crc kubenswrapper[5025]: I1007 08:36:14.682911 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 07 08:36:14 crc kubenswrapper[5025]: I1007 08:36:14.682969 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d2290c61-7eaf-4296-9b09-6ba10e01e7b3" containerName="proxy-httpd" containerID="cri-o://cc2dafceb9e5b3f1801d94afe60e385996fc19566090d1fe48c4092c27123869" gracePeriod=30 Oct 07 08:36:14 crc kubenswrapper[5025]: I1007 08:36:14.683084 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d2290c61-7eaf-4296-9b09-6ba10e01e7b3" containerName="ceilometer-notification-agent" containerID="cri-o://b0c7ec64fd15b0457197cc369238cfb8391fea24b78c80bf4e4245d1f9b26ffa" gracePeriod=30 Oct 07 08:36:14 crc kubenswrapper[5025]: I1007 08:36:14.683134 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d2290c61-7eaf-4296-9b09-6ba10e01e7b3" containerName="sg-core" containerID="cri-o://0521661c8b58c51672306a801782a528dce06023fbe3cb33d42e948bf14a41e1" gracePeriod=30 Oct 
07 08:36:14 crc kubenswrapper[5025]: I1007 08:36:14.685965 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"988f0f87-7a01-4aac-a874-8e885b2f83cb","Type":"ContainerStarted","Data":"6679ef0ed68a662d16557ba47e79ca743311dae66548a6f30fab1c7702863417"} Oct 07 08:36:14 crc kubenswrapper[5025]: I1007 08:36:14.686002 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"988f0f87-7a01-4aac-a874-8e885b2f83cb","Type":"ContainerStarted","Data":"8c45ebf433ae81889e822f3639080aabf9b0325d083969c30181a29c8c99a405"} Oct 07 08:36:14 crc kubenswrapper[5025]: I1007 08:36:14.689168 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"39a4256e-db59-4623-aefb-bad0c1412bf1","Type":"ContainerStarted","Data":"577b784734dd3e48c70fc10ff3124e3ae64cd95fd597512882e35ecf5b2ba94f"} Oct 07 08:36:14 crc kubenswrapper[5025]: I1007 08:36:14.689226 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"39a4256e-db59-4623-aefb-bad0c1412bf1","Type":"ContainerStarted","Data":"9278e0622273f09878e7b42f19da50e6c31c60d114fabc474824e31ce4834620"} Oct 07 08:36:14 crc kubenswrapper[5025]: I1007 08:36:14.708363 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.605880823 podStartE2EDuration="10.708346378s" podCreationTimestamp="2025-10-07 08:36:04 +0000 UTC" firstStartedPulling="2025-10-07 08:36:05.527671123 +0000 UTC m=+1172.336985267" lastFinishedPulling="2025-10-07 08:36:13.630136678 +0000 UTC m=+1180.439450822" observedRunningTime="2025-10-07 08:36:14.702443552 +0000 UTC m=+1181.511757696" watchObservedRunningTime="2025-10-07 08:36:14.708346378 +0000 UTC m=+1181.517660522" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.412322 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.515222 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2290c61-7eaf-4296-9b09-6ba10e01e7b3-log-httpd\") pod \"d2290c61-7eaf-4296-9b09-6ba10e01e7b3\" (UID: \"d2290c61-7eaf-4296-9b09-6ba10e01e7b3\") " Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.515309 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2290c61-7eaf-4296-9b09-6ba10e01e7b3-config-data\") pod \"d2290c61-7eaf-4296-9b09-6ba10e01e7b3\" (UID: \"d2290c61-7eaf-4296-9b09-6ba10e01e7b3\") " Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.515349 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2290c61-7eaf-4296-9b09-6ba10e01e7b3-scripts\") pod \"d2290c61-7eaf-4296-9b09-6ba10e01e7b3\" (UID: \"d2290c61-7eaf-4296-9b09-6ba10e01e7b3\") " Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.515388 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2290c61-7eaf-4296-9b09-6ba10e01e7b3-run-httpd\") pod \"d2290c61-7eaf-4296-9b09-6ba10e01e7b3\" (UID: \"d2290c61-7eaf-4296-9b09-6ba10e01e7b3\") " Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.515414 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvtbj\" (UniqueName: \"kubernetes.io/projected/d2290c61-7eaf-4296-9b09-6ba10e01e7b3-kube-api-access-nvtbj\") pod \"d2290c61-7eaf-4296-9b09-6ba10e01e7b3\" (UID: \"d2290c61-7eaf-4296-9b09-6ba10e01e7b3\") " Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.515460 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/d2290c61-7eaf-4296-9b09-6ba10e01e7b3-sg-core-conf-yaml\") pod \"d2290c61-7eaf-4296-9b09-6ba10e01e7b3\" (UID: \"d2290c61-7eaf-4296-9b09-6ba10e01e7b3\") " Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.515512 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2290c61-7eaf-4296-9b09-6ba10e01e7b3-combined-ca-bundle\") pod \"d2290c61-7eaf-4296-9b09-6ba10e01e7b3\" (UID: \"d2290c61-7eaf-4296-9b09-6ba10e01e7b3\") " Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.515810 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2290c61-7eaf-4296-9b09-6ba10e01e7b3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d2290c61-7eaf-4296-9b09-6ba10e01e7b3" (UID: "d2290c61-7eaf-4296-9b09-6ba10e01e7b3"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.515991 5025 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2290c61-7eaf-4296-9b09-6ba10e01e7b3-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.516573 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2290c61-7eaf-4296-9b09-6ba10e01e7b3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d2290c61-7eaf-4296-9b09-6ba10e01e7b3" (UID: "d2290c61-7eaf-4296-9b09-6ba10e01e7b3"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.521089 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2290c61-7eaf-4296-9b09-6ba10e01e7b3-scripts" (OuterVolumeSpecName: "scripts") pod "d2290c61-7eaf-4296-9b09-6ba10e01e7b3" (UID: "d2290c61-7eaf-4296-9b09-6ba10e01e7b3"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.521962 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2290c61-7eaf-4296-9b09-6ba10e01e7b3-kube-api-access-nvtbj" (OuterVolumeSpecName: "kube-api-access-nvtbj") pod "d2290c61-7eaf-4296-9b09-6ba10e01e7b3" (UID: "d2290c61-7eaf-4296-9b09-6ba10e01e7b3"). InnerVolumeSpecName "kube-api-access-nvtbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.547315 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2290c61-7eaf-4296-9b09-6ba10e01e7b3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d2290c61-7eaf-4296-9b09-6ba10e01e7b3" (UID: "d2290c61-7eaf-4296-9b09-6ba10e01e7b3"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.579045 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2290c61-7eaf-4296-9b09-6ba10e01e7b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d2290c61-7eaf-4296-9b09-6ba10e01e7b3" (UID: "d2290c61-7eaf-4296-9b09-6ba10e01e7b3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.617792 5025 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2290c61-7eaf-4296-9b09-6ba10e01e7b3-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.617829 5025 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2290c61-7eaf-4296-9b09-6ba10e01e7b3-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.617843 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvtbj\" (UniqueName: \"kubernetes.io/projected/d2290c61-7eaf-4296-9b09-6ba10e01e7b3-kube-api-access-nvtbj\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.617854 5025 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2290c61-7eaf-4296-9b09-6ba10e01e7b3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.617868 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2290c61-7eaf-4296-9b09-6ba10e01e7b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.629613 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2290c61-7eaf-4296-9b09-6ba10e01e7b3-config-data" (OuterVolumeSpecName: "config-data") pod "d2290c61-7eaf-4296-9b09-6ba10e01e7b3" (UID: "d2290c61-7eaf-4296-9b09-6ba10e01e7b3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.703440 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"988f0f87-7a01-4aac-a874-8e885b2f83cb","Type":"ContainerStarted","Data":"4055606214f6b158a47c0d68c52aea2b1ad07643ccaa8ad47dd496cc1fca1a1e"} Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.705861 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"39a4256e-db59-4623-aefb-bad0c1412bf1","Type":"ContainerStarted","Data":"40bc944b02028fd2ab780bbc8f7785ef5771a301ce586feafccde56e9109c707"} Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.719002 5025 generic.go:334] "Generic (PLEG): container finished" podID="d2290c61-7eaf-4296-9b09-6ba10e01e7b3" containerID="cc2dafceb9e5b3f1801d94afe60e385996fc19566090d1fe48c4092c27123869" exitCode=0 Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.719068 5025 generic.go:334] "Generic (PLEG): container finished" podID="d2290c61-7eaf-4296-9b09-6ba10e01e7b3" containerID="0521661c8b58c51672306a801782a528dce06023fbe3cb33d42e948bf14a41e1" exitCode=2 Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.719087 5025 generic.go:334] "Generic (PLEG): container finished" podID="d2290c61-7eaf-4296-9b09-6ba10e01e7b3" containerID="b0c7ec64fd15b0457197cc369238cfb8391fea24b78c80bf4e4245d1f9b26ffa" exitCode=0 Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.719106 5025 generic.go:334] "Generic (PLEG): container finished" podID="d2290c61-7eaf-4296-9b09-6ba10e01e7b3" containerID="1da7664930428da5d0724156885a637303be14cdbdcc7fa35cb048675dc5f3d9" exitCode=0 Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.719148 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"d2290c61-7eaf-4296-9b09-6ba10e01e7b3","Type":"ContainerDied","Data":"cc2dafceb9e5b3f1801d94afe60e385996fc19566090d1fe48c4092c27123869"} Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.719200 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2290c61-7eaf-4296-9b09-6ba10e01e7b3","Type":"ContainerDied","Data":"0521661c8b58c51672306a801782a528dce06023fbe3cb33d42e948bf14a41e1"} Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.719225 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2290c61-7eaf-4296-9b09-6ba10e01e7b3","Type":"ContainerDied","Data":"b0c7ec64fd15b0457197cc369238cfb8391fea24b78c80bf4e4245d1f9b26ffa"} Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.719248 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2290c61-7eaf-4296-9b09-6ba10e01e7b3","Type":"ContainerDied","Data":"1da7664930428da5d0724156885a637303be14cdbdcc7fa35cb048675dc5f3d9"} Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.719270 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2290c61-7eaf-4296-9b09-6ba10e01e7b3","Type":"ContainerDied","Data":"ab6f76d9b3c4be04de2979427cc2130c28aeeecaac4063790b5f6ffb9838ebdb"} Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.719301 5025 scope.go:117] "RemoveContainer" containerID="cc2dafceb9e5b3f1801d94afe60e385996fc19566090d1fe48c4092c27123869" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.719629 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.721735 5025 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2290c61-7eaf-4296-9b09-6ba10e01e7b3-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.741440 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.741414516 podStartE2EDuration="3.741414516s" podCreationTimestamp="2025-10-07 08:36:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:36:15.733586569 +0000 UTC m=+1182.542900713" watchObservedRunningTime="2025-10-07 08:36:15.741414516 +0000 UTC m=+1182.550728700" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.755133 5025 scope.go:117] "RemoveContainer" containerID="0521661c8b58c51672306a801782a528dce06023fbe3cb33d42e948bf14a41e1" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.769286 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.769258803 podStartE2EDuration="3.769258803s" podCreationTimestamp="2025-10-07 08:36:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:36:15.764326198 +0000 UTC m=+1182.573640382" watchObservedRunningTime="2025-10-07 08:36:15.769258803 +0000 UTC m=+1182.578572987" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.788873 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.795928 5025 scope.go:117] "RemoveContainer" containerID="b0c7ec64fd15b0457197cc369238cfb8391fea24b78c80bf4e4245d1f9b26ffa" Oct 07 08:36:15 crc 
kubenswrapper[5025]: I1007 08:36:15.801650 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.813155 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 07 08:36:15 crc kubenswrapper[5025]: E1007 08:36:15.813630 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2290c61-7eaf-4296-9b09-6ba10e01e7b3" containerName="ceilometer-central-agent" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.813648 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2290c61-7eaf-4296-9b09-6ba10e01e7b3" containerName="ceilometer-central-agent" Oct 07 08:36:15 crc kubenswrapper[5025]: E1007 08:36:15.813674 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2290c61-7eaf-4296-9b09-6ba10e01e7b3" containerName="proxy-httpd" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.813683 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2290c61-7eaf-4296-9b09-6ba10e01e7b3" containerName="proxy-httpd" Oct 07 08:36:15 crc kubenswrapper[5025]: E1007 08:36:15.813708 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2290c61-7eaf-4296-9b09-6ba10e01e7b3" containerName="sg-core" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.813716 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2290c61-7eaf-4296-9b09-6ba10e01e7b3" containerName="sg-core" Oct 07 08:36:15 crc kubenswrapper[5025]: E1007 08:36:15.813731 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2290c61-7eaf-4296-9b09-6ba10e01e7b3" containerName="ceilometer-notification-agent" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.813739 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2290c61-7eaf-4296-9b09-6ba10e01e7b3" containerName="ceilometer-notification-agent" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.813969 5025 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="d2290c61-7eaf-4296-9b09-6ba10e01e7b3" containerName="sg-core" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.813987 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2290c61-7eaf-4296-9b09-6ba10e01e7b3" containerName="ceilometer-notification-agent" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.813999 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2290c61-7eaf-4296-9b09-6ba10e01e7b3" containerName="ceilometer-central-agent" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.814022 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2290c61-7eaf-4296-9b09-6ba10e01e7b3" containerName="proxy-httpd" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.817981 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.820712 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.821065 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.824390 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.826787 5025 scope.go:117] "RemoveContainer" containerID="1da7664930428da5d0724156885a637303be14cdbdcc7fa35cb048675dc5f3d9" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.862495 5025 scope.go:117] "RemoveContainer" containerID="cc2dafceb9e5b3f1801d94afe60e385996fc19566090d1fe48c4092c27123869" Oct 07 08:36:15 crc kubenswrapper[5025]: E1007 08:36:15.863016 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc2dafceb9e5b3f1801d94afe60e385996fc19566090d1fe48c4092c27123869\": container with ID 
starting with cc2dafceb9e5b3f1801d94afe60e385996fc19566090d1fe48c4092c27123869 not found: ID does not exist" containerID="cc2dafceb9e5b3f1801d94afe60e385996fc19566090d1fe48c4092c27123869" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.863065 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc2dafceb9e5b3f1801d94afe60e385996fc19566090d1fe48c4092c27123869"} err="failed to get container status \"cc2dafceb9e5b3f1801d94afe60e385996fc19566090d1fe48c4092c27123869\": rpc error: code = NotFound desc = could not find container \"cc2dafceb9e5b3f1801d94afe60e385996fc19566090d1fe48c4092c27123869\": container with ID starting with cc2dafceb9e5b3f1801d94afe60e385996fc19566090d1fe48c4092c27123869 not found: ID does not exist" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.863095 5025 scope.go:117] "RemoveContainer" containerID="0521661c8b58c51672306a801782a528dce06023fbe3cb33d42e948bf14a41e1" Oct 07 08:36:15 crc kubenswrapper[5025]: E1007 08:36:15.863438 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0521661c8b58c51672306a801782a528dce06023fbe3cb33d42e948bf14a41e1\": container with ID starting with 0521661c8b58c51672306a801782a528dce06023fbe3cb33d42e948bf14a41e1 not found: ID does not exist" containerID="0521661c8b58c51672306a801782a528dce06023fbe3cb33d42e948bf14a41e1" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.863470 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0521661c8b58c51672306a801782a528dce06023fbe3cb33d42e948bf14a41e1"} err="failed to get container status \"0521661c8b58c51672306a801782a528dce06023fbe3cb33d42e948bf14a41e1\": rpc error: code = NotFound desc = could not find container \"0521661c8b58c51672306a801782a528dce06023fbe3cb33d42e948bf14a41e1\": container with ID starting with 0521661c8b58c51672306a801782a528dce06023fbe3cb33d42e948bf14a41e1 not found: 
ID does not exist" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.863492 5025 scope.go:117] "RemoveContainer" containerID="b0c7ec64fd15b0457197cc369238cfb8391fea24b78c80bf4e4245d1f9b26ffa" Oct 07 08:36:15 crc kubenswrapper[5025]: E1007 08:36:15.863834 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0c7ec64fd15b0457197cc369238cfb8391fea24b78c80bf4e4245d1f9b26ffa\": container with ID starting with b0c7ec64fd15b0457197cc369238cfb8391fea24b78c80bf4e4245d1f9b26ffa not found: ID does not exist" containerID="b0c7ec64fd15b0457197cc369238cfb8391fea24b78c80bf4e4245d1f9b26ffa" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.863863 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0c7ec64fd15b0457197cc369238cfb8391fea24b78c80bf4e4245d1f9b26ffa"} err="failed to get container status \"b0c7ec64fd15b0457197cc369238cfb8391fea24b78c80bf4e4245d1f9b26ffa\": rpc error: code = NotFound desc = could not find container \"b0c7ec64fd15b0457197cc369238cfb8391fea24b78c80bf4e4245d1f9b26ffa\": container with ID starting with b0c7ec64fd15b0457197cc369238cfb8391fea24b78c80bf4e4245d1f9b26ffa not found: ID does not exist" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.863880 5025 scope.go:117] "RemoveContainer" containerID="1da7664930428da5d0724156885a637303be14cdbdcc7fa35cb048675dc5f3d9" Oct 07 08:36:15 crc kubenswrapper[5025]: E1007 08:36:15.864138 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1da7664930428da5d0724156885a637303be14cdbdcc7fa35cb048675dc5f3d9\": container with ID starting with 1da7664930428da5d0724156885a637303be14cdbdcc7fa35cb048675dc5f3d9 not found: ID does not exist" containerID="1da7664930428da5d0724156885a637303be14cdbdcc7fa35cb048675dc5f3d9" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.864172 5025 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1da7664930428da5d0724156885a637303be14cdbdcc7fa35cb048675dc5f3d9"} err="failed to get container status \"1da7664930428da5d0724156885a637303be14cdbdcc7fa35cb048675dc5f3d9\": rpc error: code = NotFound desc = could not find container \"1da7664930428da5d0724156885a637303be14cdbdcc7fa35cb048675dc5f3d9\": container with ID starting with 1da7664930428da5d0724156885a637303be14cdbdcc7fa35cb048675dc5f3d9 not found: ID does not exist" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.864187 5025 scope.go:117] "RemoveContainer" containerID="cc2dafceb9e5b3f1801d94afe60e385996fc19566090d1fe48c4092c27123869" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.864453 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc2dafceb9e5b3f1801d94afe60e385996fc19566090d1fe48c4092c27123869"} err="failed to get container status \"cc2dafceb9e5b3f1801d94afe60e385996fc19566090d1fe48c4092c27123869\": rpc error: code = NotFound desc = could not find container \"cc2dafceb9e5b3f1801d94afe60e385996fc19566090d1fe48c4092c27123869\": container with ID starting with cc2dafceb9e5b3f1801d94afe60e385996fc19566090d1fe48c4092c27123869 not found: ID does not exist" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.864473 5025 scope.go:117] "RemoveContainer" containerID="0521661c8b58c51672306a801782a528dce06023fbe3cb33d42e948bf14a41e1" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.864740 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0521661c8b58c51672306a801782a528dce06023fbe3cb33d42e948bf14a41e1"} err="failed to get container status \"0521661c8b58c51672306a801782a528dce06023fbe3cb33d42e948bf14a41e1\": rpc error: code = NotFound desc = could not find container \"0521661c8b58c51672306a801782a528dce06023fbe3cb33d42e948bf14a41e1\": container with ID starting with 
0521661c8b58c51672306a801782a528dce06023fbe3cb33d42e948bf14a41e1 not found: ID does not exist" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.864760 5025 scope.go:117] "RemoveContainer" containerID="b0c7ec64fd15b0457197cc369238cfb8391fea24b78c80bf4e4245d1f9b26ffa" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.865002 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0c7ec64fd15b0457197cc369238cfb8391fea24b78c80bf4e4245d1f9b26ffa"} err="failed to get container status \"b0c7ec64fd15b0457197cc369238cfb8391fea24b78c80bf4e4245d1f9b26ffa\": rpc error: code = NotFound desc = could not find container \"b0c7ec64fd15b0457197cc369238cfb8391fea24b78c80bf4e4245d1f9b26ffa\": container with ID starting with b0c7ec64fd15b0457197cc369238cfb8391fea24b78c80bf4e4245d1f9b26ffa not found: ID does not exist" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.865025 5025 scope.go:117] "RemoveContainer" containerID="1da7664930428da5d0724156885a637303be14cdbdcc7fa35cb048675dc5f3d9" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.865280 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1da7664930428da5d0724156885a637303be14cdbdcc7fa35cb048675dc5f3d9"} err="failed to get container status \"1da7664930428da5d0724156885a637303be14cdbdcc7fa35cb048675dc5f3d9\": rpc error: code = NotFound desc = could not find container \"1da7664930428da5d0724156885a637303be14cdbdcc7fa35cb048675dc5f3d9\": container with ID starting with 1da7664930428da5d0724156885a637303be14cdbdcc7fa35cb048675dc5f3d9 not found: ID does not exist" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.865305 5025 scope.go:117] "RemoveContainer" containerID="cc2dafceb9e5b3f1801d94afe60e385996fc19566090d1fe48c4092c27123869" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.865503 5025 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cc2dafceb9e5b3f1801d94afe60e385996fc19566090d1fe48c4092c27123869"} err="failed to get container status \"cc2dafceb9e5b3f1801d94afe60e385996fc19566090d1fe48c4092c27123869\": rpc error: code = NotFound desc = could not find container \"cc2dafceb9e5b3f1801d94afe60e385996fc19566090d1fe48c4092c27123869\": container with ID starting with cc2dafceb9e5b3f1801d94afe60e385996fc19566090d1fe48c4092c27123869 not found: ID does not exist" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.865526 5025 scope.go:117] "RemoveContainer" containerID="0521661c8b58c51672306a801782a528dce06023fbe3cb33d42e948bf14a41e1" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.865743 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0521661c8b58c51672306a801782a528dce06023fbe3cb33d42e948bf14a41e1"} err="failed to get container status \"0521661c8b58c51672306a801782a528dce06023fbe3cb33d42e948bf14a41e1\": rpc error: code = NotFound desc = could not find container \"0521661c8b58c51672306a801782a528dce06023fbe3cb33d42e948bf14a41e1\": container with ID starting with 0521661c8b58c51672306a801782a528dce06023fbe3cb33d42e948bf14a41e1 not found: ID does not exist" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.865784 5025 scope.go:117] "RemoveContainer" containerID="b0c7ec64fd15b0457197cc369238cfb8391fea24b78c80bf4e4245d1f9b26ffa" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.866014 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0c7ec64fd15b0457197cc369238cfb8391fea24b78c80bf4e4245d1f9b26ffa"} err="failed to get container status \"b0c7ec64fd15b0457197cc369238cfb8391fea24b78c80bf4e4245d1f9b26ffa\": rpc error: code = NotFound desc = could not find container \"b0c7ec64fd15b0457197cc369238cfb8391fea24b78c80bf4e4245d1f9b26ffa\": container with ID starting with b0c7ec64fd15b0457197cc369238cfb8391fea24b78c80bf4e4245d1f9b26ffa not found: ID does not 
exist" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.866038 5025 scope.go:117] "RemoveContainer" containerID="1da7664930428da5d0724156885a637303be14cdbdcc7fa35cb048675dc5f3d9" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.866296 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1da7664930428da5d0724156885a637303be14cdbdcc7fa35cb048675dc5f3d9"} err="failed to get container status \"1da7664930428da5d0724156885a637303be14cdbdcc7fa35cb048675dc5f3d9\": rpc error: code = NotFound desc = could not find container \"1da7664930428da5d0724156885a637303be14cdbdcc7fa35cb048675dc5f3d9\": container with ID starting with 1da7664930428da5d0724156885a637303be14cdbdcc7fa35cb048675dc5f3d9 not found: ID does not exist" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.866322 5025 scope.go:117] "RemoveContainer" containerID="cc2dafceb9e5b3f1801d94afe60e385996fc19566090d1fe48c4092c27123869" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.866740 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc2dafceb9e5b3f1801d94afe60e385996fc19566090d1fe48c4092c27123869"} err="failed to get container status \"cc2dafceb9e5b3f1801d94afe60e385996fc19566090d1fe48c4092c27123869\": rpc error: code = NotFound desc = could not find container \"cc2dafceb9e5b3f1801d94afe60e385996fc19566090d1fe48c4092c27123869\": container with ID starting with cc2dafceb9e5b3f1801d94afe60e385996fc19566090d1fe48c4092c27123869 not found: ID does not exist" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.866764 5025 scope.go:117] "RemoveContainer" containerID="0521661c8b58c51672306a801782a528dce06023fbe3cb33d42e948bf14a41e1" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.867008 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0521661c8b58c51672306a801782a528dce06023fbe3cb33d42e948bf14a41e1"} err="failed to get container status 
\"0521661c8b58c51672306a801782a528dce06023fbe3cb33d42e948bf14a41e1\": rpc error: code = NotFound desc = could not find container \"0521661c8b58c51672306a801782a528dce06023fbe3cb33d42e948bf14a41e1\": container with ID starting with 0521661c8b58c51672306a801782a528dce06023fbe3cb33d42e948bf14a41e1 not found: ID does not exist" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.867059 5025 scope.go:117] "RemoveContainer" containerID="b0c7ec64fd15b0457197cc369238cfb8391fea24b78c80bf4e4245d1f9b26ffa" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.867236 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0c7ec64fd15b0457197cc369238cfb8391fea24b78c80bf4e4245d1f9b26ffa"} err="failed to get container status \"b0c7ec64fd15b0457197cc369238cfb8391fea24b78c80bf4e4245d1f9b26ffa\": rpc error: code = NotFound desc = could not find container \"b0c7ec64fd15b0457197cc369238cfb8391fea24b78c80bf4e4245d1f9b26ffa\": container with ID starting with b0c7ec64fd15b0457197cc369238cfb8391fea24b78c80bf4e4245d1f9b26ffa not found: ID does not exist" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.867259 5025 scope.go:117] "RemoveContainer" containerID="1da7664930428da5d0724156885a637303be14cdbdcc7fa35cb048675dc5f3d9" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.867488 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1da7664930428da5d0724156885a637303be14cdbdcc7fa35cb048675dc5f3d9"} err="failed to get container status \"1da7664930428da5d0724156885a637303be14cdbdcc7fa35cb048675dc5f3d9\": rpc error: code = NotFound desc = could not find container \"1da7664930428da5d0724156885a637303be14cdbdcc7fa35cb048675dc5f3d9\": container with ID starting with 1da7664930428da5d0724156885a637303be14cdbdcc7fa35cb048675dc5f3d9 not found: ID does not exist" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.925129 5025 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b83b7d6-5d79-41e1-9c95-555d7bbef815-log-httpd\") pod \"ceilometer-0\" (UID: \"6b83b7d6-5d79-41e1-9c95-555d7bbef815\") " pod="openstack/ceilometer-0" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.925201 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b83b7d6-5d79-41e1-9c95-555d7bbef815-scripts\") pod \"ceilometer-0\" (UID: \"6b83b7d6-5d79-41e1-9c95-555d7bbef815\") " pod="openstack/ceilometer-0" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.925236 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvbh4\" (UniqueName: \"kubernetes.io/projected/6b83b7d6-5d79-41e1-9c95-555d7bbef815-kube-api-access-bvbh4\") pod \"ceilometer-0\" (UID: \"6b83b7d6-5d79-41e1-9c95-555d7bbef815\") " pod="openstack/ceilometer-0" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.925338 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b83b7d6-5d79-41e1-9c95-555d7bbef815-run-httpd\") pod \"ceilometer-0\" (UID: \"6b83b7d6-5d79-41e1-9c95-555d7bbef815\") " pod="openstack/ceilometer-0" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.925358 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6b83b7d6-5d79-41e1-9c95-555d7bbef815-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6b83b7d6-5d79-41e1-9c95-555d7bbef815\") " pod="openstack/ceilometer-0" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.925506 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6b83b7d6-5d79-41e1-9c95-555d7bbef815-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6b83b7d6-5d79-41e1-9c95-555d7bbef815\") " pod="openstack/ceilometer-0" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.925535 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b83b7d6-5d79-41e1-9c95-555d7bbef815-config-data\") pod \"ceilometer-0\" (UID: \"6b83b7d6-5d79-41e1-9c95-555d7bbef815\") " pod="openstack/ceilometer-0" Oct 07 08:36:15 crc kubenswrapper[5025]: I1007 08:36:15.928179 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2290c61-7eaf-4296-9b09-6ba10e01e7b3" path="/var/lib/kubelet/pods/d2290c61-7eaf-4296-9b09-6ba10e01e7b3/volumes" Oct 07 08:36:16 crc kubenswrapper[5025]: I1007 08:36:16.026894 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b83b7d6-5d79-41e1-9c95-555d7bbef815-config-data\") pod \"ceilometer-0\" (UID: \"6b83b7d6-5d79-41e1-9c95-555d7bbef815\") " pod="openstack/ceilometer-0" Oct 07 08:36:16 crc kubenswrapper[5025]: I1007 08:36:16.026978 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b83b7d6-5d79-41e1-9c95-555d7bbef815-log-httpd\") pod \"ceilometer-0\" (UID: \"6b83b7d6-5d79-41e1-9c95-555d7bbef815\") " pod="openstack/ceilometer-0" Oct 07 08:36:16 crc kubenswrapper[5025]: I1007 08:36:16.027026 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b83b7d6-5d79-41e1-9c95-555d7bbef815-scripts\") pod \"ceilometer-0\" (UID: \"6b83b7d6-5d79-41e1-9c95-555d7bbef815\") " pod="openstack/ceilometer-0" Oct 07 08:36:16 crc kubenswrapper[5025]: I1007 08:36:16.027052 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-bvbh4\" (UniqueName: \"kubernetes.io/projected/6b83b7d6-5d79-41e1-9c95-555d7bbef815-kube-api-access-bvbh4\") pod \"ceilometer-0\" (UID: \"6b83b7d6-5d79-41e1-9c95-555d7bbef815\") " pod="openstack/ceilometer-0" Oct 07 08:36:16 crc kubenswrapper[5025]: I1007 08:36:16.027102 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b83b7d6-5d79-41e1-9c95-555d7bbef815-run-httpd\") pod \"ceilometer-0\" (UID: \"6b83b7d6-5d79-41e1-9c95-555d7bbef815\") " pod="openstack/ceilometer-0" Oct 07 08:36:16 crc kubenswrapper[5025]: I1007 08:36:16.027121 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6b83b7d6-5d79-41e1-9c95-555d7bbef815-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6b83b7d6-5d79-41e1-9c95-555d7bbef815\") " pod="openstack/ceilometer-0" Oct 07 08:36:16 crc kubenswrapper[5025]: I1007 08:36:16.027172 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b83b7d6-5d79-41e1-9c95-555d7bbef815-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6b83b7d6-5d79-41e1-9c95-555d7bbef815\") " pod="openstack/ceilometer-0" Oct 07 08:36:16 crc kubenswrapper[5025]: I1007 08:36:16.028247 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b83b7d6-5d79-41e1-9c95-555d7bbef815-log-httpd\") pod \"ceilometer-0\" (UID: \"6b83b7d6-5d79-41e1-9c95-555d7bbef815\") " pod="openstack/ceilometer-0" Oct 07 08:36:16 crc kubenswrapper[5025]: I1007 08:36:16.028299 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b83b7d6-5d79-41e1-9c95-555d7bbef815-run-httpd\") pod \"ceilometer-0\" (UID: \"6b83b7d6-5d79-41e1-9c95-555d7bbef815\") " pod="openstack/ceilometer-0" Oct 07 08:36:16 crc 
kubenswrapper[5025]: I1007 08:36:16.032013 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6b83b7d6-5d79-41e1-9c95-555d7bbef815-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6b83b7d6-5d79-41e1-9c95-555d7bbef815\") " pod="openstack/ceilometer-0" Oct 07 08:36:16 crc kubenswrapper[5025]: I1007 08:36:16.032866 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b83b7d6-5d79-41e1-9c95-555d7bbef815-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6b83b7d6-5d79-41e1-9c95-555d7bbef815\") " pod="openstack/ceilometer-0" Oct 07 08:36:16 crc kubenswrapper[5025]: I1007 08:36:16.033059 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b83b7d6-5d79-41e1-9c95-555d7bbef815-config-data\") pod \"ceilometer-0\" (UID: \"6b83b7d6-5d79-41e1-9c95-555d7bbef815\") " pod="openstack/ceilometer-0" Oct 07 08:36:16 crc kubenswrapper[5025]: I1007 08:36:16.040128 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b83b7d6-5d79-41e1-9c95-555d7bbef815-scripts\") pod \"ceilometer-0\" (UID: \"6b83b7d6-5d79-41e1-9c95-555d7bbef815\") " pod="openstack/ceilometer-0" Oct 07 08:36:16 crc kubenswrapper[5025]: I1007 08:36:16.044635 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvbh4\" (UniqueName: \"kubernetes.io/projected/6b83b7d6-5d79-41e1-9c95-555d7bbef815-kube-api-access-bvbh4\") pod \"ceilometer-0\" (UID: \"6b83b7d6-5d79-41e1-9c95-555d7bbef815\") " pod="openstack/ceilometer-0" Oct 07 08:36:16 crc kubenswrapper[5025]: I1007 08:36:16.139691 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 08:36:16 crc kubenswrapper[5025]: I1007 08:36:16.575620 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 08:36:16 crc kubenswrapper[5025]: W1007 08:36:16.580998 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b83b7d6_5d79_41e1_9c95_555d7bbef815.slice/crio-916e7c9ca564ec26075792693629c0d25a9ff3146f828656d4b2834f0099be0b WatchSource:0}: Error finding container 916e7c9ca564ec26075792693629c0d25a9ff3146f828656d4b2834f0099be0b: Status 404 returned error can't find the container with id 916e7c9ca564ec26075792693629c0d25a9ff3146f828656d4b2834f0099be0b Oct 07 08:36:16 crc kubenswrapper[5025]: I1007 08:36:16.737028 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b83b7d6-5d79-41e1-9c95-555d7bbef815","Type":"ContainerStarted","Data":"916e7c9ca564ec26075792693629c0d25a9ff3146f828656d4b2834f0099be0b"} Oct 07 08:36:17 crc kubenswrapper[5025]: I1007 08:36:17.754020 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b83b7d6-5d79-41e1-9c95-555d7bbef815","Type":"ContainerStarted","Data":"cc237635d00ad9e0a790b76dd95c965f8e0d885520cc51472c85d97b6c821f6b"} Oct 07 08:36:18 crc kubenswrapper[5025]: I1007 08:36:18.763696 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b83b7d6-5d79-41e1-9c95-555d7bbef815","Type":"ContainerStarted","Data":"35d788691eaf6fda609dd639420adfaf4c89b7b6bff7fc549bcfaf92048a81ba"} Oct 07 08:36:19 crc kubenswrapper[5025]: I1007 08:36:19.776119 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b83b7d6-5d79-41e1-9c95-555d7bbef815","Type":"ContainerStarted","Data":"ab1dd94d7eba0c12be53b0445a4a5179105e5685deead257e4e717e9f31175d0"} Oct 07 08:36:20 crc kubenswrapper[5025]: I1007 
08:36:20.788922 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b83b7d6-5d79-41e1-9c95-555d7bbef815","Type":"ContainerStarted","Data":"000033ec05e66e7adf5b84812f5d428fb36ae5b59275f223f7b9bebc0182fb3a"} Oct 07 08:36:20 crc kubenswrapper[5025]: I1007 08:36:20.789318 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 07 08:36:22 crc kubenswrapper[5025]: I1007 08:36:22.815962 5025 generic.go:334] "Generic (PLEG): container finished" podID="a910c5fd-a686-4b9f-b1bd-6cdcf4641013" containerID="1658af060071457045658f1a0669689a5e3ecc8ca296630d6d86cf21a5155bc2" exitCode=0 Oct 07 08:36:22 crc kubenswrapper[5025]: I1007 08:36:22.816055 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-pw82v" event={"ID":"a910c5fd-a686-4b9f-b1bd-6cdcf4641013","Type":"ContainerDied","Data":"1658af060071457045658f1a0669689a5e3ecc8ca296630d6d86cf21a5155bc2"} Oct 07 08:36:22 crc kubenswrapper[5025]: I1007 08:36:22.846446 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.41924893 podStartE2EDuration="7.846424606s" podCreationTimestamp="2025-10-07 08:36:15 +0000 UTC" firstStartedPulling="2025-10-07 08:36:16.584457577 +0000 UTC m=+1183.393771731" lastFinishedPulling="2025-10-07 08:36:20.011633233 +0000 UTC m=+1186.820947407" observedRunningTime="2025-10-07 08:36:20.82567364 +0000 UTC m=+1187.634987794" watchObservedRunningTime="2025-10-07 08:36:22.846424606 +0000 UTC m=+1189.655738760" Oct 07 08:36:23 crc kubenswrapper[5025]: I1007 08:36:23.210225 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 07 08:36:23 crc kubenswrapper[5025]: I1007 08:36:23.210891 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 07 08:36:23 crc 
kubenswrapper[5025]: I1007 08:36:23.265098 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 07 08:36:23 crc kubenswrapper[5025]: I1007 08:36:23.293919 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 07 08:36:23 crc kubenswrapper[5025]: I1007 08:36:23.351828 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 07 08:36:23 crc kubenswrapper[5025]: I1007 08:36:23.351897 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 07 08:36:23 crc kubenswrapper[5025]: I1007 08:36:23.381808 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 07 08:36:23 crc kubenswrapper[5025]: I1007 08:36:23.411349 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 07 08:36:23 crc kubenswrapper[5025]: I1007 08:36:23.831073 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 07 08:36:23 crc kubenswrapper[5025]: I1007 08:36:23.831121 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 07 08:36:23 crc kubenswrapper[5025]: I1007 08:36:23.831137 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 07 08:36:23 crc kubenswrapper[5025]: I1007 08:36:23.831148 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 07 08:36:24 crc kubenswrapper[5025]: I1007 08:36:24.200243 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-pw82v" Oct 07 08:36:24 crc kubenswrapper[5025]: I1007 08:36:24.322291 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a910c5fd-a686-4b9f-b1bd-6cdcf4641013-scripts\") pod \"a910c5fd-a686-4b9f-b1bd-6cdcf4641013\" (UID: \"a910c5fd-a686-4b9f-b1bd-6cdcf4641013\") " Oct 07 08:36:24 crc kubenswrapper[5025]: I1007 08:36:24.322749 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkl5l\" (UniqueName: \"kubernetes.io/projected/a910c5fd-a686-4b9f-b1bd-6cdcf4641013-kube-api-access-pkl5l\") pod \"a910c5fd-a686-4b9f-b1bd-6cdcf4641013\" (UID: \"a910c5fd-a686-4b9f-b1bd-6cdcf4641013\") " Oct 07 08:36:24 crc kubenswrapper[5025]: I1007 08:36:24.322811 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a910c5fd-a686-4b9f-b1bd-6cdcf4641013-combined-ca-bundle\") pod \"a910c5fd-a686-4b9f-b1bd-6cdcf4641013\" (UID: \"a910c5fd-a686-4b9f-b1bd-6cdcf4641013\") " Oct 07 08:36:24 crc kubenswrapper[5025]: I1007 08:36:24.323003 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a910c5fd-a686-4b9f-b1bd-6cdcf4641013-config-data\") pod \"a910c5fd-a686-4b9f-b1bd-6cdcf4641013\" (UID: \"a910c5fd-a686-4b9f-b1bd-6cdcf4641013\") " Oct 07 08:36:24 crc kubenswrapper[5025]: I1007 08:36:24.339822 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a910c5fd-a686-4b9f-b1bd-6cdcf4641013-scripts" (OuterVolumeSpecName: "scripts") pod "a910c5fd-a686-4b9f-b1bd-6cdcf4641013" (UID: "a910c5fd-a686-4b9f-b1bd-6cdcf4641013"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:36:24 crc kubenswrapper[5025]: I1007 08:36:24.340013 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a910c5fd-a686-4b9f-b1bd-6cdcf4641013-kube-api-access-pkl5l" (OuterVolumeSpecName: "kube-api-access-pkl5l") pod "a910c5fd-a686-4b9f-b1bd-6cdcf4641013" (UID: "a910c5fd-a686-4b9f-b1bd-6cdcf4641013"). InnerVolumeSpecName "kube-api-access-pkl5l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:36:24 crc kubenswrapper[5025]: I1007 08:36:24.358091 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a910c5fd-a686-4b9f-b1bd-6cdcf4641013-config-data" (OuterVolumeSpecName: "config-data") pod "a910c5fd-a686-4b9f-b1bd-6cdcf4641013" (UID: "a910c5fd-a686-4b9f-b1bd-6cdcf4641013"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:36:24 crc kubenswrapper[5025]: I1007 08:36:24.363876 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a910c5fd-a686-4b9f-b1bd-6cdcf4641013-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a910c5fd-a686-4b9f-b1bd-6cdcf4641013" (UID: "a910c5fd-a686-4b9f-b1bd-6cdcf4641013"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:36:24 crc kubenswrapper[5025]: I1007 08:36:24.425190 5025 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a910c5fd-a686-4b9f-b1bd-6cdcf4641013-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:24 crc kubenswrapper[5025]: I1007 08:36:24.425232 5025 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a910c5fd-a686-4b9f-b1bd-6cdcf4641013-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:24 crc kubenswrapper[5025]: I1007 08:36:24.425247 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkl5l\" (UniqueName: \"kubernetes.io/projected/a910c5fd-a686-4b9f-b1bd-6cdcf4641013-kube-api-access-pkl5l\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:24 crc kubenswrapper[5025]: I1007 08:36:24.425261 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a910c5fd-a686-4b9f-b1bd-6cdcf4641013-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:24 crc kubenswrapper[5025]: I1007 08:36:24.848448 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-pw82v" event={"ID":"a910c5fd-a686-4b9f-b1bd-6cdcf4641013","Type":"ContainerDied","Data":"419ac94e7f7340b9288d5527e55ba06bcbf1cb3d500a77026ce5a9f1bd807878"} Oct 07 08:36:24 crc kubenswrapper[5025]: I1007 08:36:24.849838 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="419ac94e7f7340b9288d5527e55ba06bcbf1cb3d500a77026ce5a9f1bd807878" Oct 07 08:36:24 crc kubenswrapper[5025]: I1007 08:36:24.848867 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-pw82v" Oct 07 08:36:24 crc kubenswrapper[5025]: I1007 08:36:24.970275 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 07 08:36:24 crc kubenswrapper[5025]: E1007 08:36:24.970877 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a910c5fd-a686-4b9f-b1bd-6cdcf4641013" containerName="nova-cell0-conductor-db-sync" Oct 07 08:36:24 crc kubenswrapper[5025]: I1007 08:36:24.970898 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="a910c5fd-a686-4b9f-b1bd-6cdcf4641013" containerName="nova-cell0-conductor-db-sync" Oct 07 08:36:24 crc kubenswrapper[5025]: I1007 08:36:24.971094 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="a910c5fd-a686-4b9f-b1bd-6cdcf4641013" containerName="nova-cell0-conductor-db-sync" Oct 07 08:36:24 crc kubenswrapper[5025]: I1007 08:36:24.971908 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 07 08:36:24 crc kubenswrapper[5025]: I1007 08:36:24.978175 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 07 08:36:24 crc kubenswrapper[5025]: I1007 08:36:24.978249 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-v44f9" Oct 07 08:36:24 crc kubenswrapper[5025]: I1007 08:36:24.990714 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 07 08:36:25 crc kubenswrapper[5025]: I1007 08:36:25.036315 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/073c6d0a-3fea-4e0b-8f8f-f58613b4bf23-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"073c6d0a-3fea-4e0b-8f8f-f58613b4bf23\") " pod="openstack/nova-cell0-conductor-0" Oct 07 08:36:25 crc kubenswrapper[5025]: 
I1007 08:36:25.036374 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxzgb\" (UniqueName: \"kubernetes.io/projected/073c6d0a-3fea-4e0b-8f8f-f58613b4bf23-kube-api-access-fxzgb\") pod \"nova-cell0-conductor-0\" (UID: \"073c6d0a-3fea-4e0b-8f8f-f58613b4bf23\") " pod="openstack/nova-cell0-conductor-0" Oct 07 08:36:25 crc kubenswrapper[5025]: I1007 08:36:25.037901 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/073c6d0a-3fea-4e0b-8f8f-f58613b4bf23-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"073c6d0a-3fea-4e0b-8f8f-f58613b4bf23\") " pod="openstack/nova-cell0-conductor-0" Oct 07 08:36:25 crc kubenswrapper[5025]: I1007 08:36:25.139356 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/073c6d0a-3fea-4e0b-8f8f-f58613b4bf23-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"073c6d0a-3fea-4e0b-8f8f-f58613b4bf23\") " pod="openstack/nova-cell0-conductor-0" Oct 07 08:36:25 crc kubenswrapper[5025]: I1007 08:36:25.139407 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxzgb\" (UniqueName: \"kubernetes.io/projected/073c6d0a-3fea-4e0b-8f8f-f58613b4bf23-kube-api-access-fxzgb\") pod \"nova-cell0-conductor-0\" (UID: \"073c6d0a-3fea-4e0b-8f8f-f58613b4bf23\") " pod="openstack/nova-cell0-conductor-0" Oct 07 08:36:25 crc kubenswrapper[5025]: I1007 08:36:25.139583 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/073c6d0a-3fea-4e0b-8f8f-f58613b4bf23-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"073c6d0a-3fea-4e0b-8f8f-f58613b4bf23\") " pod="openstack/nova-cell0-conductor-0" Oct 07 08:36:25 crc kubenswrapper[5025]: I1007 08:36:25.146178 5025 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/073c6d0a-3fea-4e0b-8f8f-f58613b4bf23-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"073c6d0a-3fea-4e0b-8f8f-f58613b4bf23\") " pod="openstack/nova-cell0-conductor-0" Oct 07 08:36:25 crc kubenswrapper[5025]: I1007 08:36:25.157907 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/073c6d0a-3fea-4e0b-8f8f-f58613b4bf23-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"073c6d0a-3fea-4e0b-8f8f-f58613b4bf23\") " pod="openstack/nova-cell0-conductor-0" Oct 07 08:36:25 crc kubenswrapper[5025]: I1007 08:36:25.161617 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxzgb\" (UniqueName: \"kubernetes.io/projected/073c6d0a-3fea-4e0b-8f8f-f58613b4bf23-kube-api-access-fxzgb\") pod \"nova-cell0-conductor-0\" (UID: \"073c6d0a-3fea-4e0b-8f8f-f58613b4bf23\") " pod="openstack/nova-cell0-conductor-0" Oct 07 08:36:25 crc kubenswrapper[5025]: I1007 08:36:25.295722 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 07 08:36:25 crc kubenswrapper[5025]: I1007 08:36:25.812155 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 07 08:36:25 crc kubenswrapper[5025]: W1007 08:36:25.814712 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod073c6d0a_3fea_4e0b_8f8f_f58613b4bf23.slice/crio-ca6bec0365e791f9961afde97c8eb9e181b2043eded1992f8ad467c13e2a43ac WatchSource:0}: Error finding container ca6bec0365e791f9961afde97c8eb9e181b2043eded1992f8ad467c13e2a43ac: Status 404 returned error can't find the container with id ca6bec0365e791f9961afde97c8eb9e181b2043eded1992f8ad467c13e2a43ac Oct 07 08:36:25 crc kubenswrapper[5025]: I1007 08:36:25.858704 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"073c6d0a-3fea-4e0b-8f8f-f58613b4bf23","Type":"ContainerStarted","Data":"ca6bec0365e791f9961afde97c8eb9e181b2043eded1992f8ad467c13e2a43ac"} Oct 07 08:36:25 crc kubenswrapper[5025]: I1007 08:36:25.934331 5025 patch_prober.go:28] interesting pod/machine-config-daemon-2dj2t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 08:36:25 crc kubenswrapper[5025]: I1007 08:36:25.934406 5025 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 08:36:25 crc kubenswrapper[5025]: I1007 08:36:25.952755 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/glance-default-external-api-0" Oct 07 08:36:25 crc kubenswrapper[5025]: I1007 08:36:25.952867 5025 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 08:36:25 crc kubenswrapper[5025]: I1007 08:36:25.973812 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 07 08:36:25 crc kubenswrapper[5025]: I1007 08:36:25.973898 5025 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 08:36:26 crc kubenswrapper[5025]: I1007 08:36:26.018604 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 07 08:36:26 crc kubenswrapper[5025]: I1007 08:36:26.080914 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 07 08:36:26 crc kubenswrapper[5025]: I1007 08:36:26.877104 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"073c6d0a-3fea-4e0b-8f8f-f58613b4bf23","Type":"ContainerStarted","Data":"8fadf8cac92c695c60339b695323e8872f1a70c0fa6297fd6b45f17e85f81906"} Oct 07 08:36:26 crc kubenswrapper[5025]: I1007 08:36:26.877154 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 07 08:36:26 crc kubenswrapper[5025]: I1007 08:36:26.905137 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.905112028 podStartE2EDuration="2.905112028s" podCreationTimestamp="2025-10-07 08:36:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:36:26.89662323 +0000 UTC m=+1193.705937374" watchObservedRunningTime="2025-10-07 08:36:26.905112028 +0000 UTC m=+1193.714426172" Oct 07 08:36:30 crc kubenswrapper[5025]: I1007 08:36:30.340980 5025 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 07 08:36:30 crc kubenswrapper[5025]: I1007 08:36:30.877012 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-4mq6r"] Oct 07 08:36:30 crc kubenswrapper[5025]: I1007 08:36:30.878830 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-4mq6r" Oct 07 08:36:30 crc kubenswrapper[5025]: I1007 08:36:30.881000 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 07 08:36:30 crc kubenswrapper[5025]: I1007 08:36:30.881191 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 07 08:36:30 crc kubenswrapper[5025]: I1007 08:36:30.899620 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-4mq6r"] Oct 07 08:36:30 crc kubenswrapper[5025]: I1007 08:36:30.951439 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rdxl\" (UniqueName: \"kubernetes.io/projected/c434c871-bc74-45ae-b5ed-810a796622d9-kube-api-access-7rdxl\") pod \"nova-cell0-cell-mapping-4mq6r\" (UID: \"c434c871-bc74-45ae-b5ed-810a796622d9\") " pod="openstack/nova-cell0-cell-mapping-4mq6r" Oct 07 08:36:30 crc kubenswrapper[5025]: I1007 08:36:30.951525 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c434c871-bc74-45ae-b5ed-810a796622d9-scripts\") pod \"nova-cell0-cell-mapping-4mq6r\" (UID: \"c434c871-bc74-45ae-b5ed-810a796622d9\") " pod="openstack/nova-cell0-cell-mapping-4mq6r" Oct 07 08:36:30 crc kubenswrapper[5025]: I1007 08:36:30.951914 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c434c871-bc74-45ae-b5ed-810a796622d9-config-data\") pod \"nova-cell0-cell-mapping-4mq6r\" (UID: \"c434c871-bc74-45ae-b5ed-810a796622d9\") " pod="openstack/nova-cell0-cell-mapping-4mq6r" Oct 07 08:36:30 crc kubenswrapper[5025]: I1007 08:36:30.952334 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c434c871-bc74-45ae-b5ed-810a796622d9-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-4mq6r\" (UID: \"c434c871-bc74-45ae-b5ed-810a796622d9\") " pod="openstack/nova-cell0-cell-mapping-4mq6r" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.053872 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c434c871-bc74-45ae-b5ed-810a796622d9-scripts\") pod \"nova-cell0-cell-mapping-4mq6r\" (UID: \"c434c871-bc74-45ae-b5ed-810a796622d9\") " pod="openstack/nova-cell0-cell-mapping-4mq6r" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.054275 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c434c871-bc74-45ae-b5ed-810a796622d9-config-data\") pod \"nova-cell0-cell-mapping-4mq6r\" (UID: \"c434c871-bc74-45ae-b5ed-810a796622d9\") " pod="openstack/nova-cell0-cell-mapping-4mq6r" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.054311 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c434c871-bc74-45ae-b5ed-810a796622d9-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-4mq6r\" (UID: \"c434c871-bc74-45ae-b5ed-810a796622d9\") " pod="openstack/nova-cell0-cell-mapping-4mq6r" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.054389 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rdxl\" (UniqueName: 
\"kubernetes.io/projected/c434c871-bc74-45ae-b5ed-810a796622d9-kube-api-access-7rdxl\") pod \"nova-cell0-cell-mapping-4mq6r\" (UID: \"c434c871-bc74-45ae-b5ed-810a796622d9\") " pod="openstack/nova-cell0-cell-mapping-4mq6r" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.066432 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c434c871-bc74-45ae-b5ed-810a796622d9-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-4mq6r\" (UID: \"c434c871-bc74-45ae-b5ed-810a796622d9\") " pod="openstack/nova-cell0-cell-mapping-4mq6r" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.070067 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c434c871-bc74-45ae-b5ed-810a796622d9-config-data\") pod \"nova-cell0-cell-mapping-4mq6r\" (UID: \"c434c871-bc74-45ae-b5ed-810a796622d9\") " pod="openstack/nova-cell0-cell-mapping-4mq6r" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.072066 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c434c871-bc74-45ae-b5ed-810a796622d9-scripts\") pod \"nova-cell0-cell-mapping-4mq6r\" (UID: \"c434c871-bc74-45ae-b5ed-810a796622d9\") " pod="openstack/nova-cell0-cell-mapping-4mq6r" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.082775 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rdxl\" (UniqueName: \"kubernetes.io/projected/c434c871-bc74-45ae-b5ed-810a796622d9-kube-api-access-7rdxl\") pod \"nova-cell0-cell-mapping-4mq6r\" (UID: \"c434c871-bc74-45ae-b5ed-810a796622d9\") " pod="openstack/nova-cell0-cell-mapping-4mq6r" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.084337 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.086422 5025 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.099993 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.113982 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.145499 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.147107 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.154364 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.163428 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzzmf\" (UniqueName: \"kubernetes.io/projected/411bbb25-18d9-4957-afff-2ebcd6866aeb-kube-api-access-hzzmf\") pod \"nova-api-0\" (UID: \"411bbb25-18d9-4957-afff-2ebcd6866aeb\") " pod="openstack/nova-api-0" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.163592 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/411bbb25-18d9-4957-afff-2ebcd6866aeb-config-data\") pod \"nova-api-0\" (UID: \"411bbb25-18d9-4957-afff-2ebcd6866aeb\") " pod="openstack/nova-api-0" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.163813 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/411bbb25-18d9-4957-afff-2ebcd6866aeb-logs\") pod \"nova-api-0\" (UID: \"411bbb25-18d9-4957-afff-2ebcd6866aeb\") " pod="openstack/nova-api-0" Oct 07 08:36:31 crc 
kubenswrapper[5025]: I1007 08:36:31.163933 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/411bbb25-18d9-4957-afff-2ebcd6866aeb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"411bbb25-18d9-4957-afff-2ebcd6866aeb\") " pod="openstack/nova-api-0" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.190015 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.207068 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-4mq6r" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.269183 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/411bbb25-18d9-4957-afff-2ebcd6866aeb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"411bbb25-18d9-4957-afff-2ebcd6866aeb\") " pod="openstack/nova-api-0" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.269261 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/986412ee-2fb7-413a-a1d3-355bfc0ac0ec-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"986412ee-2fb7-413a-a1d3-355bfc0ac0ec\") " pod="openstack/nova-scheduler-0" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.269343 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzzmf\" (UniqueName: \"kubernetes.io/projected/411bbb25-18d9-4957-afff-2ebcd6866aeb-kube-api-access-hzzmf\") pod \"nova-api-0\" (UID: \"411bbb25-18d9-4957-afff-2ebcd6866aeb\") " pod="openstack/nova-api-0" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.269389 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/411bbb25-18d9-4957-afff-2ebcd6866aeb-config-data\") pod \"nova-api-0\" (UID: \"411bbb25-18d9-4957-afff-2ebcd6866aeb\") " pod="openstack/nova-api-0" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.269434 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m65rg\" (UniqueName: \"kubernetes.io/projected/986412ee-2fb7-413a-a1d3-355bfc0ac0ec-kube-api-access-m65rg\") pod \"nova-scheduler-0\" (UID: \"986412ee-2fb7-413a-a1d3-355bfc0ac0ec\") " pod="openstack/nova-scheduler-0" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.269479 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/986412ee-2fb7-413a-a1d3-355bfc0ac0ec-config-data\") pod \"nova-scheduler-0\" (UID: \"986412ee-2fb7-413a-a1d3-355bfc0ac0ec\") " pod="openstack/nova-scheduler-0" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.269516 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/411bbb25-18d9-4957-afff-2ebcd6866aeb-logs\") pod \"nova-api-0\" (UID: \"411bbb25-18d9-4957-afff-2ebcd6866aeb\") " pod="openstack/nova-api-0" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.270055 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/411bbb25-18d9-4957-afff-2ebcd6866aeb-logs\") pod \"nova-api-0\" (UID: \"411bbb25-18d9-4957-afff-2ebcd6866aeb\") " pod="openstack/nova-api-0" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.285291 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.295321 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.300393 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.308270 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/411bbb25-18d9-4957-afff-2ebcd6866aeb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"411bbb25-18d9-4957-afff-2ebcd6866aeb\") " pod="openstack/nova-api-0" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.309346 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/411bbb25-18d9-4957-afff-2ebcd6866aeb-config-data\") pod \"nova-api-0\" (UID: \"411bbb25-18d9-4957-afff-2ebcd6866aeb\") " pod="openstack/nova-api-0" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.365033 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.377995 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e2aa2df-b2fa-4dd5-bc17-99644c08a859-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4e2aa2df-b2fa-4dd5-bc17-99644c08a859\") " pod="openstack/nova-metadata-0" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.378045 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e2aa2df-b2fa-4dd5-bc17-99644c08a859-logs\") pod \"nova-metadata-0\" (UID: \"4e2aa2df-b2fa-4dd5-bc17-99644c08a859\") " pod="openstack/nova-metadata-0" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.378097 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/4e2aa2df-b2fa-4dd5-bc17-99644c08a859-config-data\") pod \"nova-metadata-0\" (UID: \"4e2aa2df-b2fa-4dd5-bc17-99644c08a859\") " pod="openstack/nova-metadata-0" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.378119 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m65rg\" (UniqueName: \"kubernetes.io/projected/986412ee-2fb7-413a-a1d3-355bfc0ac0ec-kube-api-access-m65rg\") pod \"nova-scheduler-0\" (UID: \"986412ee-2fb7-413a-a1d3-355bfc0ac0ec\") " pod="openstack/nova-scheduler-0" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.378203 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/986412ee-2fb7-413a-a1d3-355bfc0ac0ec-config-data\") pod \"nova-scheduler-0\" (UID: \"986412ee-2fb7-413a-a1d3-355bfc0ac0ec\") " pod="openstack/nova-scheduler-0" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.378342 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwqvq\" (UniqueName: \"kubernetes.io/projected/4e2aa2df-b2fa-4dd5-bc17-99644c08a859-kube-api-access-qwqvq\") pod \"nova-metadata-0\" (UID: \"4e2aa2df-b2fa-4dd5-bc17-99644c08a859\") " pod="openstack/nova-metadata-0" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.378439 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/986412ee-2fb7-413a-a1d3-355bfc0ac0ec-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"986412ee-2fb7-413a-a1d3-355bfc0ac0ec\") " pod="openstack/nova-scheduler-0" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.394639 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/986412ee-2fb7-413a-a1d3-355bfc0ac0ec-config-data\") pod \"nova-scheduler-0\" (UID: 
\"986412ee-2fb7-413a-a1d3-355bfc0ac0ec\") " pod="openstack/nova-scheduler-0" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.396467 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/986412ee-2fb7-413a-a1d3-355bfc0ac0ec-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"986412ee-2fb7-413a-a1d3-355bfc0ac0ec\") " pod="openstack/nova-scheduler-0" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.404510 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzzmf\" (UniqueName: \"kubernetes.io/projected/411bbb25-18d9-4957-afff-2ebcd6866aeb-kube-api-access-hzzmf\") pod \"nova-api-0\" (UID: \"411bbb25-18d9-4957-afff-2ebcd6866aeb\") " pod="openstack/nova-api-0" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.439357 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m65rg\" (UniqueName: \"kubernetes.io/projected/986412ee-2fb7-413a-a1d3-355bfc0ac0ec-kube-api-access-m65rg\") pod \"nova-scheduler-0\" (UID: \"986412ee-2fb7-413a-a1d3-355bfc0ac0ec\") " pod="openstack/nova-scheduler-0" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.450196 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.459310 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.471276 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.485634 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f8523c4-9270-4e9d-bb5c-2aa4bfef0f22-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1f8523c4-9270-4e9d-bb5c-2aa4bfef0f22\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.485705 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25k9v\" (UniqueName: \"kubernetes.io/projected/1f8523c4-9270-4e9d-bb5c-2aa4bfef0f22-kube-api-access-25k9v\") pod \"nova-cell1-novncproxy-0\" (UID: \"1f8523c4-9270-4e9d-bb5c-2aa4bfef0f22\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.485750 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwqvq\" (UniqueName: \"kubernetes.io/projected/4e2aa2df-b2fa-4dd5-bc17-99644c08a859-kube-api-access-qwqvq\") pod \"nova-metadata-0\" (UID: \"4e2aa2df-b2fa-4dd5-bc17-99644c08a859\") " pod="openstack/nova-metadata-0" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.485776 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f8523c4-9270-4e9d-bb5c-2aa4bfef0f22-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1f8523c4-9270-4e9d-bb5c-2aa4bfef0f22\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.485898 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/4e2aa2df-b2fa-4dd5-bc17-99644c08a859-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4e2aa2df-b2fa-4dd5-bc17-99644c08a859\") " pod="openstack/nova-metadata-0" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.485927 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e2aa2df-b2fa-4dd5-bc17-99644c08a859-logs\") pod \"nova-metadata-0\" (UID: \"4e2aa2df-b2fa-4dd5-bc17-99644c08a859\") " pod="openstack/nova-metadata-0" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.485948 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e2aa2df-b2fa-4dd5-bc17-99644c08a859-config-data\") pod \"nova-metadata-0\" (UID: \"4e2aa2df-b2fa-4dd5-bc17-99644c08a859\") " pod="openstack/nova-metadata-0" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.489251 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e2aa2df-b2fa-4dd5-bc17-99644c08a859-logs\") pod \"nova-metadata-0\" (UID: \"4e2aa2df-b2fa-4dd5-bc17-99644c08a859\") " pod="openstack/nova-metadata-0" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.491152 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e2aa2df-b2fa-4dd5-bc17-99644c08a859-config-data\") pod \"nova-metadata-0\" (UID: \"4e2aa2df-b2fa-4dd5-bc17-99644c08a859\") " pod="openstack/nova-metadata-0" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.491172 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.509515 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e2aa2df-b2fa-4dd5-bc17-99644c08a859-combined-ca-bundle\") pod 
\"nova-metadata-0\" (UID: \"4e2aa2df-b2fa-4dd5-bc17-99644c08a859\") " pod="openstack/nova-metadata-0" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.534280 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwqvq\" (UniqueName: \"kubernetes.io/projected/4e2aa2df-b2fa-4dd5-bc17-99644c08a859-kube-api-access-qwqvq\") pod \"nova-metadata-0\" (UID: \"4e2aa2df-b2fa-4dd5-bc17-99644c08a859\") " pod="openstack/nova-metadata-0" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.592762 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f8523c4-9270-4e9d-bb5c-2aa4bfef0f22-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1f8523c4-9270-4e9d-bb5c-2aa4bfef0f22\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.592841 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25k9v\" (UniqueName: \"kubernetes.io/projected/1f8523c4-9270-4e9d-bb5c-2aa4bfef0f22-kube-api-access-25k9v\") pod \"nova-cell1-novncproxy-0\" (UID: \"1f8523c4-9270-4e9d-bb5c-2aa4bfef0f22\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.592897 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f8523c4-9270-4e9d-bb5c-2aa4bfef0f22-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1f8523c4-9270-4e9d-bb5c-2aa4bfef0f22\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.603952 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f8523c4-9270-4e9d-bb5c-2aa4bfef0f22-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1f8523c4-9270-4e9d-bb5c-2aa4bfef0f22\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 08:36:31 crc 
kubenswrapper[5025]: I1007 08:36:31.613640 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f8523c4-9270-4e9d-bb5c-2aa4bfef0f22-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1f8523c4-9270-4e9d-bb5c-2aa4bfef0f22\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.623432 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25k9v\" (UniqueName: \"kubernetes.io/projected/1f8523c4-9270-4e9d-bb5c-2aa4bfef0f22-kube-api-access-25k9v\") pod \"nova-cell1-novncproxy-0\" (UID: \"1f8523c4-9270-4e9d-bb5c-2aa4bfef0f22\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.631618 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-g44d8"] Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.633506 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-g44d8" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.661762 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.682617 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-g44d8"] Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.698590 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckfps\" (UniqueName: \"kubernetes.io/projected/4add832c-94fb-40d1-bdd5-4a3671c38fd7-kube-api-access-ckfps\") pod \"dnsmasq-dns-865f5d856f-g44d8\" (UID: \"4add832c-94fb-40d1-bdd5-4a3671c38fd7\") " pod="openstack/dnsmasq-dns-865f5d856f-g44d8" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.698711 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4add832c-94fb-40d1-bdd5-4a3671c38fd7-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-g44d8\" (UID: \"4add832c-94fb-40d1-bdd5-4a3671c38fd7\") " pod="openstack/dnsmasq-dns-865f5d856f-g44d8" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.698749 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4add832c-94fb-40d1-bdd5-4a3671c38fd7-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-g44d8\" (UID: \"4add832c-94fb-40d1-bdd5-4a3671c38fd7\") " pod="openstack/dnsmasq-dns-865f5d856f-g44d8" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.698794 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4add832c-94fb-40d1-bdd5-4a3671c38fd7-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-g44d8\" (UID: \"4add832c-94fb-40d1-bdd5-4a3671c38fd7\") " pod="openstack/dnsmasq-dns-865f5d856f-g44d8" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.698820 5025 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4add832c-94fb-40d1-bdd5-4a3671c38fd7-config\") pod \"dnsmasq-dns-865f5d856f-g44d8\" (UID: \"4add832c-94fb-40d1-bdd5-4a3671c38fd7\") " pod="openstack/dnsmasq-dns-865f5d856f-g44d8" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.698915 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4add832c-94fb-40d1-bdd5-4a3671c38fd7-dns-svc\") pod \"dnsmasq-dns-865f5d856f-g44d8\" (UID: \"4add832c-94fb-40d1-bdd5-4a3671c38fd7\") " pod="openstack/dnsmasq-dns-865f5d856f-g44d8" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.719320 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.752117 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.800721 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4add832c-94fb-40d1-bdd5-4a3671c38fd7-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-g44d8\" (UID: \"4add832c-94fb-40d1-bdd5-4a3671c38fd7\") " pod="openstack/dnsmasq-dns-865f5d856f-g44d8" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.800784 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4add832c-94fb-40d1-bdd5-4a3671c38fd7-config\") pod \"dnsmasq-dns-865f5d856f-g44d8\" (UID: \"4add832c-94fb-40d1-bdd5-4a3671c38fd7\") " pod="openstack/dnsmasq-dns-865f5d856f-g44d8" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.800901 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/4add832c-94fb-40d1-bdd5-4a3671c38fd7-dns-svc\") pod \"dnsmasq-dns-865f5d856f-g44d8\" (UID: \"4add832c-94fb-40d1-bdd5-4a3671c38fd7\") " pod="openstack/dnsmasq-dns-865f5d856f-g44d8" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.800950 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckfps\" (UniqueName: \"kubernetes.io/projected/4add832c-94fb-40d1-bdd5-4a3671c38fd7-kube-api-access-ckfps\") pod \"dnsmasq-dns-865f5d856f-g44d8\" (UID: \"4add832c-94fb-40d1-bdd5-4a3671c38fd7\") " pod="openstack/dnsmasq-dns-865f5d856f-g44d8" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.801019 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4add832c-94fb-40d1-bdd5-4a3671c38fd7-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-g44d8\" (UID: \"4add832c-94fb-40d1-bdd5-4a3671c38fd7\") " pod="openstack/dnsmasq-dns-865f5d856f-g44d8" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.801064 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4add832c-94fb-40d1-bdd5-4a3671c38fd7-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-g44d8\" (UID: \"4add832c-94fb-40d1-bdd5-4a3671c38fd7\") " pod="openstack/dnsmasq-dns-865f5d856f-g44d8" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.802025 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4add832c-94fb-40d1-bdd5-4a3671c38fd7-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-g44d8\" (UID: \"4add832c-94fb-40d1-bdd5-4a3671c38fd7\") " pod="openstack/dnsmasq-dns-865f5d856f-g44d8" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.802283 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/4add832c-94fb-40d1-bdd5-4a3671c38fd7-dns-svc\") pod \"dnsmasq-dns-865f5d856f-g44d8\" (UID: \"4add832c-94fb-40d1-bdd5-4a3671c38fd7\") " pod="openstack/dnsmasq-dns-865f5d856f-g44d8" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.802460 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4add832c-94fb-40d1-bdd5-4a3671c38fd7-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-g44d8\" (UID: \"4add832c-94fb-40d1-bdd5-4a3671c38fd7\") " pod="openstack/dnsmasq-dns-865f5d856f-g44d8" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.803046 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4add832c-94fb-40d1-bdd5-4a3671c38fd7-config\") pod \"dnsmasq-dns-865f5d856f-g44d8\" (UID: \"4add832c-94fb-40d1-bdd5-4a3671c38fd7\") " pod="openstack/dnsmasq-dns-865f5d856f-g44d8" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.803060 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4add832c-94fb-40d1-bdd5-4a3671c38fd7-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-g44d8\" (UID: \"4add832c-94fb-40d1-bdd5-4a3671c38fd7\") " pod="openstack/dnsmasq-dns-865f5d856f-g44d8" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.819436 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckfps\" (UniqueName: \"kubernetes.io/projected/4add832c-94fb-40d1-bdd5-4a3671c38fd7-kube-api-access-ckfps\") pod \"dnsmasq-dns-865f5d856f-g44d8\" (UID: \"4add832c-94fb-40d1-bdd5-4a3671c38fd7\") " pod="openstack/dnsmasq-dns-865f5d856f-g44d8" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.820589 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 07 08:36:31 crc kubenswrapper[5025]: I1007 08:36:31.969404 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-g44d8" Oct 07 08:36:32 crc kubenswrapper[5025]: I1007 08:36:32.064303 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-4mq6r"] Oct 07 08:36:32 crc kubenswrapper[5025]: I1007 08:36:32.285248 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 08:36:32 crc kubenswrapper[5025]: I1007 08:36:32.306285 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-g6spt"] Oct 07 08:36:32 crc kubenswrapper[5025]: I1007 08:36:32.321243 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-g6spt" Oct 07 08:36:32 crc kubenswrapper[5025]: I1007 08:36:32.324743 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-g6spt"] Oct 07 08:36:32 crc kubenswrapper[5025]: I1007 08:36:32.325976 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 07 08:36:32 crc kubenswrapper[5025]: I1007 08:36:32.326305 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 07 08:36:32 crc kubenswrapper[5025]: I1007 08:36:32.382549 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 08:36:32 crc kubenswrapper[5025]: I1007 08:36:32.404861 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 08:36:32 crc kubenswrapper[5025]: W1007 08:36:32.411749 5025 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod986412ee_2fb7_413a_a1d3_355bfc0ac0ec.slice/crio-b1f7dfe70c14471dfb55c2b7281e5e81f78aa681e8dd6f5578d5639199e5fac0 WatchSource:0}: Error finding container b1f7dfe70c14471dfb55c2b7281e5e81f78aa681e8dd6f5578d5639199e5fac0: Status 404 returned error can't find the container with id b1f7dfe70c14471dfb55c2b7281e5e81f78aa681e8dd6f5578d5639199e5fac0 Oct 07 08:36:32 crc kubenswrapper[5025]: I1007 08:36:32.413986 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtnnk\" (UniqueName: \"kubernetes.io/projected/d65b533a-25ed-4f95-a2f8-84ee04697636-kube-api-access-qtnnk\") pod \"nova-cell1-conductor-db-sync-g6spt\" (UID: \"d65b533a-25ed-4f95-a2f8-84ee04697636\") " pod="openstack/nova-cell1-conductor-db-sync-g6spt" Oct 07 08:36:32 crc kubenswrapper[5025]: I1007 08:36:32.414086 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d65b533a-25ed-4f95-a2f8-84ee04697636-scripts\") pod \"nova-cell1-conductor-db-sync-g6spt\" (UID: \"d65b533a-25ed-4f95-a2f8-84ee04697636\") " pod="openstack/nova-cell1-conductor-db-sync-g6spt" Oct 07 08:36:32 crc kubenswrapper[5025]: I1007 08:36:32.414143 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d65b533a-25ed-4f95-a2f8-84ee04697636-config-data\") pod \"nova-cell1-conductor-db-sync-g6spt\" (UID: \"d65b533a-25ed-4f95-a2f8-84ee04697636\") " pod="openstack/nova-cell1-conductor-db-sync-g6spt" Oct 07 08:36:32 crc kubenswrapper[5025]: I1007 08:36:32.414170 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d65b533a-25ed-4f95-a2f8-84ee04697636-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-g6spt\" (UID: 
\"d65b533a-25ed-4f95-a2f8-84ee04697636\") " pod="openstack/nova-cell1-conductor-db-sync-g6spt" Oct 07 08:36:32 crc kubenswrapper[5025]: I1007 08:36:32.516024 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d65b533a-25ed-4f95-a2f8-84ee04697636-scripts\") pod \"nova-cell1-conductor-db-sync-g6spt\" (UID: \"d65b533a-25ed-4f95-a2f8-84ee04697636\") " pod="openstack/nova-cell1-conductor-db-sync-g6spt" Oct 07 08:36:32 crc kubenswrapper[5025]: I1007 08:36:32.516103 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d65b533a-25ed-4f95-a2f8-84ee04697636-config-data\") pod \"nova-cell1-conductor-db-sync-g6spt\" (UID: \"d65b533a-25ed-4f95-a2f8-84ee04697636\") " pod="openstack/nova-cell1-conductor-db-sync-g6spt" Oct 07 08:36:32 crc kubenswrapper[5025]: I1007 08:36:32.516132 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d65b533a-25ed-4f95-a2f8-84ee04697636-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-g6spt\" (UID: \"d65b533a-25ed-4f95-a2f8-84ee04697636\") " pod="openstack/nova-cell1-conductor-db-sync-g6spt" Oct 07 08:36:32 crc kubenswrapper[5025]: I1007 08:36:32.516188 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtnnk\" (UniqueName: \"kubernetes.io/projected/d65b533a-25ed-4f95-a2f8-84ee04697636-kube-api-access-qtnnk\") pod \"nova-cell1-conductor-db-sync-g6spt\" (UID: \"d65b533a-25ed-4f95-a2f8-84ee04697636\") " pod="openstack/nova-cell1-conductor-db-sync-g6spt" Oct 07 08:36:32 crc kubenswrapper[5025]: I1007 08:36:32.519900 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 08:36:32 crc kubenswrapper[5025]: I1007 08:36:32.521618 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/d65b533a-25ed-4f95-a2f8-84ee04697636-scripts\") pod \"nova-cell1-conductor-db-sync-g6spt\" (UID: \"d65b533a-25ed-4f95-a2f8-84ee04697636\") " pod="openstack/nova-cell1-conductor-db-sync-g6spt" Oct 07 08:36:32 crc kubenswrapper[5025]: I1007 08:36:32.522222 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d65b533a-25ed-4f95-a2f8-84ee04697636-config-data\") pod \"nova-cell1-conductor-db-sync-g6spt\" (UID: \"d65b533a-25ed-4f95-a2f8-84ee04697636\") " pod="openstack/nova-cell1-conductor-db-sync-g6spt" Oct 07 08:36:32 crc kubenswrapper[5025]: I1007 08:36:32.534031 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d65b533a-25ed-4f95-a2f8-84ee04697636-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-g6spt\" (UID: \"d65b533a-25ed-4f95-a2f8-84ee04697636\") " pod="openstack/nova-cell1-conductor-db-sync-g6spt" Oct 07 08:36:32 crc kubenswrapper[5025]: I1007 08:36:32.535309 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtnnk\" (UniqueName: \"kubernetes.io/projected/d65b533a-25ed-4f95-a2f8-84ee04697636-kube-api-access-qtnnk\") pod \"nova-cell1-conductor-db-sync-g6spt\" (UID: \"d65b533a-25ed-4f95-a2f8-84ee04697636\") " pod="openstack/nova-cell1-conductor-db-sync-g6spt" Oct 07 08:36:32 crc kubenswrapper[5025]: I1007 08:36:32.643834 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-g44d8"] Oct 07 08:36:32 crc kubenswrapper[5025]: W1007 08:36:32.643852 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4add832c_94fb_40d1_bdd5_4a3671c38fd7.slice/crio-b8cf9bc8794f945dee5aa4d0c2a0faf9a35e8720693d50cd217ce0a00f073379 WatchSource:0}: Error finding container b8cf9bc8794f945dee5aa4d0c2a0faf9a35e8720693d50cd217ce0a00f073379: 
Status 404 returned error can't find the container with id b8cf9bc8794f945dee5aa4d0c2a0faf9a35e8720693d50cd217ce0a00f073379 Oct 07 08:36:32 crc kubenswrapper[5025]: I1007 08:36:32.705816 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-g6spt" Oct 07 08:36:32 crc kubenswrapper[5025]: I1007 08:36:32.973156 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1f8523c4-9270-4e9d-bb5c-2aa4bfef0f22","Type":"ContainerStarted","Data":"9cac546dc494bee91d7275ac53226279be24c00e356c092d63a690718ac1bc2c"} Oct 07 08:36:32 crc kubenswrapper[5025]: I1007 08:36:32.981200 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"986412ee-2fb7-413a-a1d3-355bfc0ac0ec","Type":"ContainerStarted","Data":"b1f7dfe70c14471dfb55c2b7281e5e81f78aa681e8dd6f5578d5639199e5fac0"} Oct 07 08:36:32 crc kubenswrapper[5025]: I1007 08:36:32.995969 5025 generic.go:334] "Generic (PLEG): container finished" podID="4add832c-94fb-40d1-bdd5-4a3671c38fd7" containerID="a07b757c901b240140f8f54276da9021159a479cb82823d30c7ebd74b3455cef" exitCode=0 Oct 07 08:36:32 crc kubenswrapper[5025]: I1007 08:36:32.996304 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-g44d8" event={"ID":"4add832c-94fb-40d1-bdd5-4a3671c38fd7","Type":"ContainerDied","Data":"a07b757c901b240140f8f54276da9021159a479cb82823d30c7ebd74b3455cef"} Oct 07 08:36:32 crc kubenswrapper[5025]: I1007 08:36:32.996377 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-g44d8" event={"ID":"4add832c-94fb-40d1-bdd5-4a3671c38fd7","Type":"ContainerStarted","Data":"b8cf9bc8794f945dee5aa4d0c2a0faf9a35e8720693d50cd217ce0a00f073379"} Oct 07 08:36:33 crc kubenswrapper[5025]: I1007 08:36:33.011100 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"411bbb25-18d9-4957-afff-2ebcd6866aeb","Type":"ContainerStarted","Data":"d2343f881e3de627c2f1f3c15ddcae177179a9aa6bca574588fbd20cc28a1fd3"} Oct 07 08:36:33 crc kubenswrapper[5025]: I1007 08:36:33.015032 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4e2aa2df-b2fa-4dd5-bc17-99644c08a859","Type":"ContainerStarted","Data":"73e3228ea82df180e7495da1f55bc581491f3901d7c385030a75ed4bc323bca2"} Oct 07 08:36:33 crc kubenswrapper[5025]: I1007 08:36:33.017585 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-4mq6r" event={"ID":"c434c871-bc74-45ae-b5ed-810a796622d9","Type":"ContainerStarted","Data":"d483b8c35a3ab7b95943ae78b74c71128746cf8491cf043bfc10563ed70df854"} Oct 07 08:36:33 crc kubenswrapper[5025]: I1007 08:36:33.017612 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-4mq6r" event={"ID":"c434c871-bc74-45ae-b5ed-810a796622d9","Type":"ContainerStarted","Data":"c19293377c6d040839b6abacdc549885f8126bc7ac3fec7176e13184ff078a8e"} Oct 07 08:36:33 crc kubenswrapper[5025]: I1007 08:36:33.055297 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-4mq6r" podStartSLOduration=3.055276294 podStartE2EDuration="3.055276294s" podCreationTimestamp="2025-10-07 08:36:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:36:33.05100563 +0000 UTC m=+1199.860319774" watchObservedRunningTime="2025-10-07 08:36:33.055276294 +0000 UTC m=+1199.864590428" Oct 07 08:36:33 crc kubenswrapper[5025]: I1007 08:36:33.209452 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-g6spt"] Oct 07 08:36:33 crc kubenswrapper[5025]: W1007 08:36:33.239376 5025 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd65b533a_25ed_4f95_a2f8_84ee04697636.slice/crio-3efbf2f061835309be1e74a8421d8aef30befca5d6cf32352b89730df46e9c53 WatchSource:0}: Error finding container 3efbf2f061835309be1e74a8421d8aef30befca5d6cf32352b89730df46e9c53: Status 404 returned error can't find the container with id 3efbf2f061835309be1e74a8421d8aef30befca5d6cf32352b89730df46e9c53 Oct 07 08:36:34 crc kubenswrapper[5025]: I1007 08:36:34.034300 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-g6spt" event={"ID":"d65b533a-25ed-4f95-a2f8-84ee04697636","Type":"ContainerStarted","Data":"0d717114726acf550f907c6257fff0023c7b2d0e41f234c82b856a0d923017ad"} Oct 07 08:36:34 crc kubenswrapper[5025]: I1007 08:36:34.034571 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-g6spt" event={"ID":"d65b533a-25ed-4f95-a2f8-84ee04697636","Type":"ContainerStarted","Data":"3efbf2f061835309be1e74a8421d8aef30befca5d6cf32352b89730df46e9c53"} Oct 07 08:36:34 crc kubenswrapper[5025]: I1007 08:36:34.037525 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-g44d8" event={"ID":"4add832c-94fb-40d1-bdd5-4a3671c38fd7","Type":"ContainerStarted","Data":"d072ea46983630205128a8130a48e5efca404fd437688e8a5533d5beb56b9077"} Oct 07 08:36:34 crc kubenswrapper[5025]: I1007 08:36:34.054024 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-g6spt" podStartSLOduration=2.054008291 podStartE2EDuration="2.054008291s" podCreationTimestamp="2025-10-07 08:36:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:36:34.048380823 +0000 UTC m=+1200.857694967" watchObservedRunningTime="2025-10-07 08:36:34.054008291 +0000 UTC m=+1200.863322435" Oct 07 08:36:35 crc kubenswrapper[5025]: I1007 
08:36:35.051430 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-865f5d856f-g44d8" Oct 07 08:36:35 crc kubenswrapper[5025]: I1007 08:36:35.612799 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-865f5d856f-g44d8" podStartSLOduration=4.612781441 podStartE2EDuration="4.612781441s" podCreationTimestamp="2025-10-07 08:36:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:36:34.071372358 +0000 UTC m=+1200.880686512" watchObservedRunningTime="2025-10-07 08:36:35.612781441 +0000 UTC m=+1202.422095585" Oct 07 08:36:35 crc kubenswrapper[5025]: I1007 08:36:35.615101 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 08:36:35 crc kubenswrapper[5025]: I1007 08:36:35.628258 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 08:36:36 crc kubenswrapper[5025]: I1007 08:36:36.059165 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4e2aa2df-b2fa-4dd5-bc17-99644c08a859","Type":"ContainerStarted","Data":"d5b48fdfb262058bd13f9f881fa4835c96d857425ca96b78de50dbc59ee37fed"} Oct 07 08:36:36 crc kubenswrapper[5025]: I1007 08:36:36.064980 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"411bbb25-18d9-4957-afff-2ebcd6866aeb","Type":"ContainerStarted","Data":"29cbed0d4183e69c462b96688d81104df2f4003866f56cfcfbd6f94622bd80c1"} Oct 07 08:36:36 crc kubenswrapper[5025]: I1007 08:36:36.067291 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1f8523c4-9270-4e9d-bb5c-2aa4bfef0f22","Type":"ContainerStarted","Data":"77f7516d7962feba3cc7f2520d60c9bad2c5da055f1c07185c88455c01f36edc"} Oct 07 08:36:36 crc kubenswrapper[5025]: I1007 08:36:36.067412 5025 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="1f8523c4-9270-4e9d-bb5c-2aa4bfef0f22" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://77f7516d7962feba3cc7f2520d60c9bad2c5da055f1c07185c88455c01f36edc" gracePeriod=30 Oct 07 08:36:36 crc kubenswrapper[5025]: I1007 08:36:36.075849 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"986412ee-2fb7-413a-a1d3-355bfc0ac0ec","Type":"ContainerStarted","Data":"ac3a6ddf633bb321f14e05307ef692efffdb6eadfd68a3f22759a6e73282eca5"} Oct 07 08:36:36 crc kubenswrapper[5025]: I1007 08:36:36.090798 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.07381982 podStartE2EDuration="5.090782321s" podCreationTimestamp="2025-10-07 08:36:31 +0000 UTC" firstStartedPulling="2025-10-07 08:36:32.529696607 +0000 UTC m=+1199.339010751" lastFinishedPulling="2025-10-07 08:36:35.546659108 +0000 UTC m=+1202.355973252" observedRunningTime="2025-10-07 08:36:36.081237111 +0000 UTC m=+1202.890551255" watchObservedRunningTime="2025-10-07 08:36:36.090782321 +0000 UTC m=+1202.900096465" Oct 07 08:36:36 crc kubenswrapper[5025]: I1007 08:36:36.105910 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.975160561 podStartE2EDuration="5.105893138s" podCreationTimestamp="2025-10-07 08:36:31 +0000 UTC" firstStartedPulling="2025-10-07 08:36:32.415896441 +0000 UTC m=+1199.225210585" lastFinishedPulling="2025-10-07 08:36:35.546629018 +0000 UTC m=+1202.355943162" observedRunningTime="2025-10-07 08:36:36.100053573 +0000 UTC m=+1202.909367717" watchObservedRunningTime="2025-10-07 08:36:36.105893138 +0000 UTC m=+1202.915207272" Oct 07 08:36:36 crc kubenswrapper[5025]: I1007 08:36:36.719840 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-scheduler-0" Oct 07 08:36:36 crc kubenswrapper[5025]: I1007 08:36:36.822086 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 07 08:36:37 crc kubenswrapper[5025]: I1007 08:36:37.087370 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"411bbb25-18d9-4957-afff-2ebcd6866aeb","Type":"ContainerStarted","Data":"602618541b175c2a97eb8d75f28dbf882389261fee09f23050a909a8840e022e"} Oct 07 08:36:37 crc kubenswrapper[5025]: I1007 08:36:37.090925 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4e2aa2df-b2fa-4dd5-bc17-99644c08a859","Type":"ContainerStarted","Data":"1a55513212872858591f97e5c5702f3f5220e6f7d08af50fe5a135a0771f6ce7"} Oct 07 08:36:37 crc kubenswrapper[5025]: I1007 08:36:37.091346 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4e2aa2df-b2fa-4dd5-bc17-99644c08a859" containerName="nova-metadata-log" containerID="cri-o://d5b48fdfb262058bd13f9f881fa4835c96d857425ca96b78de50dbc59ee37fed" gracePeriod=30 Oct 07 08:36:37 crc kubenswrapper[5025]: I1007 08:36:37.091483 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4e2aa2df-b2fa-4dd5-bc17-99644c08a859" containerName="nova-metadata-metadata" containerID="cri-o://1a55513212872858591f97e5c5702f3f5220e6f7d08af50fe5a135a0771f6ce7" gracePeriod=30 Oct 07 08:36:37 crc kubenswrapper[5025]: I1007 08:36:37.123593 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.905313297 podStartE2EDuration="6.12356702s" podCreationTimestamp="2025-10-07 08:36:31 +0000 UTC" firstStartedPulling="2025-10-07 08:36:32.329558881 +0000 UTC m=+1199.138873025" lastFinishedPulling="2025-10-07 08:36:35.547812604 +0000 UTC m=+1202.357126748" observedRunningTime="2025-10-07 
08:36:37.111298693 +0000 UTC m=+1203.920612837" watchObservedRunningTime="2025-10-07 08:36:37.12356702 +0000 UTC m=+1203.932881204" Oct 07 08:36:37 crc kubenswrapper[5025]: I1007 08:36:37.142516 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.925642946 podStartE2EDuration="6.142497516s" podCreationTimestamp="2025-10-07 08:36:31 +0000 UTC" firstStartedPulling="2025-10-07 08:36:32.386257887 +0000 UTC m=+1199.195572021" lastFinishedPulling="2025-10-07 08:36:35.603112447 +0000 UTC m=+1202.412426591" observedRunningTime="2025-10-07 08:36:37.127613008 +0000 UTC m=+1203.936927162" watchObservedRunningTime="2025-10-07 08:36:37.142497516 +0000 UTC m=+1203.951811660" Oct 07 08:36:37 crc kubenswrapper[5025]: I1007 08:36:37.707962 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 08:36:37 crc kubenswrapper[5025]: I1007 08:36:37.832466 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwqvq\" (UniqueName: \"kubernetes.io/projected/4e2aa2df-b2fa-4dd5-bc17-99644c08a859-kube-api-access-qwqvq\") pod \"4e2aa2df-b2fa-4dd5-bc17-99644c08a859\" (UID: \"4e2aa2df-b2fa-4dd5-bc17-99644c08a859\") " Oct 07 08:36:37 crc kubenswrapper[5025]: I1007 08:36:37.832524 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e2aa2df-b2fa-4dd5-bc17-99644c08a859-logs\") pod \"4e2aa2df-b2fa-4dd5-bc17-99644c08a859\" (UID: \"4e2aa2df-b2fa-4dd5-bc17-99644c08a859\") " Oct 07 08:36:37 crc kubenswrapper[5025]: I1007 08:36:37.832634 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e2aa2df-b2fa-4dd5-bc17-99644c08a859-config-data\") pod \"4e2aa2df-b2fa-4dd5-bc17-99644c08a859\" (UID: \"4e2aa2df-b2fa-4dd5-bc17-99644c08a859\") " Oct 07 08:36:37 crc 
kubenswrapper[5025]: I1007 08:36:37.832794 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e2aa2df-b2fa-4dd5-bc17-99644c08a859-combined-ca-bundle\") pod \"4e2aa2df-b2fa-4dd5-bc17-99644c08a859\" (UID: \"4e2aa2df-b2fa-4dd5-bc17-99644c08a859\") " Oct 07 08:36:37 crc kubenswrapper[5025]: I1007 08:36:37.834293 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e2aa2df-b2fa-4dd5-bc17-99644c08a859-logs" (OuterVolumeSpecName: "logs") pod "4e2aa2df-b2fa-4dd5-bc17-99644c08a859" (UID: "4e2aa2df-b2fa-4dd5-bc17-99644c08a859"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:36:37 crc kubenswrapper[5025]: I1007 08:36:37.845893 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e2aa2df-b2fa-4dd5-bc17-99644c08a859-kube-api-access-qwqvq" (OuterVolumeSpecName: "kube-api-access-qwqvq") pod "4e2aa2df-b2fa-4dd5-bc17-99644c08a859" (UID: "4e2aa2df-b2fa-4dd5-bc17-99644c08a859"). InnerVolumeSpecName "kube-api-access-qwqvq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:36:37 crc kubenswrapper[5025]: I1007 08:36:37.861313 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e2aa2df-b2fa-4dd5-bc17-99644c08a859-config-data" (OuterVolumeSpecName: "config-data") pod "4e2aa2df-b2fa-4dd5-bc17-99644c08a859" (UID: "4e2aa2df-b2fa-4dd5-bc17-99644c08a859"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:36:37 crc kubenswrapper[5025]: I1007 08:36:37.874362 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e2aa2df-b2fa-4dd5-bc17-99644c08a859-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4e2aa2df-b2fa-4dd5-bc17-99644c08a859" (UID: "4e2aa2df-b2fa-4dd5-bc17-99644c08a859"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:36:37 crc kubenswrapper[5025]: I1007 08:36:37.938834 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwqvq\" (UniqueName: \"kubernetes.io/projected/4e2aa2df-b2fa-4dd5-bc17-99644c08a859-kube-api-access-qwqvq\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:37 crc kubenswrapper[5025]: I1007 08:36:37.938862 5025 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e2aa2df-b2fa-4dd5-bc17-99644c08a859-logs\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:37 crc kubenswrapper[5025]: I1007 08:36:37.938880 5025 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e2aa2df-b2fa-4dd5-bc17-99644c08a859-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:37 crc kubenswrapper[5025]: I1007 08:36:37.938889 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e2aa2df-b2fa-4dd5-bc17-99644c08a859-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:38 crc kubenswrapper[5025]: I1007 08:36:38.101920 5025 generic.go:334] "Generic (PLEG): container finished" podID="4e2aa2df-b2fa-4dd5-bc17-99644c08a859" containerID="1a55513212872858591f97e5c5702f3f5220e6f7d08af50fe5a135a0771f6ce7" exitCode=0 Oct 07 08:36:38 crc kubenswrapper[5025]: I1007 08:36:38.101960 5025 generic.go:334] "Generic (PLEG): container finished" podID="4e2aa2df-b2fa-4dd5-bc17-99644c08a859" containerID="d5b48fdfb262058bd13f9f881fa4835c96d857425ca96b78de50dbc59ee37fed" exitCode=143 Oct 07 08:36:38 crc kubenswrapper[5025]: I1007 08:36:38.101969 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 08:36:38 crc kubenswrapper[5025]: I1007 08:36:38.102020 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4e2aa2df-b2fa-4dd5-bc17-99644c08a859","Type":"ContainerDied","Data":"1a55513212872858591f97e5c5702f3f5220e6f7d08af50fe5a135a0771f6ce7"} Oct 07 08:36:38 crc kubenswrapper[5025]: I1007 08:36:38.102070 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4e2aa2df-b2fa-4dd5-bc17-99644c08a859","Type":"ContainerDied","Data":"d5b48fdfb262058bd13f9f881fa4835c96d857425ca96b78de50dbc59ee37fed"} Oct 07 08:36:38 crc kubenswrapper[5025]: I1007 08:36:38.102084 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4e2aa2df-b2fa-4dd5-bc17-99644c08a859","Type":"ContainerDied","Data":"73e3228ea82df180e7495da1f55bc581491f3901d7c385030a75ed4bc323bca2"} Oct 07 08:36:38 crc kubenswrapper[5025]: I1007 08:36:38.102103 5025 scope.go:117] "RemoveContainer" containerID="1a55513212872858591f97e5c5702f3f5220e6f7d08af50fe5a135a0771f6ce7" Oct 07 08:36:38 crc kubenswrapper[5025]: I1007 08:36:38.124970 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 08:36:38 crc kubenswrapper[5025]: I1007 08:36:38.137921 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 08:36:38 crc kubenswrapper[5025]: I1007 08:36:38.144642 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 07 08:36:38 crc kubenswrapper[5025]: E1007 08:36:38.145069 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e2aa2df-b2fa-4dd5-bc17-99644c08a859" containerName="nova-metadata-metadata" Oct 07 08:36:38 crc kubenswrapper[5025]: I1007 08:36:38.145087 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e2aa2df-b2fa-4dd5-bc17-99644c08a859" containerName="nova-metadata-metadata" Oct 
07 08:36:38 crc kubenswrapper[5025]: E1007 08:36:38.145131 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e2aa2df-b2fa-4dd5-bc17-99644c08a859" containerName="nova-metadata-log" Oct 07 08:36:38 crc kubenswrapper[5025]: I1007 08:36:38.145138 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e2aa2df-b2fa-4dd5-bc17-99644c08a859" containerName="nova-metadata-log" Oct 07 08:36:38 crc kubenswrapper[5025]: I1007 08:36:38.145310 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e2aa2df-b2fa-4dd5-bc17-99644c08a859" containerName="nova-metadata-log" Oct 07 08:36:38 crc kubenswrapper[5025]: I1007 08:36:38.145324 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e2aa2df-b2fa-4dd5-bc17-99644c08a859" containerName="nova-metadata-metadata" Oct 07 08:36:38 crc kubenswrapper[5025]: I1007 08:36:38.149008 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 08:36:38 crc kubenswrapper[5025]: I1007 08:36:38.151189 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 07 08:36:38 crc kubenswrapper[5025]: I1007 08:36:38.151653 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 07 08:36:38 crc kubenswrapper[5025]: I1007 08:36:38.162793 5025 scope.go:117] "RemoveContainer" containerID="d5b48fdfb262058bd13f9f881fa4835c96d857425ca96b78de50dbc59ee37fed" Oct 07 08:36:38 crc kubenswrapper[5025]: I1007 08:36:38.171382 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 08:36:38 crc kubenswrapper[5025]: I1007 08:36:38.189257 5025 scope.go:117] "RemoveContainer" containerID="1a55513212872858591f97e5c5702f3f5220e6f7d08af50fe5a135a0771f6ce7" Oct 07 08:36:38 crc kubenswrapper[5025]: E1007 08:36:38.189772 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc 
= could not find container \"1a55513212872858591f97e5c5702f3f5220e6f7d08af50fe5a135a0771f6ce7\": container with ID starting with 1a55513212872858591f97e5c5702f3f5220e6f7d08af50fe5a135a0771f6ce7 not found: ID does not exist" containerID="1a55513212872858591f97e5c5702f3f5220e6f7d08af50fe5a135a0771f6ce7" Oct 07 08:36:38 crc kubenswrapper[5025]: I1007 08:36:38.189861 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a55513212872858591f97e5c5702f3f5220e6f7d08af50fe5a135a0771f6ce7"} err="failed to get container status \"1a55513212872858591f97e5c5702f3f5220e6f7d08af50fe5a135a0771f6ce7\": rpc error: code = NotFound desc = could not find container \"1a55513212872858591f97e5c5702f3f5220e6f7d08af50fe5a135a0771f6ce7\": container with ID starting with 1a55513212872858591f97e5c5702f3f5220e6f7d08af50fe5a135a0771f6ce7 not found: ID does not exist" Oct 07 08:36:38 crc kubenswrapper[5025]: I1007 08:36:38.189913 5025 scope.go:117] "RemoveContainer" containerID="d5b48fdfb262058bd13f9f881fa4835c96d857425ca96b78de50dbc59ee37fed" Oct 07 08:36:38 crc kubenswrapper[5025]: E1007 08:36:38.191507 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5b48fdfb262058bd13f9f881fa4835c96d857425ca96b78de50dbc59ee37fed\": container with ID starting with d5b48fdfb262058bd13f9f881fa4835c96d857425ca96b78de50dbc59ee37fed not found: ID does not exist" containerID="d5b48fdfb262058bd13f9f881fa4835c96d857425ca96b78de50dbc59ee37fed" Oct 07 08:36:38 crc kubenswrapper[5025]: I1007 08:36:38.191560 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5b48fdfb262058bd13f9f881fa4835c96d857425ca96b78de50dbc59ee37fed"} err="failed to get container status \"d5b48fdfb262058bd13f9f881fa4835c96d857425ca96b78de50dbc59ee37fed\": rpc error: code = NotFound desc = could not find container 
\"d5b48fdfb262058bd13f9f881fa4835c96d857425ca96b78de50dbc59ee37fed\": container with ID starting with d5b48fdfb262058bd13f9f881fa4835c96d857425ca96b78de50dbc59ee37fed not found: ID does not exist" Oct 07 08:36:38 crc kubenswrapper[5025]: I1007 08:36:38.191582 5025 scope.go:117] "RemoveContainer" containerID="1a55513212872858591f97e5c5702f3f5220e6f7d08af50fe5a135a0771f6ce7" Oct 07 08:36:38 crc kubenswrapper[5025]: I1007 08:36:38.191888 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a55513212872858591f97e5c5702f3f5220e6f7d08af50fe5a135a0771f6ce7"} err="failed to get container status \"1a55513212872858591f97e5c5702f3f5220e6f7d08af50fe5a135a0771f6ce7\": rpc error: code = NotFound desc = could not find container \"1a55513212872858591f97e5c5702f3f5220e6f7d08af50fe5a135a0771f6ce7\": container with ID starting with 1a55513212872858591f97e5c5702f3f5220e6f7d08af50fe5a135a0771f6ce7 not found: ID does not exist" Oct 07 08:36:38 crc kubenswrapper[5025]: I1007 08:36:38.191915 5025 scope.go:117] "RemoveContainer" containerID="d5b48fdfb262058bd13f9f881fa4835c96d857425ca96b78de50dbc59ee37fed" Oct 07 08:36:38 crc kubenswrapper[5025]: I1007 08:36:38.192451 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5b48fdfb262058bd13f9f881fa4835c96d857425ca96b78de50dbc59ee37fed"} err="failed to get container status \"d5b48fdfb262058bd13f9f881fa4835c96d857425ca96b78de50dbc59ee37fed\": rpc error: code = NotFound desc = could not find container \"d5b48fdfb262058bd13f9f881fa4835c96d857425ca96b78de50dbc59ee37fed\": container with ID starting with d5b48fdfb262058bd13f9f881fa4835c96d857425ca96b78de50dbc59ee37fed not found: ID does not exist" Oct 07 08:36:38 crc kubenswrapper[5025]: I1007 08:36:38.245232 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/760b67f6-71da-458d-a2e0-becb528f937e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"760b67f6-71da-458d-a2e0-becb528f937e\") " pod="openstack/nova-metadata-0" Oct 07 08:36:38 crc kubenswrapper[5025]: I1007 08:36:38.245302 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p26gn\" (UniqueName: \"kubernetes.io/projected/760b67f6-71da-458d-a2e0-becb528f937e-kube-api-access-p26gn\") pod \"nova-metadata-0\" (UID: \"760b67f6-71da-458d-a2e0-becb528f937e\") " pod="openstack/nova-metadata-0" Oct 07 08:36:38 crc kubenswrapper[5025]: I1007 08:36:38.245448 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/760b67f6-71da-458d-a2e0-becb528f937e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"760b67f6-71da-458d-a2e0-becb528f937e\") " pod="openstack/nova-metadata-0" Oct 07 08:36:38 crc kubenswrapper[5025]: I1007 08:36:38.245520 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/760b67f6-71da-458d-a2e0-becb528f937e-config-data\") pod \"nova-metadata-0\" (UID: \"760b67f6-71da-458d-a2e0-becb528f937e\") " pod="openstack/nova-metadata-0" Oct 07 08:36:38 crc kubenswrapper[5025]: I1007 08:36:38.245780 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/760b67f6-71da-458d-a2e0-becb528f937e-logs\") pod \"nova-metadata-0\" (UID: \"760b67f6-71da-458d-a2e0-becb528f937e\") " pod="openstack/nova-metadata-0" Oct 07 08:36:38 crc kubenswrapper[5025]: I1007 08:36:38.347996 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/760b67f6-71da-458d-a2e0-becb528f937e-combined-ca-bundle\") pod 
\"nova-metadata-0\" (UID: \"760b67f6-71da-458d-a2e0-becb528f937e\") " pod="openstack/nova-metadata-0" Oct 07 08:36:38 crc kubenswrapper[5025]: I1007 08:36:38.348042 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p26gn\" (UniqueName: \"kubernetes.io/projected/760b67f6-71da-458d-a2e0-becb528f937e-kube-api-access-p26gn\") pod \"nova-metadata-0\" (UID: \"760b67f6-71da-458d-a2e0-becb528f937e\") " pod="openstack/nova-metadata-0" Oct 07 08:36:38 crc kubenswrapper[5025]: I1007 08:36:38.348151 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/760b67f6-71da-458d-a2e0-becb528f937e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"760b67f6-71da-458d-a2e0-becb528f937e\") " pod="openstack/nova-metadata-0" Oct 07 08:36:38 crc kubenswrapper[5025]: I1007 08:36:38.348218 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/760b67f6-71da-458d-a2e0-becb528f937e-config-data\") pod \"nova-metadata-0\" (UID: \"760b67f6-71da-458d-a2e0-becb528f937e\") " pod="openstack/nova-metadata-0" Oct 07 08:36:38 crc kubenswrapper[5025]: I1007 08:36:38.348285 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/760b67f6-71da-458d-a2e0-becb528f937e-logs\") pod \"nova-metadata-0\" (UID: \"760b67f6-71da-458d-a2e0-becb528f937e\") " pod="openstack/nova-metadata-0" Oct 07 08:36:38 crc kubenswrapper[5025]: I1007 08:36:38.349156 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/760b67f6-71da-458d-a2e0-becb528f937e-logs\") pod \"nova-metadata-0\" (UID: \"760b67f6-71da-458d-a2e0-becb528f937e\") " pod="openstack/nova-metadata-0" Oct 07 08:36:38 crc kubenswrapper[5025]: I1007 08:36:38.353755 5025 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/760b67f6-71da-458d-a2e0-becb528f937e-config-data\") pod \"nova-metadata-0\" (UID: \"760b67f6-71da-458d-a2e0-becb528f937e\") " pod="openstack/nova-metadata-0" Oct 07 08:36:38 crc kubenswrapper[5025]: I1007 08:36:38.354768 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/760b67f6-71da-458d-a2e0-becb528f937e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"760b67f6-71da-458d-a2e0-becb528f937e\") " pod="openstack/nova-metadata-0" Oct 07 08:36:38 crc kubenswrapper[5025]: I1007 08:36:38.359172 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/760b67f6-71da-458d-a2e0-becb528f937e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"760b67f6-71da-458d-a2e0-becb528f937e\") " pod="openstack/nova-metadata-0" Oct 07 08:36:38 crc kubenswrapper[5025]: I1007 08:36:38.366974 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p26gn\" (UniqueName: \"kubernetes.io/projected/760b67f6-71da-458d-a2e0-becb528f937e-kube-api-access-p26gn\") pod \"nova-metadata-0\" (UID: \"760b67f6-71da-458d-a2e0-becb528f937e\") " pod="openstack/nova-metadata-0" Oct 07 08:36:38 crc kubenswrapper[5025]: I1007 08:36:38.473379 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 08:36:38 crc kubenswrapper[5025]: I1007 08:36:38.917699 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 08:36:39 crc kubenswrapper[5025]: I1007 08:36:39.113483 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"760b67f6-71da-458d-a2e0-becb528f937e","Type":"ContainerStarted","Data":"204bf678b02a1a21bd67069d43c288bb0c8c4916d99736daf5b92a2cb9fc7880"} Oct 07 08:36:39 crc kubenswrapper[5025]: I1007 08:36:39.930814 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e2aa2df-b2fa-4dd5-bc17-99644c08a859" path="/var/lib/kubelet/pods/4e2aa2df-b2fa-4dd5-bc17-99644c08a859/volumes" Oct 07 08:36:40 crc kubenswrapper[5025]: I1007 08:36:40.128726 5025 generic.go:334] "Generic (PLEG): container finished" podID="c434c871-bc74-45ae-b5ed-810a796622d9" containerID="d483b8c35a3ab7b95943ae78b74c71128746cf8491cf043bfc10563ed70df854" exitCode=0 Oct 07 08:36:40 crc kubenswrapper[5025]: I1007 08:36:40.128831 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-4mq6r" event={"ID":"c434c871-bc74-45ae-b5ed-810a796622d9","Type":"ContainerDied","Data":"d483b8c35a3ab7b95943ae78b74c71128746cf8491cf043bfc10563ed70df854"} Oct 07 08:36:40 crc kubenswrapper[5025]: I1007 08:36:40.131612 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"760b67f6-71da-458d-a2e0-becb528f937e","Type":"ContainerStarted","Data":"5c904895d034dbcf58f34cfded52dd056a1ac1809173cb93fd082841f9634a61"} Oct 07 08:36:40 crc kubenswrapper[5025]: I1007 08:36:40.131664 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"760b67f6-71da-458d-a2e0-becb528f937e","Type":"ContainerStarted","Data":"08b7bcdf5b056c6cd1550f785a8be0763160fc2703edbe2aa3ebb338811bf238"} Oct 07 08:36:40 crc kubenswrapper[5025]: I1007 08:36:40.135237 
5025 generic.go:334] "Generic (PLEG): container finished" podID="d65b533a-25ed-4f95-a2f8-84ee04697636" containerID="0d717114726acf550f907c6257fff0023c7b2d0e41f234c82b856a0d923017ad" exitCode=0 Oct 07 08:36:40 crc kubenswrapper[5025]: I1007 08:36:40.135275 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-g6spt" event={"ID":"d65b533a-25ed-4f95-a2f8-84ee04697636","Type":"ContainerDied","Data":"0d717114726acf550f907c6257fff0023c7b2d0e41f234c82b856a0d923017ad"} Oct 07 08:36:40 crc kubenswrapper[5025]: I1007 08:36:40.183502 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.183472906 podStartE2EDuration="2.183472906s" podCreationTimestamp="2025-10-07 08:36:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:36:40.172279773 +0000 UTC m=+1206.981593937" watchObservedRunningTime="2025-10-07 08:36:40.183472906 +0000 UTC m=+1206.992787060" Oct 07 08:36:41 crc kubenswrapper[5025]: I1007 08:36:41.620614 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-g6spt" Oct 07 08:36:41 crc kubenswrapper[5025]: I1007 08:36:41.626734 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-4mq6r" Oct 07 08:36:41 crc kubenswrapper[5025]: I1007 08:36:41.662730 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 07 08:36:41 crc kubenswrapper[5025]: I1007 08:36:41.663081 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 07 08:36:41 crc kubenswrapper[5025]: I1007 08:36:41.712256 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d65b533a-25ed-4f95-a2f8-84ee04697636-config-data\") pod \"d65b533a-25ed-4f95-a2f8-84ee04697636\" (UID: \"d65b533a-25ed-4f95-a2f8-84ee04697636\") " Oct 07 08:36:41 crc kubenswrapper[5025]: I1007 08:36:41.712378 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d65b533a-25ed-4f95-a2f8-84ee04697636-scripts\") pod \"d65b533a-25ed-4f95-a2f8-84ee04697636\" (UID: \"d65b533a-25ed-4f95-a2f8-84ee04697636\") " Oct 07 08:36:41 crc kubenswrapper[5025]: I1007 08:36:41.712437 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d65b533a-25ed-4f95-a2f8-84ee04697636-combined-ca-bundle\") pod \"d65b533a-25ed-4f95-a2f8-84ee04697636\" (UID: \"d65b533a-25ed-4f95-a2f8-84ee04697636\") " Oct 07 08:36:41 crc kubenswrapper[5025]: I1007 08:36:41.712455 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtnnk\" (UniqueName: \"kubernetes.io/projected/d65b533a-25ed-4f95-a2f8-84ee04697636-kube-api-access-qtnnk\") pod \"d65b533a-25ed-4f95-a2f8-84ee04697636\" (UID: \"d65b533a-25ed-4f95-a2f8-84ee04697636\") " Oct 07 08:36:41 crc kubenswrapper[5025]: I1007 08:36:41.719026 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/d65b533a-25ed-4f95-a2f8-84ee04697636-scripts" (OuterVolumeSpecName: "scripts") pod "d65b533a-25ed-4f95-a2f8-84ee04697636" (UID: "d65b533a-25ed-4f95-a2f8-84ee04697636"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:36:41 crc kubenswrapper[5025]: I1007 08:36:41.719732 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d65b533a-25ed-4f95-a2f8-84ee04697636-kube-api-access-qtnnk" (OuterVolumeSpecName: "kube-api-access-qtnnk") pod "d65b533a-25ed-4f95-a2f8-84ee04697636" (UID: "d65b533a-25ed-4f95-a2f8-84ee04697636"). InnerVolumeSpecName "kube-api-access-qtnnk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:36:41 crc kubenswrapper[5025]: I1007 08:36:41.719874 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 07 08:36:41 crc kubenswrapper[5025]: I1007 08:36:41.743109 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d65b533a-25ed-4f95-a2f8-84ee04697636-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d65b533a-25ed-4f95-a2f8-84ee04697636" (UID: "d65b533a-25ed-4f95-a2f8-84ee04697636"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:36:41 crc kubenswrapper[5025]: I1007 08:36:41.748746 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 07 08:36:41 crc kubenswrapper[5025]: I1007 08:36:41.760549 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d65b533a-25ed-4f95-a2f8-84ee04697636-config-data" (OuterVolumeSpecName: "config-data") pod "d65b533a-25ed-4f95-a2f8-84ee04697636" (UID: "d65b533a-25ed-4f95-a2f8-84ee04697636"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:36:41 crc kubenswrapper[5025]: I1007 08:36:41.814493 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rdxl\" (UniqueName: \"kubernetes.io/projected/c434c871-bc74-45ae-b5ed-810a796622d9-kube-api-access-7rdxl\") pod \"c434c871-bc74-45ae-b5ed-810a796622d9\" (UID: \"c434c871-bc74-45ae-b5ed-810a796622d9\") " Oct 07 08:36:41 crc kubenswrapper[5025]: I1007 08:36:41.814565 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c434c871-bc74-45ae-b5ed-810a796622d9-scripts\") pod \"c434c871-bc74-45ae-b5ed-810a796622d9\" (UID: \"c434c871-bc74-45ae-b5ed-810a796622d9\") " Oct 07 08:36:41 crc kubenswrapper[5025]: I1007 08:36:41.814779 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c434c871-bc74-45ae-b5ed-810a796622d9-combined-ca-bundle\") pod \"c434c871-bc74-45ae-b5ed-810a796622d9\" (UID: \"c434c871-bc74-45ae-b5ed-810a796622d9\") " Oct 07 08:36:41 crc kubenswrapper[5025]: I1007 08:36:41.814822 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c434c871-bc74-45ae-b5ed-810a796622d9-config-data\") pod \"c434c871-bc74-45ae-b5ed-810a796622d9\" (UID: \"c434c871-bc74-45ae-b5ed-810a796622d9\") " Oct 07 08:36:41 crc kubenswrapper[5025]: I1007 08:36:41.815464 5025 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d65b533a-25ed-4f95-a2f8-84ee04697636-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:41 crc kubenswrapper[5025]: I1007 08:36:41.816065 5025 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d65b533a-25ed-4f95-a2f8-84ee04697636-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:41 crc 
kubenswrapper[5025]: I1007 08:36:41.816107 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d65b533a-25ed-4f95-a2f8-84ee04697636-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:41 crc kubenswrapper[5025]: I1007 08:36:41.816125 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtnnk\" (UniqueName: \"kubernetes.io/projected/d65b533a-25ed-4f95-a2f8-84ee04697636-kube-api-access-qtnnk\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:41 crc kubenswrapper[5025]: I1007 08:36:41.819107 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c434c871-bc74-45ae-b5ed-810a796622d9-scripts" (OuterVolumeSpecName: "scripts") pod "c434c871-bc74-45ae-b5ed-810a796622d9" (UID: "c434c871-bc74-45ae-b5ed-810a796622d9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:36:41 crc kubenswrapper[5025]: I1007 08:36:41.822866 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c434c871-bc74-45ae-b5ed-810a796622d9-kube-api-access-7rdxl" (OuterVolumeSpecName: "kube-api-access-7rdxl") pod "c434c871-bc74-45ae-b5ed-810a796622d9" (UID: "c434c871-bc74-45ae-b5ed-810a796622d9"). InnerVolumeSpecName "kube-api-access-7rdxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:36:41 crc kubenswrapper[5025]: I1007 08:36:41.845042 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c434c871-bc74-45ae-b5ed-810a796622d9-config-data" (OuterVolumeSpecName: "config-data") pod "c434c871-bc74-45ae-b5ed-810a796622d9" (UID: "c434c871-bc74-45ae-b5ed-810a796622d9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:36:41 crc kubenswrapper[5025]: I1007 08:36:41.858123 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c434c871-bc74-45ae-b5ed-810a796622d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c434c871-bc74-45ae-b5ed-810a796622d9" (UID: "c434c871-bc74-45ae-b5ed-810a796622d9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:36:41 crc kubenswrapper[5025]: I1007 08:36:41.917559 5025 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c434c871-bc74-45ae-b5ed-810a796622d9-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:41 crc kubenswrapper[5025]: I1007 08:36:41.917606 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rdxl\" (UniqueName: \"kubernetes.io/projected/c434c871-bc74-45ae-b5ed-810a796622d9-kube-api-access-7rdxl\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:41 crc kubenswrapper[5025]: I1007 08:36:41.917619 5025 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c434c871-bc74-45ae-b5ed-810a796622d9-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:41 crc kubenswrapper[5025]: I1007 08:36:41.917630 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c434c871-bc74-45ae-b5ed-810a796622d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:41 crc kubenswrapper[5025]: I1007 08:36:41.971739 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-865f5d856f-g44d8" Oct 07 08:36:42 crc kubenswrapper[5025]: I1007 08:36:42.040469 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-s92q7"] Oct 07 08:36:42 crc kubenswrapper[5025]: I1007 08:36:42.040715 5025 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bb4fc677f-s92q7" podUID="9263b87e-395f-4a9f-b819-846f1378bd73" containerName="dnsmasq-dns" containerID="cri-o://cb3691f244de4ca0830a089b6123949af76c90d82ba99d1f25155836f8070d32" gracePeriod=10 Oct 07 08:36:42 crc kubenswrapper[5025]: I1007 08:36:42.156061 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-g6spt" event={"ID":"d65b533a-25ed-4f95-a2f8-84ee04697636","Type":"ContainerDied","Data":"3efbf2f061835309be1e74a8421d8aef30befca5d6cf32352b89730df46e9c53"} Oct 07 08:36:42 crc kubenswrapper[5025]: I1007 08:36:42.156361 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3efbf2f061835309be1e74a8421d8aef30befca5d6cf32352b89730df46e9c53" Oct 07 08:36:42 crc kubenswrapper[5025]: I1007 08:36:42.156121 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-g6spt" Oct 07 08:36:42 crc kubenswrapper[5025]: I1007 08:36:42.158336 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-4mq6r" event={"ID":"c434c871-bc74-45ae-b5ed-810a796622d9","Type":"ContainerDied","Data":"c19293377c6d040839b6abacdc549885f8126bc7ac3fec7176e13184ff078a8e"} Oct 07 08:36:42 crc kubenswrapper[5025]: I1007 08:36:42.158368 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-4mq6r" Oct 07 08:36:42 crc kubenswrapper[5025]: I1007 08:36:42.158366 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c19293377c6d040839b6abacdc549885f8126bc7ac3fec7176e13184ff078a8e" Oct 07 08:36:42 crc kubenswrapper[5025]: I1007 08:36:42.201933 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 07 08:36:42 crc kubenswrapper[5025]: I1007 08:36:42.296167 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 07 08:36:42 crc kubenswrapper[5025]: E1007 08:36:42.296656 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c434c871-bc74-45ae-b5ed-810a796622d9" containerName="nova-manage" Oct 07 08:36:42 crc kubenswrapper[5025]: I1007 08:36:42.296674 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="c434c871-bc74-45ae-b5ed-810a796622d9" containerName="nova-manage" Oct 07 08:36:42 crc kubenswrapper[5025]: E1007 08:36:42.296714 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d65b533a-25ed-4f95-a2f8-84ee04697636" containerName="nova-cell1-conductor-db-sync" Oct 07 08:36:42 crc kubenswrapper[5025]: I1007 08:36:42.296722 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="d65b533a-25ed-4f95-a2f8-84ee04697636" containerName="nova-cell1-conductor-db-sync" Oct 07 08:36:42 crc kubenswrapper[5025]: I1007 08:36:42.296903 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="c434c871-bc74-45ae-b5ed-810a796622d9" containerName="nova-manage" Oct 07 08:36:42 crc kubenswrapper[5025]: I1007 08:36:42.296923 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="d65b533a-25ed-4f95-a2f8-84ee04697636" containerName="nova-cell1-conductor-db-sync" Oct 07 08:36:42 crc kubenswrapper[5025]: I1007 08:36:42.297550 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 07 08:36:42 crc kubenswrapper[5025]: I1007 08:36:42.304623 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 07 08:36:42 crc kubenswrapper[5025]: I1007 08:36:42.309745 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 07 08:36:42 crc kubenswrapper[5025]: I1007 08:36:42.429636 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5db22b1d-c7bd-4bc9-b30b-3271da7741e1-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"5db22b1d-c7bd-4bc9-b30b-3271da7741e1\") " pod="openstack/nova-cell1-conductor-0" Oct 07 08:36:42 crc kubenswrapper[5025]: I1007 08:36:42.430029 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27dfr\" (UniqueName: \"kubernetes.io/projected/5db22b1d-c7bd-4bc9-b30b-3271da7741e1-kube-api-access-27dfr\") pod \"nova-cell1-conductor-0\" (UID: \"5db22b1d-c7bd-4bc9-b30b-3271da7741e1\") " pod="openstack/nova-cell1-conductor-0" Oct 07 08:36:42 crc kubenswrapper[5025]: I1007 08:36:42.430090 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5db22b1d-c7bd-4bc9-b30b-3271da7741e1-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"5db22b1d-c7bd-4bc9-b30b-3271da7741e1\") " pod="openstack/nova-cell1-conductor-0" Oct 07 08:36:42 crc kubenswrapper[5025]: I1007 08:36:42.437425 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 07 08:36:42 crc kubenswrapper[5025]: I1007 08:36:42.437634 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="411bbb25-18d9-4957-afff-2ebcd6866aeb" containerName="nova-api-log" 
containerID="cri-o://29cbed0d4183e69c462b96688d81104df2f4003866f56cfcfbd6f94622bd80c1" gracePeriod=30 Oct 07 08:36:42 crc kubenswrapper[5025]: I1007 08:36:42.438032 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="411bbb25-18d9-4957-afff-2ebcd6866aeb" containerName="nova-api-api" containerID="cri-o://602618541b175c2a97eb8d75f28dbf882389261fee09f23050a909a8840e022e" gracePeriod=30 Oct 07 08:36:42 crc kubenswrapper[5025]: I1007 08:36:42.451133 5025 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="411bbb25-18d9-4957-afff-2ebcd6866aeb" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.184:8774/\": EOF" Oct 07 08:36:42 crc kubenswrapper[5025]: I1007 08:36:42.453778 5025 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="411bbb25-18d9-4957-afff-2ebcd6866aeb" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.184:8774/\": EOF" Oct 07 08:36:42 crc kubenswrapper[5025]: I1007 08:36:42.457091 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 08:36:42 crc kubenswrapper[5025]: I1007 08:36:42.457847 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="760b67f6-71da-458d-a2e0-becb528f937e" containerName="nova-metadata-metadata" containerID="cri-o://5c904895d034dbcf58f34cfded52dd056a1ac1809173cb93fd082841f9634a61" gracePeriod=30 Oct 07 08:36:42 crc kubenswrapper[5025]: I1007 08:36:42.458786 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="760b67f6-71da-458d-a2e0-becb528f937e" containerName="nova-metadata-log" containerID="cri-o://08b7bcdf5b056c6cd1550f785a8be0763160fc2703edbe2aa3ebb338811bf238" gracePeriod=30 Oct 07 08:36:42 crc kubenswrapper[5025]: I1007 08:36:42.531344 5025 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5db22b1d-c7bd-4bc9-b30b-3271da7741e1-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"5db22b1d-c7bd-4bc9-b30b-3271da7741e1\") " pod="openstack/nova-cell1-conductor-0" Oct 07 08:36:42 crc kubenswrapper[5025]: I1007 08:36:42.531395 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27dfr\" (UniqueName: \"kubernetes.io/projected/5db22b1d-c7bd-4bc9-b30b-3271da7741e1-kube-api-access-27dfr\") pod \"nova-cell1-conductor-0\" (UID: \"5db22b1d-c7bd-4bc9-b30b-3271da7741e1\") " pod="openstack/nova-cell1-conductor-0" Oct 07 08:36:42 crc kubenswrapper[5025]: I1007 08:36:42.531438 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5db22b1d-c7bd-4bc9-b30b-3271da7741e1-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"5db22b1d-c7bd-4bc9-b30b-3271da7741e1\") " pod="openstack/nova-cell1-conductor-0" Oct 07 08:36:42 crc kubenswrapper[5025]: I1007 08:36:42.540876 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5db22b1d-c7bd-4bc9-b30b-3271da7741e1-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"5db22b1d-c7bd-4bc9-b30b-3271da7741e1\") " pod="openstack/nova-cell1-conductor-0" Oct 07 08:36:42 crc kubenswrapper[5025]: I1007 08:36:42.550105 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5db22b1d-c7bd-4bc9-b30b-3271da7741e1-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"5db22b1d-c7bd-4bc9-b30b-3271da7741e1\") " pod="openstack/nova-cell1-conductor-0" Oct 07 08:36:42 crc kubenswrapper[5025]: I1007 08:36:42.550261 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27dfr\" (UniqueName: 
\"kubernetes.io/projected/5db22b1d-c7bd-4bc9-b30b-3271da7741e1-kube-api-access-27dfr\") pod \"nova-cell1-conductor-0\" (UID: \"5db22b1d-c7bd-4bc9-b30b-3271da7741e1\") " pod="openstack/nova-cell1-conductor-0" Oct 07 08:36:42 crc kubenswrapper[5025]: I1007 08:36:42.615597 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 07 08:36:42 crc kubenswrapper[5025]: I1007 08:36:42.706848 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 08:36:42 crc kubenswrapper[5025]: I1007 08:36:42.816101 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-s92q7" Oct 07 08:36:42 crc kubenswrapper[5025]: I1007 08:36:42.939168 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9263b87e-395f-4a9f-b819-846f1378bd73-config\") pod \"9263b87e-395f-4a9f-b819-846f1378bd73\" (UID: \"9263b87e-395f-4a9f-b819-846f1378bd73\") " Oct 07 08:36:42 crc kubenswrapper[5025]: I1007 08:36:42.939235 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9263b87e-395f-4a9f-b819-846f1378bd73-ovsdbserver-sb\") pod \"9263b87e-395f-4a9f-b819-846f1378bd73\" (UID: \"9263b87e-395f-4a9f-b819-846f1378bd73\") " Oct 07 08:36:42 crc kubenswrapper[5025]: I1007 08:36:42.939365 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9263b87e-395f-4a9f-b819-846f1378bd73-dns-svc\") pod \"9263b87e-395f-4a9f-b819-846f1378bd73\" (UID: \"9263b87e-395f-4a9f-b819-846f1378bd73\") " Oct 07 08:36:42 crc kubenswrapper[5025]: I1007 08:36:42.939389 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/9263b87e-395f-4a9f-b819-846f1378bd73-dns-swift-storage-0\") pod \"9263b87e-395f-4a9f-b819-846f1378bd73\" (UID: \"9263b87e-395f-4a9f-b819-846f1378bd73\") " Oct 07 08:36:42 crc kubenswrapper[5025]: I1007 08:36:42.939413 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rv2q4\" (UniqueName: \"kubernetes.io/projected/9263b87e-395f-4a9f-b819-846f1378bd73-kube-api-access-rv2q4\") pod \"9263b87e-395f-4a9f-b819-846f1378bd73\" (UID: \"9263b87e-395f-4a9f-b819-846f1378bd73\") " Oct 07 08:36:42 crc kubenswrapper[5025]: I1007 08:36:42.939534 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9263b87e-395f-4a9f-b819-846f1378bd73-ovsdbserver-nb\") pod \"9263b87e-395f-4a9f-b819-846f1378bd73\" (UID: \"9263b87e-395f-4a9f-b819-846f1378bd73\") " Oct 07 08:36:42 crc kubenswrapper[5025]: I1007 08:36:42.960962 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9263b87e-395f-4a9f-b819-846f1378bd73-kube-api-access-rv2q4" (OuterVolumeSpecName: "kube-api-access-rv2q4") pod "9263b87e-395f-4a9f-b819-846f1378bd73" (UID: "9263b87e-395f-4a9f-b819-846f1378bd73"). InnerVolumeSpecName "kube-api-access-rv2q4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:36:42 crc kubenswrapper[5025]: I1007 08:36:42.994500 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9263b87e-395f-4a9f-b819-846f1378bd73-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9263b87e-395f-4a9f-b819-846f1378bd73" (UID: "9263b87e-395f-4a9f-b819-846f1378bd73"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.004217 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9263b87e-395f-4a9f-b819-846f1378bd73-config" (OuterVolumeSpecName: "config") pod "9263b87e-395f-4a9f-b819-846f1378bd73" (UID: "9263b87e-395f-4a9f-b819-846f1378bd73"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.016885 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9263b87e-395f-4a9f-b819-846f1378bd73-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9263b87e-395f-4a9f-b819-846f1378bd73" (UID: "9263b87e-395f-4a9f-b819-846f1378bd73"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.028018 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9263b87e-395f-4a9f-b819-846f1378bd73-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9263b87e-395f-4a9f-b819-846f1378bd73" (UID: "9263b87e-395f-4a9f-b819-846f1378bd73"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.039422 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9263b87e-395f-4a9f-b819-846f1378bd73-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9263b87e-395f-4a9f-b819-846f1378bd73" (UID: "9263b87e-395f-4a9f-b819-846f1378bd73"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.043175 5025 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9263b87e-395f-4a9f-b819-846f1378bd73-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.043282 5025 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9263b87e-395f-4a9f-b819-846f1378bd73-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.043374 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rv2q4\" (UniqueName: \"kubernetes.io/projected/9263b87e-395f-4a9f-b819-846f1378bd73-kube-api-access-rv2q4\") on node \"crc\" DevicePath \"\""
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.043433 5025 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9263b87e-395f-4a9f-b819-846f1378bd73-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.043491 5025 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9263b87e-395f-4a9f-b819-846f1378bd73-config\") on node \"crc\" DevicePath \"\""
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.043616 5025 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9263b87e-395f-4a9f-b819-846f1378bd73-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.072642 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.117625 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.167063 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"5db22b1d-c7bd-4bc9-b30b-3271da7741e1","Type":"ContainerStarted","Data":"c0282852894896a47589fb5aca382e189ee07ceaa5e091b0655bf7af1b48f04c"}
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.170473 5025 generic.go:334] "Generic (PLEG): container finished" podID="9263b87e-395f-4a9f-b819-846f1378bd73" containerID="cb3691f244de4ca0830a089b6123949af76c90d82ba99d1f25155836f8070d32" exitCode=0
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.170562 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-s92q7"
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.170571 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-s92q7" event={"ID":"9263b87e-395f-4a9f-b819-846f1378bd73","Type":"ContainerDied","Data":"cb3691f244de4ca0830a089b6123949af76c90d82ba99d1f25155836f8070d32"}
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.170617 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-s92q7" event={"ID":"9263b87e-395f-4a9f-b819-846f1378bd73","Type":"ContainerDied","Data":"e654ea0b6a9e2505438ed37de230619e5bd152d84c17d252c622c186a85d6166"}
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.170634 5025 scope.go:117] "RemoveContainer" containerID="cb3691f244de4ca0830a089b6123949af76c90d82ba99d1f25155836f8070d32"
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.174671 5025 generic.go:334] "Generic (PLEG): container finished" podID="411bbb25-18d9-4957-afff-2ebcd6866aeb" containerID="29cbed0d4183e69c462b96688d81104df2f4003866f56cfcfbd6f94622bd80c1" exitCode=143
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.174744 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"411bbb25-18d9-4957-afff-2ebcd6866aeb","Type":"ContainerDied","Data":"29cbed0d4183e69c462b96688d81104df2f4003866f56cfcfbd6f94622bd80c1"}
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.179482 5025 generic.go:334] "Generic (PLEG): container finished" podID="760b67f6-71da-458d-a2e0-becb528f937e" containerID="5c904895d034dbcf58f34cfded52dd056a1ac1809173cb93fd082841f9634a61" exitCode=0
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.179513 5025 generic.go:334] "Generic (PLEG): container finished" podID="760b67f6-71da-458d-a2e0-becb528f937e" containerID="08b7bcdf5b056c6cd1550f785a8be0763160fc2703edbe2aa3ebb338811bf238" exitCode=143
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.180358 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.180954 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"760b67f6-71da-458d-a2e0-becb528f937e","Type":"ContainerDied","Data":"5c904895d034dbcf58f34cfded52dd056a1ac1809173cb93fd082841f9634a61"}
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.180984 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"760b67f6-71da-458d-a2e0-becb528f937e","Type":"ContainerDied","Data":"08b7bcdf5b056c6cd1550f785a8be0763160fc2703edbe2aa3ebb338811bf238"}
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.180995 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"760b67f6-71da-458d-a2e0-becb528f937e","Type":"ContainerDied","Data":"204bf678b02a1a21bd67069d43c288bb0c8c4916d99736daf5b92a2cb9fc7880"}
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.196027 5025 scope.go:117] "RemoveContainer" containerID="a9f0c9da89efd03e3fb2219e89987c9afaa09824f09fb6aa774118f1181cd71e"
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.226086 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-s92q7"]
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.233534 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-s92q7"]
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.237902 5025 scope.go:117] "RemoveContainer" containerID="cb3691f244de4ca0830a089b6123949af76c90d82ba99d1f25155836f8070d32"
Oct 07 08:36:43 crc kubenswrapper[5025]: E1007 08:36:43.238341 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb3691f244de4ca0830a089b6123949af76c90d82ba99d1f25155836f8070d32\": container with ID starting with cb3691f244de4ca0830a089b6123949af76c90d82ba99d1f25155836f8070d32 not found: ID does not exist" containerID="cb3691f244de4ca0830a089b6123949af76c90d82ba99d1f25155836f8070d32"
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.238369 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb3691f244de4ca0830a089b6123949af76c90d82ba99d1f25155836f8070d32"} err="failed to get container status \"cb3691f244de4ca0830a089b6123949af76c90d82ba99d1f25155836f8070d32\": rpc error: code = NotFound desc = could not find container \"cb3691f244de4ca0830a089b6123949af76c90d82ba99d1f25155836f8070d32\": container with ID starting with cb3691f244de4ca0830a089b6123949af76c90d82ba99d1f25155836f8070d32 not found: ID does not exist"
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.238390 5025 scope.go:117] "RemoveContainer" containerID="a9f0c9da89efd03e3fb2219e89987c9afaa09824f09fb6aa774118f1181cd71e"
Oct 07 08:36:43 crc kubenswrapper[5025]: E1007 08:36:43.238779 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9f0c9da89efd03e3fb2219e89987c9afaa09824f09fb6aa774118f1181cd71e\": container with ID starting with a9f0c9da89efd03e3fb2219e89987c9afaa09824f09fb6aa774118f1181cd71e not found: ID does not exist" containerID="a9f0c9da89efd03e3fb2219e89987c9afaa09824f09fb6aa774118f1181cd71e"
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.238795 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9f0c9da89efd03e3fb2219e89987c9afaa09824f09fb6aa774118f1181cd71e"} err="failed to get container status \"a9f0c9da89efd03e3fb2219e89987c9afaa09824f09fb6aa774118f1181cd71e\": rpc error: code = NotFound desc = could not find container \"a9f0c9da89efd03e3fb2219e89987c9afaa09824f09fb6aa774118f1181cd71e\": container with ID starting with a9f0c9da89efd03e3fb2219e89987c9afaa09824f09fb6aa774118f1181cd71e not found: ID does not exist"
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.238810 5025 scope.go:117] "RemoveContainer" containerID="5c904895d034dbcf58f34cfded52dd056a1ac1809173cb93fd082841f9634a61"
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.246455 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/760b67f6-71da-458d-a2e0-becb528f937e-config-data\") pod \"760b67f6-71da-458d-a2e0-becb528f937e\" (UID: \"760b67f6-71da-458d-a2e0-becb528f937e\") "
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.246622 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/760b67f6-71da-458d-a2e0-becb528f937e-nova-metadata-tls-certs\") pod \"760b67f6-71da-458d-a2e0-becb528f937e\" (UID: \"760b67f6-71da-458d-a2e0-becb528f937e\") "
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.247832 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/760b67f6-71da-458d-a2e0-becb528f937e-combined-ca-bundle\") pod \"760b67f6-71da-458d-a2e0-becb528f937e\" (UID: \"760b67f6-71da-458d-a2e0-becb528f937e\") "
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.247887 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p26gn\" (UniqueName: \"kubernetes.io/projected/760b67f6-71da-458d-a2e0-becb528f937e-kube-api-access-p26gn\") pod \"760b67f6-71da-458d-a2e0-becb528f937e\" (UID: \"760b67f6-71da-458d-a2e0-becb528f937e\") "
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.247970 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/760b67f6-71da-458d-a2e0-becb528f937e-logs\") pod \"760b67f6-71da-458d-a2e0-becb528f937e\" (UID: \"760b67f6-71da-458d-a2e0-becb528f937e\") "
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.248438 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/760b67f6-71da-458d-a2e0-becb528f937e-logs" (OuterVolumeSpecName: "logs") pod "760b67f6-71da-458d-a2e0-becb528f937e" (UID: "760b67f6-71da-458d-a2e0-becb528f937e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.249387 5025 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/760b67f6-71da-458d-a2e0-becb528f937e-logs\") on node \"crc\" DevicePath \"\""
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.250417 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/760b67f6-71da-458d-a2e0-becb528f937e-kube-api-access-p26gn" (OuterVolumeSpecName: "kube-api-access-p26gn") pod "760b67f6-71da-458d-a2e0-becb528f937e" (UID: "760b67f6-71da-458d-a2e0-becb528f937e"). InnerVolumeSpecName "kube-api-access-p26gn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.265891 5025 scope.go:117] "RemoveContainer" containerID="08b7bcdf5b056c6cd1550f785a8be0763160fc2703edbe2aa3ebb338811bf238"
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.273264 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/760b67f6-71da-458d-a2e0-becb528f937e-config-data" (OuterVolumeSpecName: "config-data") pod "760b67f6-71da-458d-a2e0-becb528f937e" (UID: "760b67f6-71da-458d-a2e0-becb528f937e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.278992 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/760b67f6-71da-458d-a2e0-becb528f937e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "760b67f6-71da-458d-a2e0-becb528f937e" (UID: "760b67f6-71da-458d-a2e0-becb528f937e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.292296 5025 scope.go:117] "RemoveContainer" containerID="5c904895d034dbcf58f34cfded52dd056a1ac1809173cb93fd082841f9634a61"
Oct 07 08:36:43 crc kubenswrapper[5025]: E1007 08:36:43.292795 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c904895d034dbcf58f34cfded52dd056a1ac1809173cb93fd082841f9634a61\": container with ID starting with 5c904895d034dbcf58f34cfded52dd056a1ac1809173cb93fd082841f9634a61 not found: ID does not exist" containerID="5c904895d034dbcf58f34cfded52dd056a1ac1809173cb93fd082841f9634a61"
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.292837 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c904895d034dbcf58f34cfded52dd056a1ac1809173cb93fd082841f9634a61"} err="failed to get container status \"5c904895d034dbcf58f34cfded52dd056a1ac1809173cb93fd082841f9634a61\": rpc error: code = NotFound desc = could not find container \"5c904895d034dbcf58f34cfded52dd056a1ac1809173cb93fd082841f9634a61\": container with ID starting with 5c904895d034dbcf58f34cfded52dd056a1ac1809173cb93fd082841f9634a61 not found: ID does not exist"
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.292865 5025 scope.go:117] "RemoveContainer" containerID="08b7bcdf5b056c6cd1550f785a8be0763160fc2703edbe2aa3ebb338811bf238"
Oct 07 08:36:43 crc kubenswrapper[5025]: E1007 08:36:43.293210 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08b7bcdf5b056c6cd1550f785a8be0763160fc2703edbe2aa3ebb338811bf238\": container with ID starting with 08b7bcdf5b056c6cd1550f785a8be0763160fc2703edbe2aa3ebb338811bf238 not found: ID does not exist" containerID="08b7bcdf5b056c6cd1550f785a8be0763160fc2703edbe2aa3ebb338811bf238"
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.293242 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08b7bcdf5b056c6cd1550f785a8be0763160fc2703edbe2aa3ebb338811bf238"} err="failed to get container status \"08b7bcdf5b056c6cd1550f785a8be0763160fc2703edbe2aa3ebb338811bf238\": rpc error: code = NotFound desc = could not find container \"08b7bcdf5b056c6cd1550f785a8be0763160fc2703edbe2aa3ebb338811bf238\": container with ID starting with 08b7bcdf5b056c6cd1550f785a8be0763160fc2703edbe2aa3ebb338811bf238 not found: ID does not exist"
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.293262 5025 scope.go:117] "RemoveContainer" containerID="5c904895d034dbcf58f34cfded52dd056a1ac1809173cb93fd082841f9634a61"
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.293493 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c904895d034dbcf58f34cfded52dd056a1ac1809173cb93fd082841f9634a61"} err="failed to get container status \"5c904895d034dbcf58f34cfded52dd056a1ac1809173cb93fd082841f9634a61\": rpc error: code = NotFound desc = could not find container \"5c904895d034dbcf58f34cfded52dd056a1ac1809173cb93fd082841f9634a61\": container with ID starting with 5c904895d034dbcf58f34cfded52dd056a1ac1809173cb93fd082841f9634a61 not found: ID does not exist"
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.293515 5025 scope.go:117] "RemoveContainer" containerID="08b7bcdf5b056c6cd1550f785a8be0763160fc2703edbe2aa3ebb338811bf238"
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.294052 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08b7bcdf5b056c6cd1550f785a8be0763160fc2703edbe2aa3ebb338811bf238"} err="failed to get container status \"08b7bcdf5b056c6cd1550f785a8be0763160fc2703edbe2aa3ebb338811bf238\": rpc error: code = NotFound desc = could not find container \"08b7bcdf5b056c6cd1550f785a8be0763160fc2703edbe2aa3ebb338811bf238\": container with ID starting with 08b7bcdf5b056c6cd1550f785a8be0763160fc2703edbe2aa3ebb338811bf238 not found: ID does not exist"
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.310411 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/760b67f6-71da-458d-a2e0-becb528f937e-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "760b67f6-71da-458d-a2e0-becb528f937e" (UID: "760b67f6-71da-458d-a2e0-becb528f937e"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.351006 5025 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/760b67f6-71da-458d-a2e0-becb528f937e-config-data\") on node \"crc\" DevicePath \"\""
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.351065 5025 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/760b67f6-71da-458d-a2e0-becb528f937e-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.351078 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/760b67f6-71da-458d-a2e0-becb528f937e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.351089 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p26gn\" (UniqueName: \"kubernetes.io/projected/760b67f6-71da-458d-a2e0-becb528f937e-kube-api-access-p26gn\") on node \"crc\" DevicePath \"\""
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.552036 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.559983 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.581192 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Oct 07 08:36:43 crc kubenswrapper[5025]: E1007 08:36:43.581633 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9263b87e-395f-4a9f-b819-846f1378bd73" containerName="init"
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.581653 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="9263b87e-395f-4a9f-b819-846f1378bd73" containerName="init"
Oct 07 08:36:43 crc kubenswrapper[5025]: E1007 08:36:43.581671 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9263b87e-395f-4a9f-b819-846f1378bd73" containerName="dnsmasq-dns"
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.581679 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="9263b87e-395f-4a9f-b819-846f1378bd73" containerName="dnsmasq-dns"
Oct 07 08:36:43 crc kubenswrapper[5025]: E1007 08:36:43.581725 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="760b67f6-71da-458d-a2e0-becb528f937e" containerName="nova-metadata-metadata"
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.581732 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="760b67f6-71da-458d-a2e0-becb528f937e" containerName="nova-metadata-metadata"
Oct 07 08:36:43 crc kubenswrapper[5025]: E1007 08:36:43.581750 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="760b67f6-71da-458d-a2e0-becb528f937e" containerName="nova-metadata-log"
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.581756 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="760b67f6-71da-458d-a2e0-becb528f937e" containerName="nova-metadata-log"
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.581931 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="9263b87e-395f-4a9f-b819-846f1378bd73" containerName="dnsmasq-dns"
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.581951 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="760b67f6-71da-458d-a2e0-becb528f937e" containerName="nova-metadata-log"
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.581963 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="760b67f6-71da-458d-a2e0-becb528f937e" containerName="nova-metadata-metadata"
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.582938 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.586366 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.586575 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.609265 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.758764 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fcd471e2-f72d-4cef-af5c-b5624faaae95-logs\") pod \"nova-metadata-0\" (UID: \"fcd471e2-f72d-4cef-af5c-b5624faaae95\") " pod="openstack/nova-metadata-0"
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.758842 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xml5f\" (UniqueName: \"kubernetes.io/projected/fcd471e2-f72d-4cef-af5c-b5624faaae95-kube-api-access-xml5f\") pod \"nova-metadata-0\" (UID: \"fcd471e2-f72d-4cef-af5c-b5624faaae95\") " pod="openstack/nova-metadata-0"
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.758949 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcd471e2-f72d-4cef-af5c-b5624faaae95-config-data\") pod \"nova-metadata-0\" (UID: \"fcd471e2-f72d-4cef-af5c-b5624faaae95\") " pod="openstack/nova-metadata-0"
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.758977 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcd471e2-f72d-4cef-af5c-b5624faaae95-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fcd471e2-f72d-4cef-af5c-b5624faaae95\") " pod="openstack/nova-metadata-0"
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.759035 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcd471e2-f72d-4cef-af5c-b5624faaae95-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fcd471e2-f72d-4cef-af5c-b5624faaae95\") " pod="openstack/nova-metadata-0"
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.860332 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcd471e2-f72d-4cef-af5c-b5624faaae95-config-data\") pod \"nova-metadata-0\" (UID: \"fcd471e2-f72d-4cef-af5c-b5624faaae95\") " pod="openstack/nova-metadata-0"
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.860674 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcd471e2-f72d-4cef-af5c-b5624faaae95-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fcd471e2-f72d-4cef-af5c-b5624faaae95\") " pod="openstack/nova-metadata-0"
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.860733 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcd471e2-f72d-4cef-af5c-b5624faaae95-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fcd471e2-f72d-4cef-af5c-b5624faaae95\") " pod="openstack/nova-metadata-0"
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.860789 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fcd471e2-f72d-4cef-af5c-b5624faaae95-logs\") pod \"nova-metadata-0\" (UID: \"fcd471e2-f72d-4cef-af5c-b5624faaae95\") " pod="openstack/nova-metadata-0"
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.860825 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xml5f\" (UniqueName: \"kubernetes.io/projected/fcd471e2-f72d-4cef-af5c-b5624faaae95-kube-api-access-xml5f\") pod \"nova-metadata-0\" (UID: \"fcd471e2-f72d-4cef-af5c-b5624faaae95\") " pod="openstack/nova-metadata-0"
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.861280 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fcd471e2-f72d-4cef-af5c-b5624faaae95-logs\") pod \"nova-metadata-0\" (UID: \"fcd471e2-f72d-4cef-af5c-b5624faaae95\") " pod="openstack/nova-metadata-0"
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.864720 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcd471e2-f72d-4cef-af5c-b5624faaae95-config-data\") pod \"nova-metadata-0\" (UID: \"fcd471e2-f72d-4cef-af5c-b5624faaae95\") " pod="openstack/nova-metadata-0"
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.865227 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcd471e2-f72d-4cef-af5c-b5624faaae95-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fcd471e2-f72d-4cef-af5c-b5624faaae95\") " pod="openstack/nova-metadata-0"
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.865420 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcd471e2-f72d-4cef-af5c-b5624faaae95-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fcd471e2-f72d-4cef-af5c-b5624faaae95\") " pod="openstack/nova-metadata-0"
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.878052 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xml5f\" (UniqueName: \"kubernetes.io/projected/fcd471e2-f72d-4cef-af5c-b5624faaae95-kube-api-access-xml5f\") pod \"nova-metadata-0\" (UID: \"fcd471e2-f72d-4cef-af5c-b5624faaae95\") " pod="openstack/nova-metadata-0"
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.900252 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.938950 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="760b67f6-71da-458d-a2e0-becb528f937e" path="/var/lib/kubelet/pods/760b67f6-71da-458d-a2e0-becb528f937e/volumes"
Oct 07 08:36:43 crc kubenswrapper[5025]: I1007 08:36:43.939793 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9263b87e-395f-4a9f-b819-846f1378bd73" path="/var/lib/kubelet/pods/9263b87e-395f-4a9f-b819-846f1378bd73/volumes"
Oct 07 08:36:44 crc kubenswrapper[5025]: I1007 08:36:44.193596 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"5db22b1d-c7bd-4bc9-b30b-3271da7741e1","Type":"ContainerStarted","Data":"fe88f33ff0949c6497d1f403bf9b543b6f9b6a8feba3d54578e33f486413a22f"}
Oct 07 08:36:44 crc kubenswrapper[5025]: I1007 08:36:44.195933 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Oct 07 08:36:44 crc kubenswrapper[5025]: I1007 08:36:44.200612 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="986412ee-2fb7-413a-a1d3-355bfc0ac0ec" containerName="nova-scheduler-scheduler" containerID="cri-o://ac3a6ddf633bb321f14e05307ef692efffdb6eadfd68a3f22759a6e73282eca5" gracePeriod=30
Oct 07 08:36:44 crc kubenswrapper[5025]: I1007 08:36:44.215074 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.215053554 podStartE2EDuration="2.215053554s" podCreationTimestamp="2025-10-07 08:36:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:36:44.207099483 +0000 UTC m=+1211.016413627" watchObservedRunningTime="2025-10-07 08:36:44.215053554 +0000 UTC m=+1211.024367698"
Oct 07 08:36:44 crc kubenswrapper[5025]: W1007 08:36:44.373948 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfcd471e2_f72d_4cef_af5c_b5624faaae95.slice/crio-bf300f47187f1d625c0fcce2082883114cad261aa9cc11730d93608e360b6a58 WatchSource:0}: Error finding container bf300f47187f1d625c0fcce2082883114cad261aa9cc11730d93608e360b6a58: Status 404 returned error can't find the container with id bf300f47187f1d625c0fcce2082883114cad261aa9cc11730d93608e360b6a58
Oct 07 08:36:44 crc kubenswrapper[5025]: I1007 08:36:44.381005 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 07 08:36:45 crc kubenswrapper[5025]: I1007 08:36:45.212168 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fcd471e2-f72d-4cef-af5c-b5624faaae95","Type":"ContainerStarted","Data":"68ddf4f70dba9f38c339070fcd4cc2e430e2674bfdce889789582e525fe6fb4c"}
Oct 07 08:36:45 crc kubenswrapper[5025]: I1007 08:36:45.212518 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fcd471e2-f72d-4cef-af5c-b5624faaae95","Type":"ContainerStarted","Data":"d4794fecf5365ce5fef677e52eb3242e5364383743171e19431b5709890212dd"}
Oct 07 08:36:45 crc kubenswrapper[5025]: I1007 08:36:45.212533 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fcd471e2-f72d-4cef-af5c-b5624faaae95","Type":"ContainerStarted","Data":"bf300f47187f1d625c0fcce2082883114cad261aa9cc11730d93608e360b6a58"}
Oct 07 08:36:45 crc kubenswrapper[5025]: I1007 08:36:45.239148 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.239123719 podStartE2EDuration="2.239123719s" podCreationTimestamp="2025-10-07 08:36:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:36:45.232682936 +0000 UTC m=+1212.041997110" watchObservedRunningTime="2025-10-07 08:36:45.239123719 +0000 UTC m=+1212.048437903"
Oct 07 08:36:46 crc kubenswrapper[5025]: I1007 08:36:46.155594 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Oct 07 08:36:46 crc kubenswrapper[5025]: E1007 08:36:46.721947 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ac3a6ddf633bb321f14e05307ef692efffdb6eadfd68a3f22759a6e73282eca5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Oct 07 08:36:46 crc kubenswrapper[5025]: E1007 08:36:46.725741 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ac3a6ddf633bb321f14e05307ef692efffdb6eadfd68a3f22759a6e73282eca5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Oct 07 08:36:46 crc kubenswrapper[5025]: E1007 08:36:46.728441 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ac3a6ddf633bb321f14e05307ef692efffdb6eadfd68a3f22759a6e73282eca5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Oct 07 08:36:46 crc kubenswrapper[5025]: E1007 08:36:46.728531 5025 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="986412ee-2fb7-413a-a1d3-355bfc0ac0ec" containerName="nova-scheduler-scheduler"
Oct 07 08:36:47 crc kubenswrapper[5025]: I1007 08:36:47.612130 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 07 08:36:47 crc kubenswrapper[5025]: I1007 08:36:47.743431 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m65rg\" (UniqueName: \"kubernetes.io/projected/986412ee-2fb7-413a-a1d3-355bfc0ac0ec-kube-api-access-m65rg\") pod \"986412ee-2fb7-413a-a1d3-355bfc0ac0ec\" (UID: \"986412ee-2fb7-413a-a1d3-355bfc0ac0ec\") "
Oct 07 08:36:47 crc kubenswrapper[5025]: I1007 08:36:47.743554 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/986412ee-2fb7-413a-a1d3-355bfc0ac0ec-combined-ca-bundle\") pod \"986412ee-2fb7-413a-a1d3-355bfc0ac0ec\" (UID: \"986412ee-2fb7-413a-a1d3-355bfc0ac0ec\") "
Oct 07 08:36:47 crc kubenswrapper[5025]: I1007 08:36:47.743815 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/986412ee-2fb7-413a-a1d3-355bfc0ac0ec-config-data\") pod \"986412ee-2fb7-413a-a1d3-355bfc0ac0ec\" (UID: \"986412ee-2fb7-413a-a1d3-355bfc0ac0ec\") "
Oct 07 08:36:47 crc kubenswrapper[5025]: I1007 08:36:47.750170 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/986412ee-2fb7-413a-a1d3-355bfc0ac0ec-kube-api-access-m65rg" (OuterVolumeSpecName: "kube-api-access-m65rg") pod "986412ee-2fb7-413a-a1d3-355bfc0ac0ec" (UID: "986412ee-2fb7-413a-a1d3-355bfc0ac0ec"). InnerVolumeSpecName "kube-api-access-m65rg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 08:36:47 crc kubenswrapper[5025]: I1007 08:36:47.772593 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/986412ee-2fb7-413a-a1d3-355bfc0ac0ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "986412ee-2fb7-413a-a1d3-355bfc0ac0ec" (UID: "986412ee-2fb7-413a-a1d3-355bfc0ac0ec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 08:36:47 crc kubenswrapper[5025]: I1007 08:36:47.788741 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/986412ee-2fb7-413a-a1d3-355bfc0ac0ec-config-data" (OuterVolumeSpecName: "config-data") pod "986412ee-2fb7-413a-a1d3-355bfc0ac0ec" (UID: "986412ee-2fb7-413a-a1d3-355bfc0ac0ec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 08:36:47 crc kubenswrapper[5025]: I1007 08:36:47.845995 5025 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/986412ee-2fb7-413a-a1d3-355bfc0ac0ec-config-data\") on node \"crc\" DevicePath \"\""
Oct 07 08:36:47 crc kubenswrapper[5025]: I1007 08:36:47.846030 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m65rg\" (UniqueName: \"kubernetes.io/projected/986412ee-2fb7-413a-a1d3-355bfc0ac0ec-kube-api-access-m65rg\") on node \"crc\" DevicePath \"\""
Oct 07 08:36:47 crc kubenswrapper[5025]: I1007 08:36:47.846041 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/986412ee-2fb7-413a-a1d3-355bfc0ac0ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 07 08:36:48 crc kubenswrapper[5025]: I1007 08:36:48.239706 5025 generic.go:334] "Generic (PLEG): container finished" podID="411bbb25-18d9-4957-afff-2ebcd6866aeb" containerID="602618541b175c2a97eb8d75f28dbf882389261fee09f23050a909a8840e022e" exitCode=0
Oct 07 08:36:48 crc kubenswrapper[5025]: I1007 08:36:48.239786 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"411bbb25-18d9-4957-afff-2ebcd6866aeb","Type":"ContainerDied","Data":"602618541b175c2a97eb8d75f28dbf882389261fee09f23050a909a8840e022e"}
Oct 07 08:36:48 crc kubenswrapper[5025]: I1007 08:36:48.241795 5025 generic.go:334] "Generic (PLEG): container finished" podID="986412ee-2fb7-413a-a1d3-355bfc0ac0ec" containerID="ac3a6ddf633bb321f14e05307ef692efffdb6eadfd68a3f22759a6e73282eca5" exitCode=0
Oct 07 08:36:48 crc kubenswrapper[5025]: I1007 08:36:48.241831 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"986412ee-2fb7-413a-a1d3-355bfc0ac0ec","Type":"ContainerDied","Data":"ac3a6ddf633bb321f14e05307ef692efffdb6eadfd68a3f22759a6e73282eca5"}
Oct 07 08:36:48 crc kubenswrapper[5025]: I1007 08:36:48.241858 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"986412ee-2fb7-413a-a1d3-355bfc0ac0ec","Type":"ContainerDied","Data":"b1f7dfe70c14471dfb55c2b7281e5e81f78aa681e8dd6f5578d5639199e5fac0"}
Oct 07 08:36:48 crc kubenswrapper[5025]: I1007 08:36:48.241871 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 07 08:36:48 crc kubenswrapper[5025]: I1007 08:36:48.241881 5025 scope.go:117] "RemoveContainer" containerID="ac3a6ddf633bb321f14e05307ef692efffdb6eadfd68a3f22759a6e73282eca5"
Oct 07 08:36:48 crc kubenswrapper[5025]: I1007 08:36:48.285011 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 07 08:36:48 crc kubenswrapper[5025]: I1007 08:36:48.288726 5025 scope.go:117] "RemoveContainer" containerID="ac3a6ddf633bb321f14e05307ef692efffdb6eadfd68a3f22759a6e73282eca5"
Oct 07 08:36:48 crc kubenswrapper[5025]: E1007 08:36:48.289715 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac3a6ddf633bb321f14e05307ef692efffdb6eadfd68a3f22759a6e73282eca5\": container with ID starting with ac3a6ddf633bb321f14e05307ef692efffdb6eadfd68a3f22759a6e73282eca5 not found: ID does not exist" containerID="ac3a6ddf633bb321f14e05307ef692efffdb6eadfd68a3f22759a6e73282eca5"
Oct 07 08:36:48 crc kubenswrapper[5025]: I1007 08:36:48.289752 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac3a6ddf633bb321f14e05307ef692efffdb6eadfd68a3f22759a6e73282eca5"} err="failed to get container status \"ac3a6ddf633bb321f14e05307ef692efffdb6eadfd68a3f22759a6e73282eca5\": rpc error: code = NotFound desc = could not find container \"ac3a6ddf633bb321f14e05307ef692efffdb6eadfd68a3f22759a6e73282eca5\": container with ID starting with ac3a6ddf633bb321f14e05307ef692efffdb6eadfd68a3f22759a6e73282eca5 not
found: ID does not exist" Oct 07 08:36:48 crc kubenswrapper[5025]: I1007 08:36:48.326524 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 08:36:48 crc kubenswrapper[5025]: I1007 08:36:48.339143 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 08:36:48 crc kubenswrapper[5025]: E1007 08:36:48.339651 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="986412ee-2fb7-413a-a1d3-355bfc0ac0ec" containerName="nova-scheduler-scheduler" Oct 07 08:36:48 crc kubenswrapper[5025]: I1007 08:36:48.339668 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="986412ee-2fb7-413a-a1d3-355bfc0ac0ec" containerName="nova-scheduler-scheduler" Oct 07 08:36:48 crc kubenswrapper[5025]: I1007 08:36:48.339841 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="986412ee-2fb7-413a-a1d3-355bfc0ac0ec" containerName="nova-scheduler-scheduler" Oct 07 08:36:48 crc kubenswrapper[5025]: I1007 08:36:48.340489 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 08:36:48 crc kubenswrapper[5025]: I1007 08:36:48.342563 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 07 08:36:48 crc kubenswrapper[5025]: I1007 08:36:48.349191 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 08:36:48 crc kubenswrapper[5025]: I1007 08:36:48.392893 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 07 08:36:48 crc kubenswrapper[5025]: I1007 08:36:48.466350 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/039af421-8cdd-4a32-bccc-f58f90b64fad-config-data\") pod \"nova-scheduler-0\" (UID: \"039af421-8cdd-4a32-bccc-f58f90b64fad\") " pod="openstack/nova-scheduler-0" Oct 07 08:36:48 crc kubenswrapper[5025]: I1007 08:36:48.466409 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/039af421-8cdd-4a32-bccc-f58f90b64fad-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"039af421-8cdd-4a32-bccc-f58f90b64fad\") " pod="openstack/nova-scheduler-0" Oct 07 08:36:48 crc kubenswrapper[5025]: I1007 08:36:48.466450 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sln42\" (UniqueName: \"kubernetes.io/projected/039af421-8cdd-4a32-bccc-f58f90b64fad-kube-api-access-sln42\") pod \"nova-scheduler-0\" (UID: \"039af421-8cdd-4a32-bccc-f58f90b64fad\") " pod="openstack/nova-scheduler-0" Oct 07 08:36:48 crc kubenswrapper[5025]: I1007 08:36:48.568304 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/411bbb25-18d9-4957-afff-2ebcd6866aeb-config-data\") pod \"411bbb25-18d9-4957-afff-2ebcd6866aeb\" (UID: \"411bbb25-18d9-4957-afff-2ebcd6866aeb\") " Oct 07 08:36:48 crc kubenswrapper[5025]: I1007 08:36:48.568713 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzzmf\" (UniqueName: \"kubernetes.io/projected/411bbb25-18d9-4957-afff-2ebcd6866aeb-kube-api-access-hzzmf\") pod \"411bbb25-18d9-4957-afff-2ebcd6866aeb\" (UID: \"411bbb25-18d9-4957-afff-2ebcd6866aeb\") " Oct 07 08:36:48 crc kubenswrapper[5025]: I1007 08:36:48.568912 5025 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/411bbb25-18d9-4957-afff-2ebcd6866aeb-combined-ca-bundle\") pod \"411bbb25-18d9-4957-afff-2ebcd6866aeb\" (UID: \"411bbb25-18d9-4957-afff-2ebcd6866aeb\") " Oct 07 08:36:48 crc kubenswrapper[5025]: I1007 08:36:48.569099 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/411bbb25-18d9-4957-afff-2ebcd6866aeb-logs\") pod \"411bbb25-18d9-4957-afff-2ebcd6866aeb\" (UID: \"411bbb25-18d9-4957-afff-2ebcd6866aeb\") " Oct 07 08:36:48 crc kubenswrapper[5025]: I1007 08:36:48.569469 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/039af421-8cdd-4a32-bccc-f58f90b64fad-config-data\") pod \"nova-scheduler-0\" (UID: \"039af421-8cdd-4a32-bccc-f58f90b64fad\") " pod="openstack/nova-scheduler-0" Oct 07 08:36:48 crc kubenswrapper[5025]: I1007 08:36:48.569596 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/039af421-8cdd-4a32-bccc-f58f90b64fad-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"039af421-8cdd-4a32-bccc-f58f90b64fad\") " pod="openstack/nova-scheduler-0" Oct 07 08:36:48 crc kubenswrapper[5025]: I1007 08:36:48.569733 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sln42\" (UniqueName: \"kubernetes.io/projected/039af421-8cdd-4a32-bccc-f58f90b64fad-kube-api-access-sln42\") pod \"nova-scheduler-0\" (UID: \"039af421-8cdd-4a32-bccc-f58f90b64fad\") " pod="openstack/nova-scheduler-0" Oct 07 08:36:48 crc kubenswrapper[5025]: I1007 08:36:48.569466 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/411bbb25-18d9-4957-afff-2ebcd6866aeb-logs" (OuterVolumeSpecName: "logs") pod 
"411bbb25-18d9-4957-afff-2ebcd6866aeb" (UID: "411bbb25-18d9-4957-afff-2ebcd6866aeb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:36:48 crc kubenswrapper[5025]: I1007 08:36:48.570261 5025 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/411bbb25-18d9-4957-afff-2ebcd6866aeb-logs\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:48 crc kubenswrapper[5025]: I1007 08:36:48.573488 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/039af421-8cdd-4a32-bccc-f58f90b64fad-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"039af421-8cdd-4a32-bccc-f58f90b64fad\") " pod="openstack/nova-scheduler-0" Oct 07 08:36:48 crc kubenswrapper[5025]: I1007 08:36:48.573701 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/411bbb25-18d9-4957-afff-2ebcd6866aeb-kube-api-access-hzzmf" (OuterVolumeSpecName: "kube-api-access-hzzmf") pod "411bbb25-18d9-4957-afff-2ebcd6866aeb" (UID: "411bbb25-18d9-4957-afff-2ebcd6866aeb"). InnerVolumeSpecName "kube-api-access-hzzmf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:36:48 crc kubenswrapper[5025]: I1007 08:36:48.574986 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/039af421-8cdd-4a32-bccc-f58f90b64fad-config-data\") pod \"nova-scheduler-0\" (UID: \"039af421-8cdd-4a32-bccc-f58f90b64fad\") " pod="openstack/nova-scheduler-0" Oct 07 08:36:48 crc kubenswrapper[5025]: I1007 08:36:48.599416 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sln42\" (UniqueName: \"kubernetes.io/projected/039af421-8cdd-4a32-bccc-f58f90b64fad-kube-api-access-sln42\") pod \"nova-scheduler-0\" (UID: \"039af421-8cdd-4a32-bccc-f58f90b64fad\") " pod="openstack/nova-scheduler-0" Oct 07 08:36:48 crc kubenswrapper[5025]: I1007 08:36:48.599826 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/411bbb25-18d9-4957-afff-2ebcd6866aeb-config-data" (OuterVolumeSpecName: "config-data") pod "411bbb25-18d9-4957-afff-2ebcd6866aeb" (UID: "411bbb25-18d9-4957-afff-2ebcd6866aeb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:36:48 crc kubenswrapper[5025]: I1007 08:36:48.600022 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/411bbb25-18d9-4957-afff-2ebcd6866aeb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "411bbb25-18d9-4957-afff-2ebcd6866aeb" (UID: "411bbb25-18d9-4957-afff-2ebcd6866aeb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:36:48 crc kubenswrapper[5025]: I1007 08:36:48.672522 5025 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/411bbb25-18d9-4957-afff-2ebcd6866aeb-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:48 crc kubenswrapper[5025]: I1007 08:36:48.672827 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzzmf\" (UniqueName: \"kubernetes.io/projected/411bbb25-18d9-4957-afff-2ebcd6866aeb-kube-api-access-hzzmf\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:48 crc kubenswrapper[5025]: I1007 08:36:48.672842 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/411bbb25-18d9-4957-afff-2ebcd6866aeb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:48 crc kubenswrapper[5025]: I1007 08:36:48.708303 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 08:36:48 crc kubenswrapper[5025]: I1007 08:36:48.900792 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 07 08:36:48 crc kubenswrapper[5025]: I1007 08:36:48.901651 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 07 08:36:49 crc kubenswrapper[5025]: I1007 08:36:49.250518 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 08:36:49 crc kubenswrapper[5025]: I1007 08:36:49.252676 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"411bbb25-18d9-4957-afff-2ebcd6866aeb","Type":"ContainerDied","Data":"d2343f881e3de627c2f1f3c15ddcae177179a9aa6bca574588fbd20cc28a1fd3"} Oct 07 08:36:49 crc kubenswrapper[5025]: I1007 08:36:49.252725 5025 scope.go:117] "RemoveContainer" containerID="602618541b175c2a97eb8d75f28dbf882389261fee09f23050a909a8840e022e" Oct 07 
08:36:49 crc kubenswrapper[5025]: I1007 08:36:49.252847 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 08:36:49 crc kubenswrapper[5025]: W1007 08:36:49.262974 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod039af421_8cdd_4a32_bccc_f58f90b64fad.slice/crio-a0aff233a1acb54b1c008f8526b94eefa0e5ddd8b8d72844ee1f7d63f16db7f6 WatchSource:0}: Error finding container a0aff233a1acb54b1c008f8526b94eefa0e5ddd8b8d72844ee1f7d63f16db7f6: Status 404 returned error can't find the container with id a0aff233a1acb54b1c008f8526b94eefa0e5ddd8b8d72844ee1f7d63f16db7f6 Oct 07 08:36:49 crc kubenswrapper[5025]: I1007 08:36:49.287607 5025 scope.go:117] "RemoveContainer" containerID="29cbed0d4183e69c462b96688d81104df2f4003866f56cfcfbd6f94622bd80c1" Oct 07 08:36:49 crc kubenswrapper[5025]: I1007 08:36:49.463456 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 07 08:36:49 crc kubenswrapper[5025]: I1007 08:36:49.481426 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 07 08:36:49 crc kubenswrapper[5025]: I1007 08:36:49.495240 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 07 08:36:49 crc kubenswrapper[5025]: E1007 08:36:49.496433 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="411bbb25-18d9-4957-afff-2ebcd6866aeb" containerName="nova-api-log" Oct 07 08:36:49 crc kubenswrapper[5025]: I1007 08:36:49.496482 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="411bbb25-18d9-4957-afff-2ebcd6866aeb" containerName="nova-api-log" Oct 07 08:36:49 crc kubenswrapper[5025]: E1007 08:36:49.496510 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="411bbb25-18d9-4957-afff-2ebcd6866aeb" containerName="nova-api-api" Oct 07 08:36:49 crc kubenswrapper[5025]: I1007 08:36:49.496520 5025 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="411bbb25-18d9-4957-afff-2ebcd6866aeb" containerName="nova-api-api" Oct 07 08:36:49 crc kubenswrapper[5025]: I1007 08:36:49.496825 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="411bbb25-18d9-4957-afff-2ebcd6866aeb" containerName="nova-api-api" Oct 07 08:36:49 crc kubenswrapper[5025]: I1007 08:36:49.496878 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="411bbb25-18d9-4957-afff-2ebcd6866aeb" containerName="nova-api-log" Oct 07 08:36:49 crc kubenswrapper[5025]: I1007 08:36:49.498285 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 08:36:49 crc kubenswrapper[5025]: I1007 08:36:49.504052 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 07 08:36:49 crc kubenswrapper[5025]: I1007 08:36:49.508772 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 08:36:49 crc kubenswrapper[5025]: I1007 08:36:49.592890 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/622c3975-f5ce-4bdd-947f-dd79b2b66e57-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"622c3975-f5ce-4bdd-947f-dd79b2b66e57\") " pod="openstack/nova-api-0" Oct 07 08:36:49 crc kubenswrapper[5025]: I1007 08:36:49.593167 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/622c3975-f5ce-4bdd-947f-dd79b2b66e57-logs\") pod \"nova-api-0\" (UID: \"622c3975-f5ce-4bdd-947f-dd79b2b66e57\") " pod="openstack/nova-api-0" Oct 07 08:36:49 crc kubenswrapper[5025]: I1007 08:36:49.593218 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/622c3975-f5ce-4bdd-947f-dd79b2b66e57-config-data\") pod \"nova-api-0\" 
(UID: \"622c3975-f5ce-4bdd-947f-dd79b2b66e57\") " pod="openstack/nova-api-0" Oct 07 08:36:49 crc kubenswrapper[5025]: I1007 08:36:49.593250 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw6qg\" (UniqueName: \"kubernetes.io/projected/622c3975-f5ce-4bdd-947f-dd79b2b66e57-kube-api-access-mw6qg\") pod \"nova-api-0\" (UID: \"622c3975-f5ce-4bdd-947f-dd79b2b66e57\") " pod="openstack/nova-api-0" Oct 07 08:36:49 crc kubenswrapper[5025]: I1007 08:36:49.694697 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/622c3975-f5ce-4bdd-947f-dd79b2b66e57-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"622c3975-f5ce-4bdd-947f-dd79b2b66e57\") " pod="openstack/nova-api-0" Oct 07 08:36:49 crc kubenswrapper[5025]: I1007 08:36:49.694741 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/622c3975-f5ce-4bdd-947f-dd79b2b66e57-logs\") pod \"nova-api-0\" (UID: \"622c3975-f5ce-4bdd-947f-dd79b2b66e57\") " pod="openstack/nova-api-0" Oct 07 08:36:49 crc kubenswrapper[5025]: I1007 08:36:49.694799 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/622c3975-f5ce-4bdd-947f-dd79b2b66e57-config-data\") pod \"nova-api-0\" (UID: \"622c3975-f5ce-4bdd-947f-dd79b2b66e57\") " pod="openstack/nova-api-0" Oct 07 08:36:49 crc kubenswrapper[5025]: I1007 08:36:49.694837 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mw6qg\" (UniqueName: \"kubernetes.io/projected/622c3975-f5ce-4bdd-947f-dd79b2b66e57-kube-api-access-mw6qg\") pod \"nova-api-0\" (UID: \"622c3975-f5ce-4bdd-947f-dd79b2b66e57\") " pod="openstack/nova-api-0" Oct 07 08:36:49 crc kubenswrapper[5025]: I1007 08:36:49.695316 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/622c3975-f5ce-4bdd-947f-dd79b2b66e57-logs\") pod \"nova-api-0\" (UID: \"622c3975-f5ce-4bdd-947f-dd79b2b66e57\") " pod="openstack/nova-api-0" Oct 07 08:36:49 crc kubenswrapper[5025]: I1007 08:36:49.700226 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/622c3975-f5ce-4bdd-947f-dd79b2b66e57-config-data\") pod \"nova-api-0\" (UID: \"622c3975-f5ce-4bdd-947f-dd79b2b66e57\") " pod="openstack/nova-api-0" Oct 07 08:36:49 crc kubenswrapper[5025]: I1007 08:36:49.706145 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/622c3975-f5ce-4bdd-947f-dd79b2b66e57-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"622c3975-f5ce-4bdd-947f-dd79b2b66e57\") " pod="openstack/nova-api-0" Oct 07 08:36:49 crc kubenswrapper[5025]: I1007 08:36:49.716093 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mw6qg\" (UniqueName: \"kubernetes.io/projected/622c3975-f5ce-4bdd-947f-dd79b2b66e57-kube-api-access-mw6qg\") pod \"nova-api-0\" (UID: \"622c3975-f5ce-4bdd-947f-dd79b2b66e57\") " pod="openstack/nova-api-0" Oct 07 08:36:49 crc kubenswrapper[5025]: I1007 08:36:49.820297 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 07 08:36:49 crc kubenswrapper[5025]: I1007 08:36:49.904911 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 08:36:49 crc kubenswrapper[5025]: I1007 08:36:49.905149 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="77ea4bce-e477-4798-923e-ce17548441d6" containerName="kube-state-metrics" containerID="cri-o://a2079e2c81626912c745624ab9276c37b22464ca45f2a2017cb7b9516c2562e6" gracePeriod=30 Oct 07 08:36:49 crc kubenswrapper[5025]: I1007 08:36:49.927957 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="411bbb25-18d9-4957-afff-2ebcd6866aeb" path="/var/lib/kubelet/pods/411bbb25-18d9-4957-afff-2ebcd6866aeb/volumes" Oct 07 08:36:49 crc kubenswrapper[5025]: I1007 08:36:49.928869 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="986412ee-2fb7-413a-a1d3-355bfc0ac0ec" path="/var/lib/kubelet/pods/986412ee-2fb7-413a-a1d3-355bfc0ac0ec/volumes" Oct 07 08:36:50 crc kubenswrapper[5025]: I1007 08:36:50.273795 5025 generic.go:334] "Generic (PLEG): container finished" podID="77ea4bce-e477-4798-923e-ce17548441d6" containerID="a2079e2c81626912c745624ab9276c37b22464ca45f2a2017cb7b9516c2562e6" exitCode=2 Oct 07 08:36:50 crc kubenswrapper[5025]: I1007 08:36:50.273859 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"77ea4bce-e477-4798-923e-ce17548441d6","Type":"ContainerDied","Data":"a2079e2c81626912c745624ab9276c37b22464ca45f2a2017cb7b9516c2562e6"} Oct 07 08:36:50 crc kubenswrapper[5025]: I1007 08:36:50.279272 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"039af421-8cdd-4a32-bccc-f58f90b64fad","Type":"ContainerStarted","Data":"a338657f8f197d0f0f88660e3685ef3ce6e0e74f42231157db1d2b33a939958a"} Oct 07 08:36:50 crc kubenswrapper[5025]: I1007 08:36:50.279321 5025 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"039af421-8cdd-4a32-bccc-f58f90b64fad","Type":"ContainerStarted","Data":"a0aff233a1acb54b1c008f8526b94eefa0e5ddd8b8d72844ee1f7d63f16db7f6"} Oct 07 08:36:50 crc kubenswrapper[5025]: I1007 08:36:50.315613 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 08:36:50 crc kubenswrapper[5025]: I1007 08:36:50.325420 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.325401166 podStartE2EDuration="2.325401166s" podCreationTimestamp="2025-10-07 08:36:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:36:50.299446958 +0000 UTC m=+1217.108761102" watchObservedRunningTime="2025-10-07 08:36:50.325401166 +0000 UTC m=+1217.134715310" Oct 07 08:36:50 crc kubenswrapper[5025]: I1007 08:36:50.466916 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 07 08:36:50 crc kubenswrapper[5025]: I1007 08:36:50.612165 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nknls\" (UniqueName: \"kubernetes.io/projected/77ea4bce-e477-4798-923e-ce17548441d6-kube-api-access-nknls\") pod \"77ea4bce-e477-4798-923e-ce17548441d6\" (UID: \"77ea4bce-e477-4798-923e-ce17548441d6\") " Oct 07 08:36:50 crc kubenswrapper[5025]: I1007 08:36:50.616918 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77ea4bce-e477-4798-923e-ce17548441d6-kube-api-access-nknls" (OuterVolumeSpecName: "kube-api-access-nknls") pod "77ea4bce-e477-4798-923e-ce17548441d6" (UID: "77ea4bce-e477-4798-923e-ce17548441d6"). InnerVolumeSpecName "kube-api-access-nknls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:36:50 crc kubenswrapper[5025]: I1007 08:36:50.715616 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nknls\" (UniqueName: \"kubernetes.io/projected/77ea4bce-e477-4798-923e-ce17548441d6-kube-api-access-nknls\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:51 crc kubenswrapper[5025]: I1007 08:36:51.297139 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 07 08:36:51 crc kubenswrapper[5025]: I1007 08:36:51.297142 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"77ea4bce-e477-4798-923e-ce17548441d6","Type":"ContainerDied","Data":"d4bdc36363937919e754b21a3df0cb1680442793d62910b32e3e2b66f02b7441"} Oct 07 08:36:51 crc kubenswrapper[5025]: I1007 08:36:51.297595 5025 scope.go:117] "RemoveContainer" containerID="a2079e2c81626912c745624ab9276c37b22464ca45f2a2017cb7b9516c2562e6" Oct 07 08:36:51 crc kubenswrapper[5025]: I1007 08:36:51.304667 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"622c3975-f5ce-4bdd-947f-dd79b2b66e57","Type":"ContainerStarted","Data":"10b9f5700cc4e24590cffcbcde7cac0af1a3631605fcea78d34ab2f307f7a30d"} Oct 07 08:36:51 crc kubenswrapper[5025]: I1007 08:36:51.304718 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"622c3975-f5ce-4bdd-947f-dd79b2b66e57","Type":"ContainerStarted","Data":"9e3b82bfc7c4a84f47445e6f6f09d41612896aabe03608fa33cd4c6d834bc03a"} Oct 07 08:36:51 crc kubenswrapper[5025]: I1007 08:36:51.304734 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"622c3975-f5ce-4bdd-947f-dd79b2b66e57","Type":"ContainerStarted","Data":"7911e093a3abc7ec316831ff445f1b689ebcc359a0f42b777be92a9dba9eb46c"} Oct 07 08:36:51 crc kubenswrapper[5025]: I1007 08:36:51.330112 5025 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.330095809 podStartE2EDuration="2.330095809s" podCreationTimestamp="2025-10-07 08:36:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:36:51.32853346 +0000 UTC m=+1218.137847604" watchObservedRunningTime="2025-10-07 08:36:51.330095809 +0000 UTC m=+1218.139409953" Oct 07 08:36:51 crc kubenswrapper[5025]: I1007 08:36:51.350975 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 08:36:51 crc kubenswrapper[5025]: I1007 08:36:51.359805 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 08:36:51 crc kubenswrapper[5025]: I1007 08:36:51.369841 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 08:36:51 crc kubenswrapper[5025]: E1007 08:36:51.370537 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77ea4bce-e477-4798-923e-ce17548441d6" containerName="kube-state-metrics" Oct 07 08:36:51 crc kubenswrapper[5025]: I1007 08:36:51.370603 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="77ea4bce-e477-4798-923e-ce17548441d6" containerName="kube-state-metrics" Oct 07 08:36:51 crc kubenswrapper[5025]: I1007 08:36:51.371047 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="77ea4bce-e477-4798-923e-ce17548441d6" containerName="kube-state-metrics" Oct 07 08:36:51 crc kubenswrapper[5025]: I1007 08:36:51.372328 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 07 08:36:51 crc kubenswrapper[5025]: I1007 08:36:51.387273 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 07 08:36:51 crc kubenswrapper[5025]: I1007 08:36:51.388497 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 07 08:36:51 crc kubenswrapper[5025]: I1007 08:36:51.389909 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 08:36:51 crc kubenswrapper[5025]: I1007 08:36:51.428910 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skp6n\" (UniqueName: \"kubernetes.io/projected/732b0bbc-f02e-4ced-9230-af95c9654b93-kube-api-access-skp6n\") pod \"kube-state-metrics-0\" (UID: \"732b0bbc-f02e-4ced-9230-af95c9654b93\") " pod="openstack/kube-state-metrics-0" Oct 07 08:36:51 crc kubenswrapper[5025]: I1007 08:36:51.429143 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/732b0bbc-f02e-4ced-9230-af95c9654b93-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"732b0bbc-f02e-4ced-9230-af95c9654b93\") " pod="openstack/kube-state-metrics-0" Oct 07 08:36:51 crc kubenswrapper[5025]: I1007 08:36:51.429273 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/732b0bbc-f02e-4ced-9230-af95c9654b93-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"732b0bbc-f02e-4ced-9230-af95c9654b93\") " pod="openstack/kube-state-metrics-0" Oct 07 08:36:51 crc kubenswrapper[5025]: I1007 08:36:51.429319 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/732b0bbc-f02e-4ced-9230-af95c9654b93-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"732b0bbc-f02e-4ced-9230-af95c9654b93\") " pod="openstack/kube-state-metrics-0" Oct 07 08:36:51 crc kubenswrapper[5025]: I1007 08:36:51.531601 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/732b0bbc-f02e-4ced-9230-af95c9654b93-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"732b0bbc-f02e-4ced-9230-af95c9654b93\") " pod="openstack/kube-state-metrics-0" Oct 07 08:36:51 crc kubenswrapper[5025]: I1007 08:36:51.531711 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/732b0bbc-f02e-4ced-9230-af95c9654b93-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"732b0bbc-f02e-4ced-9230-af95c9654b93\") " pod="openstack/kube-state-metrics-0" Oct 07 08:36:51 crc kubenswrapper[5025]: I1007 08:36:51.531832 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skp6n\" (UniqueName: \"kubernetes.io/projected/732b0bbc-f02e-4ced-9230-af95c9654b93-kube-api-access-skp6n\") pod \"kube-state-metrics-0\" (UID: \"732b0bbc-f02e-4ced-9230-af95c9654b93\") " pod="openstack/kube-state-metrics-0" Oct 07 08:36:51 crc kubenswrapper[5025]: I1007 08:36:51.531914 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/732b0bbc-f02e-4ced-9230-af95c9654b93-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"732b0bbc-f02e-4ced-9230-af95c9654b93\") " pod="openstack/kube-state-metrics-0" Oct 07 08:36:51 crc kubenswrapper[5025]: I1007 08:36:51.535158 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/732b0bbc-f02e-4ced-9230-af95c9654b93-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"732b0bbc-f02e-4ced-9230-af95c9654b93\") " pod="openstack/kube-state-metrics-0" Oct 07 08:36:51 crc kubenswrapper[5025]: I1007 08:36:51.535815 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/732b0bbc-f02e-4ced-9230-af95c9654b93-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"732b0bbc-f02e-4ced-9230-af95c9654b93\") " pod="openstack/kube-state-metrics-0" Oct 07 08:36:51 crc kubenswrapper[5025]: I1007 08:36:51.536417 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/732b0bbc-f02e-4ced-9230-af95c9654b93-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"732b0bbc-f02e-4ced-9230-af95c9654b93\") " pod="openstack/kube-state-metrics-0" Oct 07 08:36:51 crc kubenswrapper[5025]: I1007 08:36:51.570081 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skp6n\" (UniqueName: \"kubernetes.io/projected/732b0bbc-f02e-4ced-9230-af95c9654b93-kube-api-access-skp6n\") pod \"kube-state-metrics-0\" (UID: \"732b0bbc-f02e-4ced-9230-af95c9654b93\") " pod="openstack/kube-state-metrics-0" Oct 07 08:36:51 crc kubenswrapper[5025]: I1007 08:36:51.708802 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 07 08:36:51 crc kubenswrapper[5025]: I1007 08:36:51.932316 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77ea4bce-e477-4798-923e-ce17548441d6" path="/var/lib/kubelet/pods/77ea4bce-e477-4798-923e-ce17548441d6/volumes" Oct 07 08:36:51 crc kubenswrapper[5025]: I1007 08:36:51.993937 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 08:36:51 crc kubenswrapper[5025]: I1007 08:36:51.994348 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6b83b7d6-5d79-41e1-9c95-555d7bbef815" containerName="ceilometer-central-agent" containerID="cri-o://cc237635d00ad9e0a790b76dd95c965f8e0d885520cc51472c85d97b6c821f6b" gracePeriod=30 Oct 07 08:36:51 crc kubenswrapper[5025]: I1007 08:36:51.994430 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6b83b7d6-5d79-41e1-9c95-555d7bbef815" containerName="proxy-httpd" containerID="cri-o://000033ec05e66e7adf5b84812f5d428fb36ae5b59275f223f7b9bebc0182fb3a" gracePeriod=30 Oct 07 08:36:51 crc kubenswrapper[5025]: I1007 08:36:51.994510 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6b83b7d6-5d79-41e1-9c95-555d7bbef815" containerName="ceilometer-notification-agent" containerID="cri-o://35d788691eaf6fda609dd639420adfaf4c89b7b6bff7fc549bcfaf92048a81ba" gracePeriod=30 Oct 07 08:36:51 crc kubenswrapper[5025]: I1007 08:36:51.994490 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6b83b7d6-5d79-41e1-9c95-555d7bbef815" containerName="sg-core" containerID="cri-o://ab1dd94d7eba0c12be53b0445a4a5179105e5685deead257e4e717e9f31175d0" gracePeriod=30 Oct 07 08:36:52 crc kubenswrapper[5025]: I1007 08:36:52.205181 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/kube-state-metrics-0"] Oct 07 08:36:52 crc kubenswrapper[5025]: W1007 08:36:52.207655 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod732b0bbc_f02e_4ced_9230_af95c9654b93.slice/crio-0cf91794b54f425de7d8916e93b107a79f49429b97a38621de3fc575a86d9fb9 WatchSource:0}: Error finding container 0cf91794b54f425de7d8916e93b107a79f49429b97a38621de3fc575a86d9fb9: Status 404 returned error can't find the container with id 0cf91794b54f425de7d8916e93b107a79f49429b97a38621de3fc575a86d9fb9 Oct 07 08:36:52 crc kubenswrapper[5025]: I1007 08:36:52.319676 5025 generic.go:334] "Generic (PLEG): container finished" podID="6b83b7d6-5d79-41e1-9c95-555d7bbef815" containerID="000033ec05e66e7adf5b84812f5d428fb36ae5b59275f223f7b9bebc0182fb3a" exitCode=0 Oct 07 08:36:52 crc kubenswrapper[5025]: I1007 08:36:52.319722 5025 generic.go:334] "Generic (PLEG): container finished" podID="6b83b7d6-5d79-41e1-9c95-555d7bbef815" containerID="ab1dd94d7eba0c12be53b0445a4a5179105e5685deead257e4e717e9f31175d0" exitCode=2 Oct 07 08:36:52 crc kubenswrapper[5025]: I1007 08:36:52.319779 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b83b7d6-5d79-41e1-9c95-555d7bbef815","Type":"ContainerDied","Data":"000033ec05e66e7adf5b84812f5d428fb36ae5b59275f223f7b9bebc0182fb3a"} Oct 07 08:36:52 crc kubenswrapper[5025]: I1007 08:36:52.319816 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b83b7d6-5d79-41e1-9c95-555d7bbef815","Type":"ContainerDied","Data":"ab1dd94d7eba0c12be53b0445a4a5179105e5685deead257e4e717e9f31175d0"} Oct 07 08:36:52 crc kubenswrapper[5025]: I1007 08:36:52.324170 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"732b0bbc-f02e-4ced-9230-af95c9654b93","Type":"ContainerStarted","Data":"0cf91794b54f425de7d8916e93b107a79f49429b97a38621de3fc575a86d9fb9"} Oct 07 
08:36:52 crc kubenswrapper[5025]: I1007 08:36:52.655799 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 07 08:36:53 crc kubenswrapper[5025]: I1007 08:36:53.333333 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"732b0bbc-f02e-4ced-9230-af95c9654b93","Type":"ContainerStarted","Data":"48e5ab1102a8da55abb1bed760339663a999e777637527039fd43637c874b375"} Oct 07 08:36:53 crc kubenswrapper[5025]: I1007 08:36:53.333866 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 07 08:36:53 crc kubenswrapper[5025]: I1007 08:36:53.337416 5025 generic.go:334] "Generic (PLEG): container finished" podID="6b83b7d6-5d79-41e1-9c95-555d7bbef815" containerID="cc237635d00ad9e0a790b76dd95c965f8e0d885520cc51472c85d97b6c821f6b" exitCode=0 Oct 07 08:36:53 crc kubenswrapper[5025]: I1007 08:36:53.337460 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b83b7d6-5d79-41e1-9c95-555d7bbef815","Type":"ContainerDied","Data":"cc237635d00ad9e0a790b76dd95c965f8e0d885520cc51472c85d97b6c821f6b"} Oct 07 08:36:53 crc kubenswrapper[5025]: I1007 08:36:53.355217 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.953493478 podStartE2EDuration="2.355197123s" podCreationTimestamp="2025-10-07 08:36:51 +0000 UTC" firstStartedPulling="2025-10-07 08:36:52.210935652 +0000 UTC m=+1219.020249796" lastFinishedPulling="2025-10-07 08:36:52.612639297 +0000 UTC m=+1219.421953441" observedRunningTime="2025-10-07 08:36:53.350834935 +0000 UTC m=+1220.160149099" watchObservedRunningTime="2025-10-07 08:36:53.355197123 +0000 UTC m=+1220.164511277" Oct 07 08:36:53 crc kubenswrapper[5025]: I1007 08:36:53.708884 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 07 08:36:53 crc 
kubenswrapper[5025]: I1007 08:36:53.901357 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 07 08:36:53 crc kubenswrapper[5025]: I1007 08:36:53.901416 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 07 08:36:54 crc kubenswrapper[5025]: I1007 08:36:54.356648 5025 generic.go:334] "Generic (PLEG): container finished" podID="6b83b7d6-5d79-41e1-9c95-555d7bbef815" containerID="35d788691eaf6fda609dd639420adfaf4c89b7b6bff7fc549bcfaf92048a81ba" exitCode=0 Oct 07 08:36:54 crc kubenswrapper[5025]: I1007 08:36:54.356712 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b83b7d6-5d79-41e1-9c95-555d7bbef815","Type":"ContainerDied","Data":"35d788691eaf6fda609dd639420adfaf4c89b7b6bff7fc549bcfaf92048a81ba"} Oct 07 08:36:54 crc kubenswrapper[5025]: I1007 08:36:54.605507 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 08:36:54 crc kubenswrapper[5025]: I1007 08:36:54.690730 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b83b7d6-5d79-41e1-9c95-555d7bbef815-log-httpd\") pod \"6b83b7d6-5d79-41e1-9c95-555d7bbef815\" (UID: \"6b83b7d6-5d79-41e1-9c95-555d7bbef815\") " Oct 07 08:36:54 crc kubenswrapper[5025]: I1007 08:36:54.690864 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b83b7d6-5d79-41e1-9c95-555d7bbef815-scripts\") pod \"6b83b7d6-5d79-41e1-9c95-555d7bbef815\" (UID: \"6b83b7d6-5d79-41e1-9c95-555d7bbef815\") " Oct 07 08:36:54 crc kubenswrapper[5025]: I1007 08:36:54.690895 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b83b7d6-5d79-41e1-9c95-555d7bbef815-run-httpd\") pod 
\"6b83b7d6-5d79-41e1-9c95-555d7bbef815\" (UID: \"6b83b7d6-5d79-41e1-9c95-555d7bbef815\") " Oct 07 08:36:54 crc kubenswrapper[5025]: I1007 08:36:54.690950 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvbh4\" (UniqueName: \"kubernetes.io/projected/6b83b7d6-5d79-41e1-9c95-555d7bbef815-kube-api-access-bvbh4\") pod \"6b83b7d6-5d79-41e1-9c95-555d7bbef815\" (UID: \"6b83b7d6-5d79-41e1-9c95-555d7bbef815\") " Oct 07 08:36:54 crc kubenswrapper[5025]: I1007 08:36:54.691064 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6b83b7d6-5d79-41e1-9c95-555d7bbef815-sg-core-conf-yaml\") pod \"6b83b7d6-5d79-41e1-9c95-555d7bbef815\" (UID: \"6b83b7d6-5d79-41e1-9c95-555d7bbef815\") " Oct 07 08:36:54 crc kubenswrapper[5025]: I1007 08:36:54.691124 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b83b7d6-5d79-41e1-9c95-555d7bbef815-combined-ca-bundle\") pod \"6b83b7d6-5d79-41e1-9c95-555d7bbef815\" (UID: \"6b83b7d6-5d79-41e1-9c95-555d7bbef815\") " Oct 07 08:36:54 crc kubenswrapper[5025]: I1007 08:36:54.691190 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b83b7d6-5d79-41e1-9c95-555d7bbef815-config-data\") pod \"6b83b7d6-5d79-41e1-9c95-555d7bbef815\" (UID: \"6b83b7d6-5d79-41e1-9c95-555d7bbef815\") " Oct 07 08:36:54 crc kubenswrapper[5025]: I1007 08:36:54.691463 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b83b7d6-5d79-41e1-9c95-555d7bbef815-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6b83b7d6-5d79-41e1-9c95-555d7bbef815" (UID: "6b83b7d6-5d79-41e1-9c95-555d7bbef815"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:36:54 crc kubenswrapper[5025]: I1007 08:36:54.691736 5025 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b83b7d6-5d79-41e1-9c95-555d7bbef815-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:54 crc kubenswrapper[5025]: I1007 08:36:54.692526 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b83b7d6-5d79-41e1-9c95-555d7bbef815-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6b83b7d6-5d79-41e1-9c95-555d7bbef815" (UID: "6b83b7d6-5d79-41e1-9c95-555d7bbef815"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:36:54 crc kubenswrapper[5025]: I1007 08:36:54.726667 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b83b7d6-5d79-41e1-9c95-555d7bbef815-scripts" (OuterVolumeSpecName: "scripts") pod "6b83b7d6-5d79-41e1-9c95-555d7bbef815" (UID: "6b83b7d6-5d79-41e1-9c95-555d7bbef815"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:36:54 crc kubenswrapper[5025]: I1007 08:36:54.726738 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b83b7d6-5d79-41e1-9c95-555d7bbef815-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6b83b7d6-5d79-41e1-9c95-555d7bbef815" (UID: "6b83b7d6-5d79-41e1-9c95-555d7bbef815"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:36:54 crc kubenswrapper[5025]: I1007 08:36:54.753582 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b83b7d6-5d79-41e1-9c95-555d7bbef815-kube-api-access-bvbh4" (OuterVolumeSpecName: "kube-api-access-bvbh4") pod "6b83b7d6-5d79-41e1-9c95-555d7bbef815" (UID: "6b83b7d6-5d79-41e1-9c95-555d7bbef815"). 
InnerVolumeSpecName "kube-api-access-bvbh4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:36:54 crc kubenswrapper[5025]: I1007 08:36:54.818707 5025 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b83b7d6-5d79-41e1-9c95-555d7bbef815-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:54 crc kubenswrapper[5025]: I1007 08:36:54.818784 5025 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b83b7d6-5d79-41e1-9c95-555d7bbef815-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:54 crc kubenswrapper[5025]: I1007 08:36:54.818796 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvbh4\" (UniqueName: \"kubernetes.io/projected/6b83b7d6-5d79-41e1-9c95-555d7bbef815-kube-api-access-bvbh4\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:54 crc kubenswrapper[5025]: I1007 08:36:54.818807 5025 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6b83b7d6-5d79-41e1-9c95-555d7bbef815-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:54 crc kubenswrapper[5025]: I1007 08:36:54.841025 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b83b7d6-5d79-41e1-9c95-555d7bbef815-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6b83b7d6-5d79-41e1-9c95-555d7bbef815" (UID: "6b83b7d6-5d79-41e1-9c95-555d7bbef815"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:36:54 crc kubenswrapper[5025]: I1007 08:36:54.856985 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b83b7d6-5d79-41e1-9c95-555d7bbef815-config-data" (OuterVolumeSpecName: "config-data") pod "6b83b7d6-5d79-41e1-9c95-555d7bbef815" (UID: "6b83b7d6-5d79-41e1-9c95-555d7bbef815"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:36:54 crc kubenswrapper[5025]: I1007 08:36:54.913756 5025 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="fcd471e2-f72d-4cef-af5c-b5624faaae95" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 07 08:36:54 crc kubenswrapper[5025]: I1007 08:36:54.913895 5025 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="fcd471e2-f72d-4cef-af5c-b5624faaae95" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 07 08:36:54 crc kubenswrapper[5025]: I1007 08:36:54.920196 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b83b7d6-5d79-41e1-9c95-555d7bbef815-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:54 crc kubenswrapper[5025]: I1007 08:36:54.920234 5025 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b83b7d6-5d79-41e1-9c95-555d7bbef815-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 08:36:55 crc kubenswrapper[5025]: I1007 08:36:55.204688 5025 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="77ea4bce-e477-4798-923e-ce17548441d6" containerName="kube-state-metrics" probeResult="failure" output="Get \"http://10.217.0.108:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 08:36:55 crc kubenswrapper[5025]: I1007 08:36:55.368307 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"6b83b7d6-5d79-41e1-9c95-555d7bbef815","Type":"ContainerDied","Data":"916e7c9ca564ec26075792693629c0d25a9ff3146f828656d4b2834f0099be0b"} Oct 07 08:36:55 crc kubenswrapper[5025]: I1007 08:36:55.368357 5025 scope.go:117] "RemoveContainer" containerID="000033ec05e66e7adf5b84812f5d428fb36ae5b59275f223f7b9bebc0182fb3a" Oct 07 08:36:55 crc kubenswrapper[5025]: I1007 08:36:55.368389 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 08:36:55 crc kubenswrapper[5025]: I1007 08:36:55.392133 5025 scope.go:117] "RemoveContainer" containerID="ab1dd94d7eba0c12be53b0445a4a5179105e5685deead257e4e717e9f31175d0" Oct 07 08:36:55 crc kubenswrapper[5025]: I1007 08:36:55.403165 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 08:36:55 crc kubenswrapper[5025]: I1007 08:36:55.418074 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 07 08:36:55 crc kubenswrapper[5025]: I1007 08:36:55.423367 5025 scope.go:117] "RemoveContainer" containerID="35d788691eaf6fda609dd639420adfaf4c89b7b6bff7fc549bcfaf92048a81ba" Oct 07 08:36:55 crc kubenswrapper[5025]: I1007 08:36:55.441968 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 07 08:36:55 crc kubenswrapper[5025]: E1007 08:36:55.443052 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b83b7d6-5d79-41e1-9c95-555d7bbef815" containerName="ceilometer-notification-agent" Oct 07 08:36:55 crc kubenswrapper[5025]: I1007 08:36:55.443072 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b83b7d6-5d79-41e1-9c95-555d7bbef815" containerName="ceilometer-notification-agent" Oct 07 08:36:55 crc kubenswrapper[5025]: E1007 08:36:55.443096 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b83b7d6-5d79-41e1-9c95-555d7bbef815" containerName="proxy-httpd" Oct 07 08:36:55 crc kubenswrapper[5025]: I1007 08:36:55.443102 5025 
state_mem.go:107] "Deleted CPUSet assignment" podUID="6b83b7d6-5d79-41e1-9c95-555d7bbef815" containerName="proxy-httpd" Oct 07 08:36:55 crc kubenswrapper[5025]: E1007 08:36:55.443116 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b83b7d6-5d79-41e1-9c95-555d7bbef815" containerName="sg-core" Oct 07 08:36:55 crc kubenswrapper[5025]: I1007 08:36:55.443122 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b83b7d6-5d79-41e1-9c95-555d7bbef815" containerName="sg-core" Oct 07 08:36:55 crc kubenswrapper[5025]: E1007 08:36:55.443146 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b83b7d6-5d79-41e1-9c95-555d7bbef815" containerName="ceilometer-central-agent" Oct 07 08:36:55 crc kubenswrapper[5025]: I1007 08:36:55.443152 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b83b7d6-5d79-41e1-9c95-555d7bbef815" containerName="ceilometer-central-agent" Oct 07 08:36:55 crc kubenswrapper[5025]: I1007 08:36:55.443327 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b83b7d6-5d79-41e1-9c95-555d7bbef815" containerName="sg-core" Oct 07 08:36:55 crc kubenswrapper[5025]: I1007 08:36:55.443340 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b83b7d6-5d79-41e1-9c95-555d7bbef815" containerName="ceilometer-central-agent" Oct 07 08:36:55 crc kubenswrapper[5025]: I1007 08:36:55.443354 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b83b7d6-5d79-41e1-9c95-555d7bbef815" containerName="proxy-httpd" Oct 07 08:36:55 crc kubenswrapper[5025]: I1007 08:36:55.443364 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b83b7d6-5d79-41e1-9c95-555d7bbef815" containerName="ceilometer-notification-agent" Oct 07 08:36:55 crc kubenswrapper[5025]: I1007 08:36:55.445155 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 08:36:55 crc kubenswrapper[5025]: I1007 08:36:55.449858 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 08:36:55 crc kubenswrapper[5025]: I1007 08:36:55.455770 5025 scope.go:117] "RemoveContainer" containerID="cc237635d00ad9e0a790b76dd95c965f8e0d885520cc51472c85d97b6c821f6b" Oct 07 08:36:55 crc kubenswrapper[5025]: I1007 08:36:55.456197 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 07 08:36:55 crc kubenswrapper[5025]: I1007 08:36:55.456387 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 07 08:36:55 crc kubenswrapper[5025]: I1007 08:36:55.480475 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 07 08:36:55 crc kubenswrapper[5025]: I1007 08:36:55.528740 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/71b56f8c-4382-4fe6-9674-5b73d51c53ee-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"71b56f8c-4382-4fe6-9674-5b73d51c53ee\") " pod="openstack/ceilometer-0" Oct 07 08:36:55 crc kubenswrapper[5025]: I1007 08:36:55.528778 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/71b56f8c-4382-4fe6-9674-5b73d51c53ee-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"71b56f8c-4382-4fe6-9674-5b73d51c53ee\") " pod="openstack/ceilometer-0" Oct 07 08:36:55 crc kubenswrapper[5025]: I1007 08:36:55.528830 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71b56f8c-4382-4fe6-9674-5b73d51c53ee-log-httpd\") pod \"ceilometer-0\" (UID: \"71b56f8c-4382-4fe6-9674-5b73d51c53ee\") " 
pod="openstack/ceilometer-0" Oct 07 08:36:55 crc kubenswrapper[5025]: I1007 08:36:55.528858 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71b56f8c-4382-4fe6-9674-5b73d51c53ee-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"71b56f8c-4382-4fe6-9674-5b73d51c53ee\") " pod="openstack/ceilometer-0" Oct 07 08:36:55 crc kubenswrapper[5025]: I1007 08:36:55.528899 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71b56f8c-4382-4fe6-9674-5b73d51c53ee-run-httpd\") pod \"ceilometer-0\" (UID: \"71b56f8c-4382-4fe6-9674-5b73d51c53ee\") " pod="openstack/ceilometer-0" Oct 07 08:36:55 crc kubenswrapper[5025]: I1007 08:36:55.528930 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71b56f8c-4382-4fe6-9674-5b73d51c53ee-scripts\") pod \"ceilometer-0\" (UID: \"71b56f8c-4382-4fe6-9674-5b73d51c53ee\") " pod="openstack/ceilometer-0" Oct 07 08:36:55 crc kubenswrapper[5025]: I1007 08:36:55.528955 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71b56f8c-4382-4fe6-9674-5b73d51c53ee-config-data\") pod \"ceilometer-0\" (UID: \"71b56f8c-4382-4fe6-9674-5b73d51c53ee\") " pod="openstack/ceilometer-0" Oct 07 08:36:55 crc kubenswrapper[5025]: I1007 08:36:55.528993 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmfps\" (UniqueName: \"kubernetes.io/projected/71b56f8c-4382-4fe6-9674-5b73d51c53ee-kube-api-access-kmfps\") pod \"ceilometer-0\" (UID: \"71b56f8c-4382-4fe6-9674-5b73d51c53ee\") " pod="openstack/ceilometer-0" Oct 07 08:36:55 crc kubenswrapper[5025]: I1007 08:36:55.630255 5025 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71b56f8c-4382-4fe6-9674-5b73d51c53ee-scripts\") pod \"ceilometer-0\" (UID: \"71b56f8c-4382-4fe6-9674-5b73d51c53ee\") " pod="openstack/ceilometer-0" Oct 07 08:36:55 crc kubenswrapper[5025]: I1007 08:36:55.630312 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71b56f8c-4382-4fe6-9674-5b73d51c53ee-config-data\") pod \"ceilometer-0\" (UID: \"71b56f8c-4382-4fe6-9674-5b73d51c53ee\") " pod="openstack/ceilometer-0" Oct 07 08:36:55 crc kubenswrapper[5025]: I1007 08:36:55.630360 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmfps\" (UniqueName: \"kubernetes.io/projected/71b56f8c-4382-4fe6-9674-5b73d51c53ee-kube-api-access-kmfps\") pod \"ceilometer-0\" (UID: \"71b56f8c-4382-4fe6-9674-5b73d51c53ee\") " pod="openstack/ceilometer-0" Oct 07 08:36:55 crc kubenswrapper[5025]: I1007 08:36:55.630392 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/71b56f8c-4382-4fe6-9674-5b73d51c53ee-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"71b56f8c-4382-4fe6-9674-5b73d51c53ee\") " pod="openstack/ceilometer-0" Oct 07 08:36:55 crc kubenswrapper[5025]: I1007 08:36:55.630405 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/71b56f8c-4382-4fe6-9674-5b73d51c53ee-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"71b56f8c-4382-4fe6-9674-5b73d51c53ee\") " pod="openstack/ceilometer-0" Oct 07 08:36:55 crc kubenswrapper[5025]: I1007 08:36:55.630448 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71b56f8c-4382-4fe6-9674-5b73d51c53ee-log-httpd\") pod \"ceilometer-0\" (UID: 
\"71b56f8c-4382-4fe6-9674-5b73d51c53ee\") " pod="openstack/ceilometer-0" Oct 07 08:36:55 crc kubenswrapper[5025]: I1007 08:36:55.630475 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71b56f8c-4382-4fe6-9674-5b73d51c53ee-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"71b56f8c-4382-4fe6-9674-5b73d51c53ee\") " pod="openstack/ceilometer-0" Oct 07 08:36:55 crc kubenswrapper[5025]: I1007 08:36:55.630516 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71b56f8c-4382-4fe6-9674-5b73d51c53ee-run-httpd\") pod \"ceilometer-0\" (UID: \"71b56f8c-4382-4fe6-9674-5b73d51c53ee\") " pod="openstack/ceilometer-0" Oct 07 08:36:55 crc kubenswrapper[5025]: I1007 08:36:55.630915 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71b56f8c-4382-4fe6-9674-5b73d51c53ee-run-httpd\") pod \"ceilometer-0\" (UID: \"71b56f8c-4382-4fe6-9674-5b73d51c53ee\") " pod="openstack/ceilometer-0" Oct 07 08:36:55 crc kubenswrapper[5025]: I1007 08:36:55.632078 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71b56f8c-4382-4fe6-9674-5b73d51c53ee-log-httpd\") pod \"ceilometer-0\" (UID: \"71b56f8c-4382-4fe6-9674-5b73d51c53ee\") " pod="openstack/ceilometer-0" Oct 07 08:36:55 crc kubenswrapper[5025]: I1007 08:36:55.635721 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/71b56f8c-4382-4fe6-9674-5b73d51c53ee-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"71b56f8c-4382-4fe6-9674-5b73d51c53ee\") " pod="openstack/ceilometer-0" Oct 07 08:36:55 crc kubenswrapper[5025]: I1007 08:36:55.635775 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/71b56f8c-4382-4fe6-9674-5b73d51c53ee-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"71b56f8c-4382-4fe6-9674-5b73d51c53ee\") " pod="openstack/ceilometer-0" Oct 07 08:36:55 crc kubenswrapper[5025]: I1007 08:36:55.635959 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71b56f8c-4382-4fe6-9674-5b73d51c53ee-scripts\") pod \"ceilometer-0\" (UID: \"71b56f8c-4382-4fe6-9674-5b73d51c53ee\") " pod="openstack/ceilometer-0" Oct 07 08:36:55 crc kubenswrapper[5025]: I1007 08:36:55.637095 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71b56f8c-4382-4fe6-9674-5b73d51c53ee-config-data\") pod \"ceilometer-0\" (UID: \"71b56f8c-4382-4fe6-9674-5b73d51c53ee\") " pod="openstack/ceilometer-0" Oct 07 08:36:55 crc kubenswrapper[5025]: I1007 08:36:55.637110 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/71b56f8c-4382-4fe6-9674-5b73d51c53ee-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"71b56f8c-4382-4fe6-9674-5b73d51c53ee\") " pod="openstack/ceilometer-0" Oct 07 08:36:55 crc kubenswrapper[5025]: I1007 08:36:55.650503 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmfps\" (UniqueName: \"kubernetes.io/projected/71b56f8c-4382-4fe6-9674-5b73d51c53ee-kube-api-access-kmfps\") pod \"ceilometer-0\" (UID: \"71b56f8c-4382-4fe6-9674-5b73d51c53ee\") " pod="openstack/ceilometer-0" Oct 07 08:36:55 crc kubenswrapper[5025]: I1007 08:36:55.808118 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 08:36:55 crc kubenswrapper[5025]: I1007 08:36:55.930826 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b83b7d6-5d79-41e1-9c95-555d7bbef815" path="/var/lib/kubelet/pods/6b83b7d6-5d79-41e1-9c95-555d7bbef815/volumes" Oct 07 08:36:55 crc kubenswrapper[5025]: I1007 08:36:55.934251 5025 patch_prober.go:28] interesting pod/machine-config-daemon-2dj2t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 08:36:55 crc kubenswrapper[5025]: I1007 08:36:55.934299 5025 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 08:36:56 crc kubenswrapper[5025]: I1007 08:36:56.275185 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 08:36:56 crc kubenswrapper[5025]: I1007 08:36:56.380315 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71b56f8c-4382-4fe6-9674-5b73d51c53ee","Type":"ContainerStarted","Data":"14b30d6c609e55edb2d6fb8fd95edd0b090fdf5bf6d0b1d88416230cb8e9777a"} Oct 07 08:36:57 crc kubenswrapper[5025]: I1007 08:36:57.392464 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71b56f8c-4382-4fe6-9674-5b73d51c53ee","Type":"ContainerStarted","Data":"a846e9d2f660068003faa53158852392cfc3a09bc5e2cd7c2199f5a066dd4d70"} Oct 07 08:36:58 crc kubenswrapper[5025]: I1007 08:36:58.406951 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"71b56f8c-4382-4fe6-9674-5b73d51c53ee","Type":"ContainerStarted","Data":"6d3228ad66b66184fca6eeaac74341159b5046605cc2dea9815184ae8b9ee0ac"} Oct 07 08:36:58 crc kubenswrapper[5025]: I1007 08:36:58.709373 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 07 08:36:58 crc kubenswrapper[5025]: I1007 08:36:58.739991 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 07 08:36:59 crc kubenswrapper[5025]: I1007 08:36:59.420594 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71b56f8c-4382-4fe6-9674-5b73d51c53ee","Type":"ContainerStarted","Data":"a70329cb4a559292435c124ba397894b1132f5e77d37831af09193acb0c17a50"} Oct 07 08:36:59 crc kubenswrapper[5025]: I1007 08:36:59.458395 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 07 08:36:59 crc kubenswrapper[5025]: I1007 08:36:59.821586 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 07 08:36:59 crc kubenswrapper[5025]: I1007 08:36:59.821626 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 07 08:37:00 crc kubenswrapper[5025]: I1007 08:37:00.905800 5025 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="622c3975-f5ce-4bdd-947f-dd79b2b66e57" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.194:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 08:37:00 crc kubenswrapper[5025]: I1007 08:37:00.905849 5025 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="622c3975-f5ce-4bdd-947f-dd79b2b66e57" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.194:8774/\": context deadline exceeded (Client.Timeout exceeded while 
awaiting headers)" Oct 07 08:37:01 crc kubenswrapper[5025]: I1007 08:37:01.447963 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71b56f8c-4382-4fe6-9674-5b73d51c53ee","Type":"ContainerStarted","Data":"b83d1bf9130aa931ba528b73d1796e4102274982de0df60d07dcadf08176fad7"} Oct 07 08:37:01 crc kubenswrapper[5025]: I1007 08:37:01.448478 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 07 08:37:01 crc kubenswrapper[5025]: I1007 08:37:01.469421 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.482158285 podStartE2EDuration="6.469401438s" podCreationTimestamp="2025-10-07 08:36:55 +0000 UTC" firstStartedPulling="2025-10-07 08:36:56.292432002 +0000 UTC m=+1223.101746136" lastFinishedPulling="2025-10-07 08:37:00.279675145 +0000 UTC m=+1227.088989289" observedRunningTime="2025-10-07 08:37:01.465296659 +0000 UTC m=+1228.274610803" watchObservedRunningTime="2025-10-07 08:37:01.469401438 +0000 UTC m=+1228.278715582" Oct 07 08:37:01 crc kubenswrapper[5025]: I1007 08:37:01.732975 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 07 08:37:03 crc kubenswrapper[5025]: I1007 08:37:03.907133 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 07 08:37:03 crc kubenswrapper[5025]: I1007 08:37:03.941401 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 07 08:37:03 crc kubenswrapper[5025]: I1007 08:37:03.941588 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 07 08:37:04 crc kubenswrapper[5025]: I1007 08:37:04.484121 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 07 08:37:06 crc kubenswrapper[5025]: I1007 
08:37:06.500402 5025 generic.go:334] "Generic (PLEG): container finished" podID="1f8523c4-9270-4e9d-bb5c-2aa4bfef0f22" containerID="77f7516d7962feba3cc7f2520d60c9bad2c5da055f1c07185c88455c01f36edc" exitCode=137 Oct 07 08:37:06 crc kubenswrapper[5025]: I1007 08:37:06.500490 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1f8523c4-9270-4e9d-bb5c-2aa4bfef0f22","Type":"ContainerDied","Data":"77f7516d7962feba3cc7f2520d60c9bad2c5da055f1c07185c88455c01f36edc"} Oct 07 08:37:06 crc kubenswrapper[5025]: I1007 08:37:06.501192 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1f8523c4-9270-4e9d-bb5c-2aa4bfef0f22","Type":"ContainerDied","Data":"9cac546dc494bee91d7275ac53226279be24c00e356c092d63a690718ac1bc2c"} Oct 07 08:37:06 crc kubenswrapper[5025]: I1007 08:37:06.501228 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9cac546dc494bee91d7275ac53226279be24c00e356c092d63a690718ac1bc2c" Oct 07 08:37:06 crc kubenswrapper[5025]: I1007 08:37:06.510243 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 07 08:37:06 crc kubenswrapper[5025]: I1007 08:37:06.652910 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25k9v\" (UniqueName: \"kubernetes.io/projected/1f8523c4-9270-4e9d-bb5c-2aa4bfef0f22-kube-api-access-25k9v\") pod \"1f8523c4-9270-4e9d-bb5c-2aa4bfef0f22\" (UID: \"1f8523c4-9270-4e9d-bb5c-2aa4bfef0f22\") " Oct 07 08:37:06 crc kubenswrapper[5025]: I1007 08:37:06.653173 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f8523c4-9270-4e9d-bb5c-2aa4bfef0f22-config-data\") pod \"1f8523c4-9270-4e9d-bb5c-2aa4bfef0f22\" (UID: \"1f8523c4-9270-4e9d-bb5c-2aa4bfef0f22\") " Oct 07 08:37:06 crc kubenswrapper[5025]: I1007 08:37:06.653322 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f8523c4-9270-4e9d-bb5c-2aa4bfef0f22-combined-ca-bundle\") pod \"1f8523c4-9270-4e9d-bb5c-2aa4bfef0f22\" (UID: \"1f8523c4-9270-4e9d-bb5c-2aa4bfef0f22\") " Oct 07 08:37:06 crc kubenswrapper[5025]: I1007 08:37:06.703863 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f8523c4-9270-4e9d-bb5c-2aa4bfef0f22-kube-api-access-25k9v" (OuterVolumeSpecName: "kube-api-access-25k9v") pod "1f8523c4-9270-4e9d-bb5c-2aa4bfef0f22" (UID: "1f8523c4-9270-4e9d-bb5c-2aa4bfef0f22"). InnerVolumeSpecName "kube-api-access-25k9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:37:06 crc kubenswrapper[5025]: I1007 08:37:06.736843 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f8523c4-9270-4e9d-bb5c-2aa4bfef0f22-config-data" (OuterVolumeSpecName: "config-data") pod "1f8523c4-9270-4e9d-bb5c-2aa4bfef0f22" (UID: "1f8523c4-9270-4e9d-bb5c-2aa4bfef0f22"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:37:06 crc kubenswrapper[5025]: I1007 08:37:06.745722 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f8523c4-9270-4e9d-bb5c-2aa4bfef0f22-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f8523c4-9270-4e9d-bb5c-2aa4bfef0f22" (UID: "1f8523c4-9270-4e9d-bb5c-2aa4bfef0f22"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:37:06 crc kubenswrapper[5025]: I1007 08:37:06.756096 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25k9v\" (UniqueName: \"kubernetes.io/projected/1f8523c4-9270-4e9d-bb5c-2aa4bfef0f22-kube-api-access-25k9v\") on node \"crc\" DevicePath \"\"" Oct 07 08:37:06 crc kubenswrapper[5025]: I1007 08:37:06.756146 5025 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f8523c4-9270-4e9d-bb5c-2aa4bfef0f22-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 08:37:06 crc kubenswrapper[5025]: I1007 08:37:06.756159 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f8523c4-9270-4e9d-bb5c-2aa4bfef0f22-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:37:07 crc kubenswrapper[5025]: I1007 08:37:07.508706 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 07 08:37:07 crc kubenswrapper[5025]: I1007 08:37:07.548797 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 08:37:07 crc kubenswrapper[5025]: I1007 08:37:07.557633 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 08:37:07 crc kubenswrapper[5025]: I1007 08:37:07.581664 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 08:37:07 crc kubenswrapper[5025]: E1007 08:37:07.582150 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f8523c4-9270-4e9d-bb5c-2aa4bfef0f22" containerName="nova-cell1-novncproxy-novncproxy" Oct 07 08:37:07 crc kubenswrapper[5025]: I1007 08:37:07.582171 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f8523c4-9270-4e9d-bb5c-2aa4bfef0f22" containerName="nova-cell1-novncproxy-novncproxy" Oct 07 08:37:07 crc kubenswrapper[5025]: I1007 08:37:07.582462 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f8523c4-9270-4e9d-bb5c-2aa4bfef0f22" containerName="nova-cell1-novncproxy-novncproxy" Oct 07 08:37:07 crc kubenswrapper[5025]: I1007 08:37:07.583275 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 07 08:37:07 crc kubenswrapper[5025]: I1007 08:37:07.586521 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 07 08:37:07 crc kubenswrapper[5025]: I1007 08:37:07.587156 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Oct 07 08:37:07 crc kubenswrapper[5025]: I1007 08:37:07.593642 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Oct 07 08:37:07 crc kubenswrapper[5025]: I1007 08:37:07.609375 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 08:37:07 crc kubenswrapper[5025]: I1007 08:37:07.674464 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f770079-7ba4-4076-bd29-15fab31fc53d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8f770079-7ba4-4076-bd29-15fab31fc53d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 08:37:07 crc kubenswrapper[5025]: I1007 08:37:07.674679 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f770079-7ba4-4076-bd29-15fab31fc53d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8f770079-7ba4-4076-bd29-15fab31fc53d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 08:37:07 crc kubenswrapper[5025]: I1007 08:37:07.674887 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f770079-7ba4-4076-bd29-15fab31fc53d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8f770079-7ba4-4076-bd29-15fab31fc53d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 08:37:07 crc 
kubenswrapper[5025]: I1007 08:37:07.675213 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f770079-7ba4-4076-bd29-15fab31fc53d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8f770079-7ba4-4076-bd29-15fab31fc53d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 08:37:07 crc kubenswrapper[5025]: I1007 08:37:07.675300 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xg6s\" (UniqueName: \"kubernetes.io/projected/8f770079-7ba4-4076-bd29-15fab31fc53d-kube-api-access-7xg6s\") pod \"nova-cell1-novncproxy-0\" (UID: \"8f770079-7ba4-4076-bd29-15fab31fc53d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 08:37:07 crc kubenswrapper[5025]: I1007 08:37:07.777575 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f770079-7ba4-4076-bd29-15fab31fc53d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8f770079-7ba4-4076-bd29-15fab31fc53d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 08:37:07 crc kubenswrapper[5025]: I1007 08:37:07.777624 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f770079-7ba4-4076-bd29-15fab31fc53d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8f770079-7ba4-4076-bd29-15fab31fc53d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 08:37:07 crc kubenswrapper[5025]: I1007 08:37:07.777692 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f770079-7ba4-4076-bd29-15fab31fc53d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8f770079-7ba4-4076-bd29-15fab31fc53d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 08:37:07 crc 
kubenswrapper[5025]: I1007 08:37:07.777723 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xg6s\" (UniqueName: \"kubernetes.io/projected/8f770079-7ba4-4076-bd29-15fab31fc53d-kube-api-access-7xg6s\") pod \"nova-cell1-novncproxy-0\" (UID: \"8f770079-7ba4-4076-bd29-15fab31fc53d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 08:37:07 crc kubenswrapper[5025]: I1007 08:37:07.777765 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f770079-7ba4-4076-bd29-15fab31fc53d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8f770079-7ba4-4076-bd29-15fab31fc53d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 08:37:07 crc kubenswrapper[5025]: I1007 08:37:07.781322 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f770079-7ba4-4076-bd29-15fab31fc53d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8f770079-7ba4-4076-bd29-15fab31fc53d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 08:37:07 crc kubenswrapper[5025]: I1007 08:37:07.781721 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f770079-7ba4-4076-bd29-15fab31fc53d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8f770079-7ba4-4076-bd29-15fab31fc53d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 08:37:07 crc kubenswrapper[5025]: I1007 08:37:07.782033 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f770079-7ba4-4076-bd29-15fab31fc53d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8f770079-7ba4-4076-bd29-15fab31fc53d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 08:37:07 crc kubenswrapper[5025]: I1007 08:37:07.782747 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f770079-7ba4-4076-bd29-15fab31fc53d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8f770079-7ba4-4076-bd29-15fab31fc53d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 08:37:07 crc kubenswrapper[5025]: I1007 08:37:07.811103 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xg6s\" (UniqueName: \"kubernetes.io/projected/8f770079-7ba4-4076-bd29-15fab31fc53d-kube-api-access-7xg6s\") pod \"nova-cell1-novncproxy-0\" (UID: \"8f770079-7ba4-4076-bd29-15fab31fc53d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 08:37:07 crc kubenswrapper[5025]: I1007 08:37:07.907821 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 07 08:37:07 crc kubenswrapper[5025]: I1007 08:37:07.938267 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f8523c4-9270-4e9d-bb5c-2aa4bfef0f22" path="/var/lib/kubelet/pods/1f8523c4-9270-4e9d-bb5c-2aa4bfef0f22/volumes" Oct 07 08:37:08 crc kubenswrapper[5025]: I1007 08:37:08.185010 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 08:37:08 crc kubenswrapper[5025]: I1007 08:37:08.516704 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8f770079-7ba4-4076-bd29-15fab31fc53d","Type":"ContainerStarted","Data":"c224c26e70e8b206863d04bee8de16f8a66f613a49bcfb44926fbe08bbd97886"} Oct 07 08:37:08 crc kubenswrapper[5025]: I1007 08:37:08.517024 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8f770079-7ba4-4076-bd29-15fab31fc53d","Type":"ContainerStarted","Data":"c3c68c6543e6491e76f0bbb26f233e8307132662027970ec301e4a78036bcbab"} Oct 07 08:37:08 crc kubenswrapper[5025]: I1007 08:37:08.534756 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.534731914 podStartE2EDuration="1.534731914s" podCreationTimestamp="2025-10-07 08:37:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:37:08.530106888 +0000 UTC m=+1235.339421022" watchObservedRunningTime="2025-10-07 08:37:08.534731914 +0000 UTC m=+1235.344046068" Oct 07 08:37:09 crc kubenswrapper[5025]: I1007 08:37:09.825025 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 07 08:37:09 crc kubenswrapper[5025]: I1007 08:37:09.825590 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 07 08:37:09 crc kubenswrapper[5025]: I1007 08:37:09.829375 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 07 08:37:09 crc kubenswrapper[5025]: I1007 08:37:09.830209 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 07 08:37:10 crc kubenswrapper[5025]: I1007 08:37:10.538060 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 07 08:37:10 crc kubenswrapper[5025]: I1007 08:37:10.544195 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 07 08:37:10 crc kubenswrapper[5025]: I1007 08:37:10.761335 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-lmq8k"] Oct 07 08:37:10 crc kubenswrapper[5025]: I1007 08:37:10.763285 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-lmq8k" Oct 07 08:37:10 crc kubenswrapper[5025]: I1007 08:37:10.785636 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-lmq8k"] Oct 07 08:37:10 crc kubenswrapper[5025]: I1007 08:37:10.843073 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/219d3031-42b0-40df-98fc-46fe548a8f39-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-lmq8k\" (UID: \"219d3031-42b0-40df-98fc-46fe548a8f39\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-lmq8k" Oct 07 08:37:10 crc kubenswrapper[5025]: I1007 08:37:10.843200 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/219d3031-42b0-40df-98fc-46fe548a8f39-config\") pod \"dnsmasq-dns-5c7b6c5df9-lmq8k\" (UID: \"219d3031-42b0-40df-98fc-46fe548a8f39\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-lmq8k" Oct 07 08:37:10 crc kubenswrapper[5025]: I1007 08:37:10.843249 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/219d3031-42b0-40df-98fc-46fe548a8f39-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-lmq8k\" (UID: \"219d3031-42b0-40df-98fc-46fe548a8f39\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-lmq8k" Oct 07 08:37:10 crc kubenswrapper[5025]: I1007 08:37:10.843310 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frr77\" (UniqueName: \"kubernetes.io/projected/219d3031-42b0-40df-98fc-46fe548a8f39-kube-api-access-frr77\") pod \"dnsmasq-dns-5c7b6c5df9-lmq8k\" (UID: \"219d3031-42b0-40df-98fc-46fe548a8f39\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-lmq8k" Oct 07 08:37:10 crc kubenswrapper[5025]: I1007 08:37:10.843388 5025 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/219d3031-42b0-40df-98fc-46fe548a8f39-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-lmq8k\" (UID: \"219d3031-42b0-40df-98fc-46fe548a8f39\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-lmq8k" Oct 07 08:37:10 crc kubenswrapper[5025]: I1007 08:37:10.843417 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/219d3031-42b0-40df-98fc-46fe548a8f39-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-lmq8k\" (UID: \"219d3031-42b0-40df-98fc-46fe548a8f39\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-lmq8k" Oct 07 08:37:10 crc kubenswrapper[5025]: I1007 08:37:10.945294 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/219d3031-42b0-40df-98fc-46fe548a8f39-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-lmq8k\" (UID: \"219d3031-42b0-40df-98fc-46fe548a8f39\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-lmq8k" Oct 07 08:37:10 crc kubenswrapper[5025]: I1007 08:37:10.945406 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/219d3031-42b0-40df-98fc-46fe548a8f39-config\") pod \"dnsmasq-dns-5c7b6c5df9-lmq8k\" (UID: \"219d3031-42b0-40df-98fc-46fe548a8f39\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-lmq8k" Oct 07 08:37:10 crc kubenswrapper[5025]: I1007 08:37:10.945452 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/219d3031-42b0-40df-98fc-46fe548a8f39-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-lmq8k\" (UID: \"219d3031-42b0-40df-98fc-46fe548a8f39\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-lmq8k" Oct 07 08:37:10 crc kubenswrapper[5025]: I1007 08:37:10.945483 5025 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-frr77\" (UniqueName: \"kubernetes.io/projected/219d3031-42b0-40df-98fc-46fe548a8f39-kube-api-access-frr77\") pod \"dnsmasq-dns-5c7b6c5df9-lmq8k\" (UID: \"219d3031-42b0-40df-98fc-46fe548a8f39\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-lmq8k" Oct 07 08:37:10 crc kubenswrapper[5025]: I1007 08:37:10.945528 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/219d3031-42b0-40df-98fc-46fe548a8f39-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-lmq8k\" (UID: \"219d3031-42b0-40df-98fc-46fe548a8f39\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-lmq8k" Oct 07 08:37:10 crc kubenswrapper[5025]: I1007 08:37:10.945575 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/219d3031-42b0-40df-98fc-46fe548a8f39-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-lmq8k\" (UID: \"219d3031-42b0-40df-98fc-46fe548a8f39\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-lmq8k" Oct 07 08:37:10 crc kubenswrapper[5025]: I1007 08:37:10.946452 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/219d3031-42b0-40df-98fc-46fe548a8f39-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-lmq8k\" (UID: \"219d3031-42b0-40df-98fc-46fe548a8f39\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-lmq8k" Oct 07 08:37:10 crc kubenswrapper[5025]: I1007 08:37:10.946503 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/219d3031-42b0-40df-98fc-46fe548a8f39-config\") pod \"dnsmasq-dns-5c7b6c5df9-lmq8k\" (UID: \"219d3031-42b0-40df-98fc-46fe548a8f39\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-lmq8k" Oct 07 08:37:10 crc kubenswrapper[5025]: I1007 08:37:10.946685 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/219d3031-42b0-40df-98fc-46fe548a8f39-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-lmq8k\" (UID: \"219d3031-42b0-40df-98fc-46fe548a8f39\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-lmq8k" Oct 07 08:37:10 crc kubenswrapper[5025]: I1007 08:37:10.946921 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/219d3031-42b0-40df-98fc-46fe548a8f39-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-lmq8k\" (UID: \"219d3031-42b0-40df-98fc-46fe548a8f39\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-lmq8k" Oct 07 08:37:10 crc kubenswrapper[5025]: I1007 08:37:10.946921 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/219d3031-42b0-40df-98fc-46fe548a8f39-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-lmq8k\" (UID: \"219d3031-42b0-40df-98fc-46fe548a8f39\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-lmq8k" Oct 07 08:37:10 crc kubenswrapper[5025]: I1007 08:37:10.974820 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frr77\" (UniqueName: \"kubernetes.io/projected/219d3031-42b0-40df-98fc-46fe548a8f39-kube-api-access-frr77\") pod \"dnsmasq-dns-5c7b6c5df9-lmq8k\" (UID: \"219d3031-42b0-40df-98fc-46fe548a8f39\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-lmq8k" Oct 07 08:37:11 crc kubenswrapper[5025]: I1007 08:37:11.087444 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-lmq8k" Oct 07 08:37:11 crc kubenswrapper[5025]: I1007 08:37:11.799250 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-lmq8k"] Oct 07 08:37:11 crc kubenswrapper[5025]: W1007 08:37:11.804428 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod219d3031_42b0_40df_98fc_46fe548a8f39.slice/crio-a219c71588c1e0ad27128812d0496c6d2176b5c5102a180c840290910bb56f16 WatchSource:0}: Error finding container a219c71588c1e0ad27128812d0496c6d2176b5c5102a180c840290910bb56f16: Status 404 returned error can't find the container with id a219c71588c1e0ad27128812d0496c6d2176b5c5102a180c840290910bb56f16 Oct 07 08:37:12 crc kubenswrapper[5025]: I1007 08:37:12.556494 5025 generic.go:334] "Generic (PLEG): container finished" podID="219d3031-42b0-40df-98fc-46fe548a8f39" containerID="abb39eedae6ddb7273b901a428a0238e8dd161207695fb279307852070147dbe" exitCode=0 Oct 07 08:37:12 crc kubenswrapper[5025]: I1007 08:37:12.558241 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-lmq8k" event={"ID":"219d3031-42b0-40df-98fc-46fe548a8f39","Type":"ContainerDied","Data":"abb39eedae6ddb7273b901a428a0238e8dd161207695fb279307852070147dbe"} Oct 07 08:37:12 crc kubenswrapper[5025]: I1007 08:37:12.558278 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-lmq8k" event={"ID":"219d3031-42b0-40df-98fc-46fe548a8f39","Type":"ContainerStarted","Data":"a219c71588c1e0ad27128812d0496c6d2176b5c5102a180c840290910bb56f16"} Oct 07 08:37:12 crc kubenswrapper[5025]: I1007 08:37:12.835490 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 08:37:12 crc kubenswrapper[5025]: I1007 08:37:12.835834 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="71b56f8c-4382-4fe6-9674-5b73d51c53ee" containerName="ceilometer-central-agent" containerID="cri-o://a846e9d2f660068003faa53158852392cfc3a09bc5e2cd7c2199f5a066dd4d70" gracePeriod=30 Oct 07 08:37:12 crc kubenswrapper[5025]: I1007 08:37:12.836258 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="71b56f8c-4382-4fe6-9674-5b73d51c53ee" containerName="sg-core" containerID="cri-o://a70329cb4a559292435c124ba397894b1132f5e77d37831af09193acb0c17a50" gracePeriod=30 Oct 07 08:37:12 crc kubenswrapper[5025]: I1007 08:37:12.836268 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="71b56f8c-4382-4fe6-9674-5b73d51c53ee" containerName="proxy-httpd" containerID="cri-o://b83d1bf9130aa931ba528b73d1796e4102274982de0df60d07dcadf08176fad7" gracePeriod=30 Oct 07 08:37:12 crc kubenswrapper[5025]: I1007 08:37:12.836356 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="71b56f8c-4382-4fe6-9674-5b73d51c53ee" containerName="ceilometer-notification-agent" containerID="cri-o://6d3228ad66b66184fca6eeaac74341159b5046605cc2dea9815184ae8b9ee0ac" gracePeriod=30 Oct 07 08:37:12 crc kubenswrapper[5025]: I1007 08:37:12.851348 5025 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="71b56f8c-4382-4fe6-9674-5b73d51c53ee" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.196:3000/\": read tcp 10.217.0.2:41268->10.217.0.196:3000: read: connection reset by peer" Oct 07 08:37:12 crc kubenswrapper[5025]: I1007 08:37:12.908151 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 07 08:37:13 crc kubenswrapper[5025]: I1007 08:37:13.569968 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-lmq8k" 
event={"ID":"219d3031-42b0-40df-98fc-46fe548a8f39","Type":"ContainerStarted","Data":"22e9ac307d801dfcf026320bb8d6652c6f32dae3bd30c0d3a5d41209d16ed0b1"} Oct 07 08:37:13 crc kubenswrapper[5025]: I1007 08:37:13.570457 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c7b6c5df9-lmq8k" Oct 07 08:37:13 crc kubenswrapper[5025]: I1007 08:37:13.574806 5025 generic.go:334] "Generic (PLEG): container finished" podID="71b56f8c-4382-4fe6-9674-5b73d51c53ee" containerID="b83d1bf9130aa931ba528b73d1796e4102274982de0df60d07dcadf08176fad7" exitCode=0 Oct 07 08:37:13 crc kubenswrapper[5025]: I1007 08:37:13.574859 5025 generic.go:334] "Generic (PLEG): container finished" podID="71b56f8c-4382-4fe6-9674-5b73d51c53ee" containerID="a70329cb4a559292435c124ba397894b1132f5e77d37831af09193acb0c17a50" exitCode=2 Oct 07 08:37:13 crc kubenswrapper[5025]: I1007 08:37:13.574854 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71b56f8c-4382-4fe6-9674-5b73d51c53ee","Type":"ContainerDied","Data":"b83d1bf9130aa931ba528b73d1796e4102274982de0df60d07dcadf08176fad7"} Oct 07 08:37:13 crc kubenswrapper[5025]: I1007 08:37:13.574904 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71b56f8c-4382-4fe6-9674-5b73d51c53ee","Type":"ContainerDied","Data":"a70329cb4a559292435c124ba397894b1132f5e77d37831af09193acb0c17a50"} Oct 07 08:37:13 crc kubenswrapper[5025]: I1007 08:37:13.574919 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71b56f8c-4382-4fe6-9674-5b73d51c53ee","Type":"ContainerDied","Data":"a846e9d2f660068003faa53158852392cfc3a09bc5e2cd7c2199f5a066dd4d70"} Oct 07 08:37:13 crc kubenswrapper[5025]: I1007 08:37:13.574872 5025 generic.go:334] "Generic (PLEG): container finished" podID="71b56f8c-4382-4fe6-9674-5b73d51c53ee" containerID="a846e9d2f660068003faa53158852392cfc3a09bc5e2cd7c2199f5a066dd4d70" exitCode=0 Oct 07 
08:37:13 crc kubenswrapper[5025]: I1007 08:37:13.599587 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c7b6c5df9-lmq8k" podStartSLOduration=3.599561955 podStartE2EDuration="3.599561955s" podCreationTimestamp="2025-10-07 08:37:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:37:13.599001287 +0000 UTC m=+1240.408315431" watchObservedRunningTime="2025-10-07 08:37:13.599561955 +0000 UTC m=+1240.408876099" Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.088857 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.089496 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="622c3975-f5ce-4bdd-947f-dd79b2b66e57" containerName="nova-api-log" containerID="cri-o://9e3b82bfc7c4a84f47445e6f6f09d41612896aabe03608fa33cd4c6d834bc03a" gracePeriod=30 Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.089966 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="622c3975-f5ce-4bdd-947f-dd79b2b66e57" containerName="nova-api-api" containerID="cri-o://10b9f5700cc4e24590cffcbcde7cac0af1a3631605fcea78d34ab2f307f7a30d" gracePeriod=30 Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.482101 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.599424 5025 generic.go:334] "Generic (PLEG): container finished" podID="71b56f8c-4382-4fe6-9674-5b73d51c53ee" containerID="6d3228ad66b66184fca6eeaac74341159b5046605cc2dea9815184ae8b9ee0ac" exitCode=0 Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.599484 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.599520 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71b56f8c-4382-4fe6-9674-5b73d51c53ee","Type":"ContainerDied","Data":"6d3228ad66b66184fca6eeaac74341159b5046605cc2dea9815184ae8b9ee0ac"} Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.599595 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71b56f8c-4382-4fe6-9674-5b73d51c53ee","Type":"ContainerDied","Data":"14b30d6c609e55edb2d6fb8fd95edd0b090fdf5bf6d0b1d88416230cb8e9777a"} Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.599616 5025 scope.go:117] "RemoveContainer" containerID="b83d1bf9130aa931ba528b73d1796e4102274982de0df60d07dcadf08176fad7" Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.616484 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmfps\" (UniqueName: \"kubernetes.io/projected/71b56f8c-4382-4fe6-9674-5b73d51c53ee-kube-api-access-kmfps\") pod \"71b56f8c-4382-4fe6-9674-5b73d51c53ee\" (UID: \"71b56f8c-4382-4fe6-9674-5b73d51c53ee\") " Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.616530 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71b56f8c-4382-4fe6-9674-5b73d51c53ee-config-data\") pod \"71b56f8c-4382-4fe6-9674-5b73d51c53ee\" (UID: \"71b56f8c-4382-4fe6-9674-5b73d51c53ee\") " Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.616641 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71b56f8c-4382-4fe6-9674-5b73d51c53ee-scripts\") pod \"71b56f8c-4382-4fe6-9674-5b73d51c53ee\" (UID: \"71b56f8c-4382-4fe6-9674-5b73d51c53ee\") " Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.616716 5025 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/71b56f8c-4382-4fe6-9674-5b73d51c53ee-sg-core-conf-yaml\") pod \"71b56f8c-4382-4fe6-9674-5b73d51c53ee\" (UID: \"71b56f8c-4382-4fe6-9674-5b73d51c53ee\") " Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.616792 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/71b56f8c-4382-4fe6-9674-5b73d51c53ee-ceilometer-tls-certs\") pod \"71b56f8c-4382-4fe6-9674-5b73d51c53ee\" (UID: \"71b56f8c-4382-4fe6-9674-5b73d51c53ee\") " Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.616811 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71b56f8c-4382-4fe6-9674-5b73d51c53ee-run-httpd\") pod \"71b56f8c-4382-4fe6-9674-5b73d51c53ee\" (UID: \"71b56f8c-4382-4fe6-9674-5b73d51c53ee\") " Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.616836 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71b56f8c-4382-4fe6-9674-5b73d51c53ee-combined-ca-bundle\") pod \"71b56f8c-4382-4fe6-9674-5b73d51c53ee\" (UID: \"71b56f8c-4382-4fe6-9674-5b73d51c53ee\") " Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.616857 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71b56f8c-4382-4fe6-9674-5b73d51c53ee-log-httpd\") pod \"71b56f8c-4382-4fe6-9674-5b73d51c53ee\" (UID: \"71b56f8c-4382-4fe6-9674-5b73d51c53ee\") " Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.617701 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71b56f8c-4382-4fe6-9674-5b73d51c53ee-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "71b56f8c-4382-4fe6-9674-5b73d51c53ee" (UID: 
"71b56f8c-4382-4fe6-9674-5b73d51c53ee"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.618829 5025 generic.go:334] "Generic (PLEG): container finished" podID="622c3975-f5ce-4bdd-947f-dd79b2b66e57" containerID="9e3b82bfc7c4a84f47445e6f6f09d41612896aabe03608fa33cd4c6d834bc03a" exitCode=143 Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.620016 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"622c3975-f5ce-4bdd-947f-dd79b2b66e57","Type":"ContainerDied","Data":"9e3b82bfc7c4a84f47445e6f6f09d41612896aabe03608fa33cd4c6d834bc03a"} Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.620376 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71b56f8c-4382-4fe6-9674-5b73d51c53ee-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "71b56f8c-4382-4fe6-9674-5b73d51c53ee" (UID: "71b56f8c-4382-4fe6-9674-5b73d51c53ee"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.648824 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71b56f8c-4382-4fe6-9674-5b73d51c53ee-kube-api-access-kmfps" (OuterVolumeSpecName: "kube-api-access-kmfps") pod "71b56f8c-4382-4fe6-9674-5b73d51c53ee" (UID: "71b56f8c-4382-4fe6-9674-5b73d51c53ee"). InnerVolumeSpecName "kube-api-access-kmfps". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.651658 5025 scope.go:117] "RemoveContainer" containerID="a70329cb4a559292435c124ba397894b1132f5e77d37831af09193acb0c17a50" Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.672743 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71b56f8c-4382-4fe6-9674-5b73d51c53ee-scripts" (OuterVolumeSpecName: "scripts") pod "71b56f8c-4382-4fe6-9674-5b73d51c53ee" (UID: "71b56f8c-4382-4fe6-9674-5b73d51c53ee"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.715445 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71b56f8c-4382-4fe6-9674-5b73d51c53ee-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "71b56f8c-4382-4fe6-9674-5b73d51c53ee" (UID: "71b56f8c-4382-4fe6-9674-5b73d51c53ee"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.718466 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmfps\" (UniqueName: \"kubernetes.io/projected/71b56f8c-4382-4fe6-9674-5b73d51c53ee-kube-api-access-kmfps\") on node \"crc\" DevicePath \"\"" Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.718489 5025 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71b56f8c-4382-4fe6-9674-5b73d51c53ee-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.718497 5025 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/71b56f8c-4382-4fe6-9674-5b73d51c53ee-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.718506 5025 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71b56f8c-4382-4fe6-9674-5b73d51c53ee-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.718514 5025 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71b56f8c-4382-4fe6-9674-5b73d51c53ee-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.719496 5025 scope.go:117] "RemoveContainer" containerID="6d3228ad66b66184fca6eeaac74341159b5046605cc2dea9815184ae8b9ee0ac" Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.741730 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71b56f8c-4382-4fe6-9674-5b73d51c53ee-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "71b56f8c-4382-4fe6-9674-5b73d51c53ee" (UID: "71b56f8c-4382-4fe6-9674-5b73d51c53ee"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.758986 5025 scope.go:117] "RemoveContainer" containerID="a846e9d2f660068003faa53158852392cfc3a09bc5e2cd7c2199f5a066dd4d70" Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.779334 5025 scope.go:117] "RemoveContainer" containerID="b83d1bf9130aa931ba528b73d1796e4102274982de0df60d07dcadf08176fad7" Oct 07 08:37:14 crc kubenswrapper[5025]: E1007 08:37:14.779903 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b83d1bf9130aa931ba528b73d1796e4102274982de0df60d07dcadf08176fad7\": container with ID starting with b83d1bf9130aa931ba528b73d1796e4102274982de0df60d07dcadf08176fad7 not found: ID does not exist" containerID="b83d1bf9130aa931ba528b73d1796e4102274982de0df60d07dcadf08176fad7" Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.779945 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b83d1bf9130aa931ba528b73d1796e4102274982de0df60d07dcadf08176fad7"} err="failed to get container status \"b83d1bf9130aa931ba528b73d1796e4102274982de0df60d07dcadf08176fad7\": rpc error: code = NotFound desc = could not find container \"b83d1bf9130aa931ba528b73d1796e4102274982de0df60d07dcadf08176fad7\": container with ID starting with b83d1bf9130aa931ba528b73d1796e4102274982de0df60d07dcadf08176fad7 not found: ID does not exist" Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.779973 5025 scope.go:117] "RemoveContainer" containerID="a70329cb4a559292435c124ba397894b1132f5e77d37831af09193acb0c17a50" Oct 07 08:37:14 crc kubenswrapper[5025]: E1007 08:37:14.780324 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a70329cb4a559292435c124ba397894b1132f5e77d37831af09193acb0c17a50\": container with ID starting with 
a70329cb4a559292435c124ba397894b1132f5e77d37831af09193acb0c17a50 not found: ID does not exist" containerID="a70329cb4a559292435c124ba397894b1132f5e77d37831af09193acb0c17a50" Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.780342 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a70329cb4a559292435c124ba397894b1132f5e77d37831af09193acb0c17a50"} err="failed to get container status \"a70329cb4a559292435c124ba397894b1132f5e77d37831af09193acb0c17a50\": rpc error: code = NotFound desc = could not find container \"a70329cb4a559292435c124ba397894b1132f5e77d37831af09193acb0c17a50\": container with ID starting with a70329cb4a559292435c124ba397894b1132f5e77d37831af09193acb0c17a50 not found: ID does not exist" Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.780355 5025 scope.go:117] "RemoveContainer" containerID="6d3228ad66b66184fca6eeaac74341159b5046605cc2dea9815184ae8b9ee0ac" Oct 07 08:37:14 crc kubenswrapper[5025]: E1007 08:37:14.780718 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d3228ad66b66184fca6eeaac74341159b5046605cc2dea9815184ae8b9ee0ac\": container with ID starting with 6d3228ad66b66184fca6eeaac74341159b5046605cc2dea9815184ae8b9ee0ac not found: ID does not exist" containerID="6d3228ad66b66184fca6eeaac74341159b5046605cc2dea9815184ae8b9ee0ac" Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.780736 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d3228ad66b66184fca6eeaac74341159b5046605cc2dea9815184ae8b9ee0ac"} err="failed to get container status \"6d3228ad66b66184fca6eeaac74341159b5046605cc2dea9815184ae8b9ee0ac\": rpc error: code = NotFound desc = could not find container \"6d3228ad66b66184fca6eeaac74341159b5046605cc2dea9815184ae8b9ee0ac\": container with ID starting with 6d3228ad66b66184fca6eeaac74341159b5046605cc2dea9815184ae8b9ee0ac not found: ID does not 
exist" Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.780747 5025 scope.go:117] "RemoveContainer" containerID="a846e9d2f660068003faa53158852392cfc3a09bc5e2cd7c2199f5a066dd4d70" Oct 07 08:37:14 crc kubenswrapper[5025]: E1007 08:37:14.781073 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a846e9d2f660068003faa53158852392cfc3a09bc5e2cd7c2199f5a066dd4d70\": container with ID starting with a846e9d2f660068003faa53158852392cfc3a09bc5e2cd7c2199f5a066dd4d70 not found: ID does not exist" containerID="a846e9d2f660068003faa53158852392cfc3a09bc5e2cd7c2199f5a066dd4d70" Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.781088 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a846e9d2f660068003faa53158852392cfc3a09bc5e2cd7c2199f5a066dd4d70"} err="failed to get container status \"a846e9d2f660068003faa53158852392cfc3a09bc5e2cd7c2199f5a066dd4d70\": rpc error: code = NotFound desc = could not find container \"a846e9d2f660068003faa53158852392cfc3a09bc5e2cd7c2199f5a066dd4d70\": container with ID starting with a846e9d2f660068003faa53158852392cfc3a09bc5e2cd7c2199f5a066dd4d70 not found: ID does not exist" Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.793564 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71b56f8c-4382-4fe6-9674-5b73d51c53ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "71b56f8c-4382-4fe6-9674-5b73d51c53ee" (UID: "71b56f8c-4382-4fe6-9674-5b73d51c53ee"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.813429 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71b56f8c-4382-4fe6-9674-5b73d51c53ee-config-data" (OuterVolumeSpecName: "config-data") pod "71b56f8c-4382-4fe6-9674-5b73d51c53ee" (UID: "71b56f8c-4382-4fe6-9674-5b73d51c53ee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.820825 5025 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/71b56f8c-4382-4fe6-9674-5b73d51c53ee-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.820862 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71b56f8c-4382-4fe6-9674-5b73d51c53ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.820877 5025 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71b56f8c-4382-4fe6-9674-5b73d51c53ee-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.931651 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.941636 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.965267 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 07 08:37:14 crc kubenswrapper[5025]: E1007 08:37:14.965674 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71b56f8c-4382-4fe6-9674-5b73d51c53ee" containerName="sg-core" Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.965694 5025 
state_mem.go:107] "Deleted CPUSet assignment" podUID="71b56f8c-4382-4fe6-9674-5b73d51c53ee" containerName="sg-core" Oct 07 08:37:14 crc kubenswrapper[5025]: E1007 08:37:14.965713 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71b56f8c-4382-4fe6-9674-5b73d51c53ee" containerName="ceilometer-notification-agent" Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.965721 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="71b56f8c-4382-4fe6-9674-5b73d51c53ee" containerName="ceilometer-notification-agent" Oct 07 08:37:14 crc kubenswrapper[5025]: E1007 08:37:14.965748 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71b56f8c-4382-4fe6-9674-5b73d51c53ee" containerName="proxy-httpd" Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.965757 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="71b56f8c-4382-4fe6-9674-5b73d51c53ee" containerName="proxy-httpd" Oct 07 08:37:14 crc kubenswrapper[5025]: E1007 08:37:14.965790 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71b56f8c-4382-4fe6-9674-5b73d51c53ee" containerName="ceilometer-central-agent" Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.965798 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="71b56f8c-4382-4fe6-9674-5b73d51c53ee" containerName="ceilometer-central-agent" Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.966012 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="71b56f8c-4382-4fe6-9674-5b73d51c53ee" containerName="ceilometer-notification-agent" Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.966041 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="71b56f8c-4382-4fe6-9674-5b73d51c53ee" containerName="ceilometer-central-agent" Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.966051 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="71b56f8c-4382-4fe6-9674-5b73d51c53ee" containerName="proxy-httpd" Oct 07 08:37:14 crc 
kubenswrapper[5025]: I1007 08:37:14.966073 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="71b56f8c-4382-4fe6-9674-5b73d51c53ee" containerName="sg-core" Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.968054 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.970913 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.971100 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.971407 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 07 08:37:14 crc kubenswrapper[5025]: I1007 08:37:14.985168 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 08:37:15 crc kubenswrapper[5025]: I1007 08:37:15.023370 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a86e702-ade6-409c-9a35-0d83af92bc0f-config-data\") pod \"ceilometer-0\" (UID: \"8a86e702-ade6-409c-9a35-0d83af92bc0f\") " pod="openstack/ceilometer-0" Oct 07 08:37:15 crc kubenswrapper[5025]: I1007 08:37:15.023423 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a86e702-ade6-409c-9a35-0d83af92bc0f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8a86e702-ade6-409c-9a35-0d83af92bc0f\") " pod="openstack/ceilometer-0" Oct 07 08:37:15 crc kubenswrapper[5025]: I1007 08:37:15.023578 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a86e702-ade6-409c-9a35-0d83af92bc0f-scripts\") 
pod \"ceilometer-0\" (UID: \"8a86e702-ade6-409c-9a35-0d83af92bc0f\") " pod="openstack/ceilometer-0" Oct 07 08:37:15 crc kubenswrapper[5025]: I1007 08:37:15.023605 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v24qn\" (UniqueName: \"kubernetes.io/projected/8a86e702-ade6-409c-9a35-0d83af92bc0f-kube-api-access-v24qn\") pod \"ceilometer-0\" (UID: \"8a86e702-ade6-409c-9a35-0d83af92bc0f\") " pod="openstack/ceilometer-0" Oct 07 08:37:15 crc kubenswrapper[5025]: I1007 08:37:15.023636 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a86e702-ade6-409c-9a35-0d83af92bc0f-run-httpd\") pod \"ceilometer-0\" (UID: \"8a86e702-ade6-409c-9a35-0d83af92bc0f\") " pod="openstack/ceilometer-0" Oct 07 08:37:15 crc kubenswrapper[5025]: I1007 08:37:15.023666 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a86e702-ade6-409c-9a35-0d83af92bc0f-log-httpd\") pod \"ceilometer-0\" (UID: \"8a86e702-ade6-409c-9a35-0d83af92bc0f\") " pod="openstack/ceilometer-0" Oct 07 08:37:15 crc kubenswrapper[5025]: I1007 08:37:15.023695 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a86e702-ade6-409c-9a35-0d83af92bc0f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8a86e702-ade6-409c-9a35-0d83af92bc0f\") " pod="openstack/ceilometer-0" Oct 07 08:37:15 crc kubenswrapper[5025]: I1007 08:37:15.023726 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a86e702-ade6-409c-9a35-0d83af92bc0f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8a86e702-ade6-409c-9a35-0d83af92bc0f\") " pod="openstack/ceilometer-0" Oct 07 
08:37:15 crc kubenswrapper[5025]: I1007 08:37:15.125991 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a86e702-ade6-409c-9a35-0d83af92bc0f-config-data\") pod \"ceilometer-0\" (UID: \"8a86e702-ade6-409c-9a35-0d83af92bc0f\") " pod="openstack/ceilometer-0" Oct 07 08:37:15 crc kubenswrapper[5025]: I1007 08:37:15.126049 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a86e702-ade6-409c-9a35-0d83af92bc0f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8a86e702-ade6-409c-9a35-0d83af92bc0f\") " pod="openstack/ceilometer-0" Oct 07 08:37:15 crc kubenswrapper[5025]: I1007 08:37:15.126162 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a86e702-ade6-409c-9a35-0d83af92bc0f-scripts\") pod \"ceilometer-0\" (UID: \"8a86e702-ade6-409c-9a35-0d83af92bc0f\") " pod="openstack/ceilometer-0" Oct 07 08:37:15 crc kubenswrapper[5025]: I1007 08:37:15.126206 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v24qn\" (UniqueName: \"kubernetes.io/projected/8a86e702-ade6-409c-9a35-0d83af92bc0f-kube-api-access-v24qn\") pod \"ceilometer-0\" (UID: \"8a86e702-ade6-409c-9a35-0d83af92bc0f\") " pod="openstack/ceilometer-0" Oct 07 08:37:15 crc kubenswrapper[5025]: I1007 08:37:15.126234 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a86e702-ade6-409c-9a35-0d83af92bc0f-run-httpd\") pod \"ceilometer-0\" (UID: \"8a86e702-ade6-409c-9a35-0d83af92bc0f\") " pod="openstack/ceilometer-0" Oct 07 08:37:15 crc kubenswrapper[5025]: I1007 08:37:15.126259 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/8a86e702-ade6-409c-9a35-0d83af92bc0f-log-httpd\") pod \"ceilometer-0\" (UID: \"8a86e702-ade6-409c-9a35-0d83af92bc0f\") " pod="openstack/ceilometer-0" Oct 07 08:37:15 crc kubenswrapper[5025]: I1007 08:37:15.126297 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a86e702-ade6-409c-9a35-0d83af92bc0f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8a86e702-ade6-409c-9a35-0d83af92bc0f\") " pod="openstack/ceilometer-0" Oct 07 08:37:15 crc kubenswrapper[5025]: I1007 08:37:15.126327 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a86e702-ade6-409c-9a35-0d83af92bc0f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8a86e702-ade6-409c-9a35-0d83af92bc0f\") " pod="openstack/ceilometer-0" Oct 07 08:37:15 crc kubenswrapper[5025]: I1007 08:37:15.126789 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a86e702-ade6-409c-9a35-0d83af92bc0f-run-httpd\") pod \"ceilometer-0\" (UID: \"8a86e702-ade6-409c-9a35-0d83af92bc0f\") " pod="openstack/ceilometer-0" Oct 07 08:37:15 crc kubenswrapper[5025]: I1007 08:37:15.126982 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a86e702-ade6-409c-9a35-0d83af92bc0f-log-httpd\") pod \"ceilometer-0\" (UID: \"8a86e702-ade6-409c-9a35-0d83af92bc0f\") " pod="openstack/ceilometer-0" Oct 07 08:37:15 crc kubenswrapper[5025]: I1007 08:37:15.129290 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a86e702-ade6-409c-9a35-0d83af92bc0f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8a86e702-ade6-409c-9a35-0d83af92bc0f\") " pod="openstack/ceilometer-0" Oct 07 08:37:15 crc kubenswrapper[5025]: I1007 08:37:15.129444 
5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a86e702-ade6-409c-9a35-0d83af92bc0f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8a86e702-ade6-409c-9a35-0d83af92bc0f\") " pod="openstack/ceilometer-0" Oct 07 08:37:15 crc kubenswrapper[5025]: I1007 08:37:15.129887 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a86e702-ade6-409c-9a35-0d83af92bc0f-scripts\") pod \"ceilometer-0\" (UID: \"8a86e702-ade6-409c-9a35-0d83af92bc0f\") " pod="openstack/ceilometer-0" Oct 07 08:37:15 crc kubenswrapper[5025]: I1007 08:37:15.130138 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a86e702-ade6-409c-9a35-0d83af92bc0f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8a86e702-ade6-409c-9a35-0d83af92bc0f\") " pod="openstack/ceilometer-0" Oct 07 08:37:15 crc kubenswrapper[5025]: I1007 08:37:15.131949 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a86e702-ade6-409c-9a35-0d83af92bc0f-config-data\") pod \"ceilometer-0\" (UID: \"8a86e702-ade6-409c-9a35-0d83af92bc0f\") " pod="openstack/ceilometer-0" Oct 07 08:37:15 crc kubenswrapper[5025]: I1007 08:37:15.147365 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v24qn\" (UniqueName: \"kubernetes.io/projected/8a86e702-ade6-409c-9a35-0d83af92bc0f-kube-api-access-v24qn\") pod \"ceilometer-0\" (UID: \"8a86e702-ade6-409c-9a35-0d83af92bc0f\") " pod="openstack/ceilometer-0" Oct 07 08:37:15 crc kubenswrapper[5025]: I1007 08:37:15.284366 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 08:37:15 crc kubenswrapper[5025]: W1007 08:37:15.782137 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a86e702_ade6_409c_9a35_0d83af92bc0f.slice/crio-bc32ec9ce1e8fb57b0706a8faf16dfec761c09bd28c17830a9a2bb9306f34b31 WatchSource:0}: Error finding container bc32ec9ce1e8fb57b0706a8faf16dfec761c09bd28c17830a9a2bb9306f34b31: Status 404 returned error can't find the container with id bc32ec9ce1e8fb57b0706a8faf16dfec761c09bd28c17830a9a2bb9306f34b31 Oct 07 08:37:15 crc kubenswrapper[5025]: I1007 08:37:15.785345 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 08:37:15 crc kubenswrapper[5025]: I1007 08:37:15.924734 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71b56f8c-4382-4fe6-9674-5b73d51c53ee" path="/var/lib/kubelet/pods/71b56f8c-4382-4fe6-9674-5b73d51c53ee/volumes" Oct 07 08:37:16 crc kubenswrapper[5025]: I1007 08:37:16.654581 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a86e702-ade6-409c-9a35-0d83af92bc0f","Type":"ContainerStarted","Data":"b5fb70bfd48a47f34ee8ca76c936b3ff222f4f5b05eea41b2155572028e73d3e"} Oct 07 08:37:16 crc kubenswrapper[5025]: I1007 08:37:16.654883 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a86e702-ade6-409c-9a35-0d83af92bc0f","Type":"ContainerStarted","Data":"bc32ec9ce1e8fb57b0706a8faf16dfec761c09bd28c17830a9a2bb9306f34b31"} Oct 07 08:37:16 crc kubenswrapper[5025]: I1007 08:37:16.875201 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 08:37:17 crc kubenswrapper[5025]: I1007 08:37:17.663908 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"8a86e702-ade6-409c-9a35-0d83af92bc0f","Type":"ContainerStarted","Data":"873858a4d219abbe637fb5e258f1c37c153d1865b09ea7c3160bbfaf2f2b050f"} Oct 07 08:37:17 crc kubenswrapper[5025]: I1007 08:37:17.666832 5025 generic.go:334] "Generic (PLEG): container finished" podID="622c3975-f5ce-4bdd-947f-dd79b2b66e57" containerID="10b9f5700cc4e24590cffcbcde7cac0af1a3631605fcea78d34ab2f307f7a30d" exitCode=0 Oct 07 08:37:17 crc kubenswrapper[5025]: I1007 08:37:17.666873 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"622c3975-f5ce-4bdd-947f-dd79b2b66e57","Type":"ContainerDied","Data":"10b9f5700cc4e24590cffcbcde7cac0af1a3631605fcea78d34ab2f307f7a30d"} Oct 07 08:37:17 crc kubenswrapper[5025]: I1007 08:37:17.666897 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"622c3975-f5ce-4bdd-947f-dd79b2b66e57","Type":"ContainerDied","Data":"7911e093a3abc7ec316831ff445f1b689ebcc359a0f42b777be92a9dba9eb46c"} Oct 07 08:37:17 crc kubenswrapper[5025]: I1007 08:37:17.666908 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7911e093a3abc7ec316831ff445f1b689ebcc359a0f42b777be92a9dba9eb46c" Oct 07 08:37:17 crc kubenswrapper[5025]: I1007 08:37:17.730267 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 07 08:37:17 crc kubenswrapper[5025]: I1007 08:37:17.794032 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/622c3975-f5ce-4bdd-947f-dd79b2b66e57-combined-ca-bundle\") pod \"622c3975-f5ce-4bdd-947f-dd79b2b66e57\" (UID: \"622c3975-f5ce-4bdd-947f-dd79b2b66e57\") " Oct 07 08:37:17 crc kubenswrapper[5025]: I1007 08:37:17.794127 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/622c3975-f5ce-4bdd-947f-dd79b2b66e57-config-data\") pod \"622c3975-f5ce-4bdd-947f-dd79b2b66e57\" (UID: \"622c3975-f5ce-4bdd-947f-dd79b2b66e57\") " Oct 07 08:37:17 crc kubenswrapper[5025]: I1007 08:37:17.794255 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/622c3975-f5ce-4bdd-947f-dd79b2b66e57-logs\") pod \"622c3975-f5ce-4bdd-947f-dd79b2b66e57\" (UID: \"622c3975-f5ce-4bdd-947f-dd79b2b66e57\") " Oct 07 08:37:17 crc kubenswrapper[5025]: I1007 08:37:17.794322 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mw6qg\" (UniqueName: \"kubernetes.io/projected/622c3975-f5ce-4bdd-947f-dd79b2b66e57-kube-api-access-mw6qg\") pod \"622c3975-f5ce-4bdd-947f-dd79b2b66e57\" (UID: \"622c3975-f5ce-4bdd-947f-dd79b2b66e57\") " Oct 07 08:37:17 crc kubenswrapper[5025]: I1007 08:37:17.795731 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/622c3975-f5ce-4bdd-947f-dd79b2b66e57-logs" (OuterVolumeSpecName: "logs") pod "622c3975-f5ce-4bdd-947f-dd79b2b66e57" (UID: "622c3975-f5ce-4bdd-947f-dd79b2b66e57"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:37:17 crc kubenswrapper[5025]: I1007 08:37:17.800757 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/622c3975-f5ce-4bdd-947f-dd79b2b66e57-kube-api-access-mw6qg" (OuterVolumeSpecName: "kube-api-access-mw6qg") pod "622c3975-f5ce-4bdd-947f-dd79b2b66e57" (UID: "622c3975-f5ce-4bdd-947f-dd79b2b66e57"). InnerVolumeSpecName "kube-api-access-mw6qg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:37:17 crc kubenswrapper[5025]: I1007 08:37:17.822392 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/622c3975-f5ce-4bdd-947f-dd79b2b66e57-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "622c3975-f5ce-4bdd-947f-dd79b2b66e57" (UID: "622c3975-f5ce-4bdd-947f-dd79b2b66e57"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:37:17 crc kubenswrapper[5025]: I1007 08:37:17.843331 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/622c3975-f5ce-4bdd-947f-dd79b2b66e57-config-data" (OuterVolumeSpecName: "config-data") pod "622c3975-f5ce-4bdd-947f-dd79b2b66e57" (UID: "622c3975-f5ce-4bdd-947f-dd79b2b66e57"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:37:17 crc kubenswrapper[5025]: I1007 08:37:17.897722 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/622c3975-f5ce-4bdd-947f-dd79b2b66e57-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:37:17 crc kubenswrapper[5025]: I1007 08:37:17.897766 5025 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/622c3975-f5ce-4bdd-947f-dd79b2b66e57-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 08:37:17 crc kubenswrapper[5025]: I1007 08:37:17.897777 5025 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/622c3975-f5ce-4bdd-947f-dd79b2b66e57-logs\") on node \"crc\" DevicePath \"\"" Oct 07 08:37:17 crc kubenswrapper[5025]: I1007 08:37:17.897785 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mw6qg\" (UniqueName: \"kubernetes.io/projected/622c3975-f5ce-4bdd-947f-dd79b2b66e57-kube-api-access-mw6qg\") on node \"crc\" DevicePath \"\"" Oct 07 08:37:17 crc kubenswrapper[5025]: I1007 08:37:17.908783 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 07 08:37:17 crc kubenswrapper[5025]: I1007 08:37:17.928669 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 07 08:37:18 crc kubenswrapper[5025]: I1007 08:37:18.678511 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 07 08:37:18 crc kubenswrapper[5025]: I1007 08:37:18.679737 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a86e702-ade6-409c-9a35-0d83af92bc0f","Type":"ContainerStarted","Data":"9277cc65770f3b0514a117fca283ee251bf8211ac117dbb110fa52c360e087ee"} Oct 07 08:37:18 crc kubenswrapper[5025]: I1007 08:37:18.702080 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 07 08:37:18 crc kubenswrapper[5025]: I1007 08:37:18.713988 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 07 08:37:18 crc kubenswrapper[5025]: I1007 08:37:18.715602 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 07 08:37:18 crc kubenswrapper[5025]: I1007 08:37:18.729186 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 07 08:37:18 crc kubenswrapper[5025]: E1007 08:37:18.729625 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="622c3975-f5ce-4bdd-947f-dd79b2b66e57" containerName="nova-api-api" Oct 07 08:37:18 crc kubenswrapper[5025]: I1007 08:37:18.729648 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="622c3975-f5ce-4bdd-947f-dd79b2b66e57" containerName="nova-api-api" Oct 07 08:37:18 crc kubenswrapper[5025]: E1007 08:37:18.729679 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="622c3975-f5ce-4bdd-947f-dd79b2b66e57" containerName="nova-api-log" Oct 07 08:37:18 crc kubenswrapper[5025]: I1007 08:37:18.729688 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="622c3975-f5ce-4bdd-947f-dd79b2b66e57" containerName="nova-api-log" Oct 07 08:37:18 crc kubenswrapper[5025]: I1007 08:37:18.729888 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="622c3975-f5ce-4bdd-947f-dd79b2b66e57" containerName="nova-api-api" Oct 07 08:37:18 crc kubenswrapper[5025]: I1007 
08:37:18.729905 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="622c3975-f5ce-4bdd-947f-dd79b2b66e57" containerName="nova-api-log" Oct 07 08:37:18 crc kubenswrapper[5025]: I1007 08:37:18.730923 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 08:37:18 crc kubenswrapper[5025]: I1007 08:37:18.736821 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 07 08:37:18 crc kubenswrapper[5025]: I1007 08:37:18.736985 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 07 08:37:18 crc kubenswrapper[5025]: I1007 08:37:18.740858 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 07 08:37:18 crc kubenswrapper[5025]: I1007 08:37:18.750841 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 08:37:18 crc kubenswrapper[5025]: I1007 08:37:18.813755 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2316326-41f9-441a-8cd3-d986cc8b669d-public-tls-certs\") pod \"nova-api-0\" (UID: \"c2316326-41f9-441a-8cd3-d986cc8b669d\") " pod="openstack/nova-api-0" Oct 07 08:37:18 crc kubenswrapper[5025]: I1007 08:37:18.813881 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2316326-41f9-441a-8cd3-d986cc8b669d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c2316326-41f9-441a-8cd3-d986cc8b669d\") " pod="openstack/nova-api-0" Oct 07 08:37:18 crc kubenswrapper[5025]: I1007 08:37:18.813959 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49zpv\" (UniqueName: 
\"kubernetes.io/projected/c2316326-41f9-441a-8cd3-d986cc8b669d-kube-api-access-49zpv\") pod \"nova-api-0\" (UID: \"c2316326-41f9-441a-8cd3-d986cc8b669d\") " pod="openstack/nova-api-0" Oct 07 08:37:18 crc kubenswrapper[5025]: I1007 08:37:18.814007 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2316326-41f9-441a-8cd3-d986cc8b669d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c2316326-41f9-441a-8cd3-d986cc8b669d\") " pod="openstack/nova-api-0" Oct 07 08:37:18 crc kubenswrapper[5025]: I1007 08:37:18.814099 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2316326-41f9-441a-8cd3-d986cc8b669d-config-data\") pod \"nova-api-0\" (UID: \"c2316326-41f9-441a-8cd3-d986cc8b669d\") " pod="openstack/nova-api-0" Oct 07 08:37:18 crc kubenswrapper[5025]: I1007 08:37:18.814149 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2316326-41f9-441a-8cd3-d986cc8b669d-logs\") pod \"nova-api-0\" (UID: \"c2316326-41f9-441a-8cd3-d986cc8b669d\") " pod="openstack/nova-api-0" Oct 07 08:37:18 crc kubenswrapper[5025]: I1007 08:37:18.876004 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-g9bxn"] Oct 07 08:37:18 crc kubenswrapper[5025]: I1007 08:37:18.877474 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-g9bxn" Oct 07 08:37:18 crc kubenswrapper[5025]: I1007 08:37:18.880171 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 07 08:37:18 crc kubenswrapper[5025]: I1007 08:37:18.880471 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 07 08:37:18 crc kubenswrapper[5025]: I1007 08:37:18.884287 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-g9bxn"] Oct 07 08:37:18 crc kubenswrapper[5025]: I1007 08:37:18.915051 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2316326-41f9-441a-8cd3-d986cc8b669d-config-data\") pod \"nova-api-0\" (UID: \"c2316326-41f9-441a-8cd3-d986cc8b669d\") " pod="openstack/nova-api-0" Oct 07 08:37:18 crc kubenswrapper[5025]: I1007 08:37:18.915088 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2316326-41f9-441a-8cd3-d986cc8b669d-logs\") pod \"nova-api-0\" (UID: \"c2316326-41f9-441a-8cd3-d986cc8b669d\") " pod="openstack/nova-api-0" Oct 07 08:37:18 crc kubenswrapper[5025]: I1007 08:37:18.915136 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2316326-41f9-441a-8cd3-d986cc8b669d-public-tls-certs\") pod \"nova-api-0\" (UID: \"c2316326-41f9-441a-8cd3-d986cc8b669d\") " pod="openstack/nova-api-0" Oct 07 08:37:18 crc kubenswrapper[5025]: I1007 08:37:18.915163 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40cd73d1-37d6-4334-9962-209f68f8e283-scripts\") pod \"nova-cell1-cell-mapping-g9bxn\" (UID: \"40cd73d1-37d6-4334-9962-209f68f8e283\") " 
pod="openstack/nova-cell1-cell-mapping-g9bxn" Oct 07 08:37:18 crc kubenswrapper[5025]: I1007 08:37:18.915181 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dq7q\" (UniqueName: \"kubernetes.io/projected/40cd73d1-37d6-4334-9962-209f68f8e283-kube-api-access-4dq7q\") pod \"nova-cell1-cell-mapping-g9bxn\" (UID: \"40cd73d1-37d6-4334-9962-209f68f8e283\") " pod="openstack/nova-cell1-cell-mapping-g9bxn" Oct 07 08:37:18 crc kubenswrapper[5025]: I1007 08:37:18.915198 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40cd73d1-37d6-4334-9962-209f68f8e283-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-g9bxn\" (UID: \"40cd73d1-37d6-4334-9962-209f68f8e283\") " pod="openstack/nova-cell1-cell-mapping-g9bxn" Oct 07 08:37:18 crc kubenswrapper[5025]: I1007 08:37:18.915214 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40cd73d1-37d6-4334-9962-209f68f8e283-config-data\") pod \"nova-cell1-cell-mapping-g9bxn\" (UID: \"40cd73d1-37d6-4334-9962-209f68f8e283\") " pod="openstack/nova-cell1-cell-mapping-g9bxn" Oct 07 08:37:18 crc kubenswrapper[5025]: I1007 08:37:18.915419 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2316326-41f9-441a-8cd3-d986cc8b669d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c2316326-41f9-441a-8cd3-d986cc8b669d\") " pod="openstack/nova-api-0" Oct 07 08:37:18 crc kubenswrapper[5025]: I1007 08:37:18.915604 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49zpv\" (UniqueName: \"kubernetes.io/projected/c2316326-41f9-441a-8cd3-d986cc8b669d-kube-api-access-49zpv\") pod \"nova-api-0\" (UID: \"c2316326-41f9-441a-8cd3-d986cc8b669d\") " 
pod="openstack/nova-api-0" Oct 07 08:37:18 crc kubenswrapper[5025]: I1007 08:37:18.915715 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2316326-41f9-441a-8cd3-d986cc8b669d-logs\") pod \"nova-api-0\" (UID: \"c2316326-41f9-441a-8cd3-d986cc8b669d\") " pod="openstack/nova-api-0" Oct 07 08:37:18 crc kubenswrapper[5025]: I1007 08:37:18.915743 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2316326-41f9-441a-8cd3-d986cc8b669d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c2316326-41f9-441a-8cd3-d986cc8b669d\") " pod="openstack/nova-api-0" Oct 07 08:37:18 crc kubenswrapper[5025]: I1007 08:37:18.919856 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2316326-41f9-441a-8cd3-d986cc8b669d-public-tls-certs\") pod \"nova-api-0\" (UID: \"c2316326-41f9-441a-8cd3-d986cc8b669d\") " pod="openstack/nova-api-0" Oct 07 08:37:18 crc kubenswrapper[5025]: I1007 08:37:18.920172 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2316326-41f9-441a-8cd3-d986cc8b669d-config-data\") pod \"nova-api-0\" (UID: \"c2316326-41f9-441a-8cd3-d986cc8b669d\") " pod="openstack/nova-api-0" Oct 07 08:37:18 crc kubenswrapper[5025]: I1007 08:37:18.920840 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2316326-41f9-441a-8cd3-d986cc8b669d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c2316326-41f9-441a-8cd3-d986cc8b669d\") " pod="openstack/nova-api-0" Oct 07 08:37:18 crc kubenswrapper[5025]: I1007 08:37:18.921085 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2316326-41f9-441a-8cd3-d986cc8b669d-combined-ca-bundle\") pod 
\"nova-api-0\" (UID: \"c2316326-41f9-441a-8cd3-d986cc8b669d\") " pod="openstack/nova-api-0" Oct 07 08:37:18 crc kubenswrapper[5025]: I1007 08:37:18.932254 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49zpv\" (UniqueName: \"kubernetes.io/projected/c2316326-41f9-441a-8cd3-d986cc8b669d-kube-api-access-49zpv\") pod \"nova-api-0\" (UID: \"c2316326-41f9-441a-8cd3-d986cc8b669d\") " pod="openstack/nova-api-0" Oct 07 08:37:19 crc kubenswrapper[5025]: I1007 08:37:19.018048 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40cd73d1-37d6-4334-9962-209f68f8e283-scripts\") pod \"nova-cell1-cell-mapping-g9bxn\" (UID: \"40cd73d1-37d6-4334-9962-209f68f8e283\") " pod="openstack/nova-cell1-cell-mapping-g9bxn" Oct 07 08:37:19 crc kubenswrapper[5025]: I1007 08:37:19.018099 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dq7q\" (UniqueName: \"kubernetes.io/projected/40cd73d1-37d6-4334-9962-209f68f8e283-kube-api-access-4dq7q\") pod \"nova-cell1-cell-mapping-g9bxn\" (UID: \"40cd73d1-37d6-4334-9962-209f68f8e283\") " pod="openstack/nova-cell1-cell-mapping-g9bxn" Oct 07 08:37:19 crc kubenswrapper[5025]: I1007 08:37:19.018135 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40cd73d1-37d6-4334-9962-209f68f8e283-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-g9bxn\" (UID: \"40cd73d1-37d6-4334-9962-209f68f8e283\") " pod="openstack/nova-cell1-cell-mapping-g9bxn" Oct 07 08:37:19 crc kubenswrapper[5025]: I1007 08:37:19.018158 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40cd73d1-37d6-4334-9962-209f68f8e283-config-data\") pod \"nova-cell1-cell-mapping-g9bxn\" (UID: \"40cd73d1-37d6-4334-9962-209f68f8e283\") " 
pod="openstack/nova-cell1-cell-mapping-g9bxn" Oct 07 08:37:19 crc kubenswrapper[5025]: I1007 08:37:19.022230 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40cd73d1-37d6-4334-9962-209f68f8e283-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-g9bxn\" (UID: \"40cd73d1-37d6-4334-9962-209f68f8e283\") " pod="openstack/nova-cell1-cell-mapping-g9bxn" Oct 07 08:37:19 crc kubenswrapper[5025]: I1007 08:37:19.022261 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40cd73d1-37d6-4334-9962-209f68f8e283-scripts\") pod \"nova-cell1-cell-mapping-g9bxn\" (UID: \"40cd73d1-37d6-4334-9962-209f68f8e283\") " pod="openstack/nova-cell1-cell-mapping-g9bxn" Oct 07 08:37:19 crc kubenswrapper[5025]: I1007 08:37:19.022238 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40cd73d1-37d6-4334-9962-209f68f8e283-config-data\") pod \"nova-cell1-cell-mapping-g9bxn\" (UID: \"40cd73d1-37d6-4334-9962-209f68f8e283\") " pod="openstack/nova-cell1-cell-mapping-g9bxn" Oct 07 08:37:19 crc kubenswrapper[5025]: I1007 08:37:19.036064 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dq7q\" (UniqueName: \"kubernetes.io/projected/40cd73d1-37d6-4334-9962-209f68f8e283-kube-api-access-4dq7q\") pod \"nova-cell1-cell-mapping-g9bxn\" (UID: \"40cd73d1-37d6-4334-9962-209f68f8e283\") " pod="openstack/nova-cell1-cell-mapping-g9bxn" Oct 07 08:37:19 crc kubenswrapper[5025]: I1007 08:37:19.082916 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 08:37:19 crc kubenswrapper[5025]: I1007 08:37:19.193991 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-g9bxn" Oct 07 08:37:19 crc kubenswrapper[5025]: I1007 08:37:19.577138 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 08:37:19 crc kubenswrapper[5025]: I1007 08:37:19.690561 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-g9bxn"] Oct 07 08:37:19 crc kubenswrapper[5025]: I1007 08:37:19.696086 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c2316326-41f9-441a-8cd3-d986cc8b669d","Type":"ContainerStarted","Data":"f3a8d431136ff5b93ab9507e79348334c5d9819749d0fb1d284f941c666b87a0"} Oct 07 08:37:19 crc kubenswrapper[5025]: I1007 08:37:19.926768 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="622c3975-f5ce-4bdd-947f-dd79b2b66e57" path="/var/lib/kubelet/pods/622c3975-f5ce-4bdd-947f-dd79b2b66e57/volumes" Oct 07 08:37:20 crc kubenswrapper[5025]: I1007 08:37:20.705738 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-g9bxn" event={"ID":"40cd73d1-37d6-4334-9962-209f68f8e283","Type":"ContainerStarted","Data":"314b1fef7c16ec7df031546fcac4dc5d738a574b43a61d84da7fecc9a9489dbb"} Oct 07 08:37:20 crc kubenswrapper[5025]: I1007 08:37:20.706104 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-g9bxn" event={"ID":"40cd73d1-37d6-4334-9962-209f68f8e283","Type":"ContainerStarted","Data":"bc391d058c8f768fd4ae95045a79d104bfc173e9c1f875463116e73f65b89c67"} Oct 07 08:37:20 crc kubenswrapper[5025]: I1007 08:37:20.708038 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a86e702-ade6-409c-9a35-0d83af92bc0f","Type":"ContainerStarted","Data":"3a1ad418c1e3c760b3ef7f630ae38bae3b3964dd45f8137b63da7d56ef57d0f0"} Oct 07 08:37:20 crc kubenswrapper[5025]: I1007 08:37:20.708230 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ceilometer-0" Oct 07 08:37:20 crc kubenswrapper[5025]: I1007 08:37:20.708223 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8a86e702-ade6-409c-9a35-0d83af92bc0f" containerName="ceilometer-central-agent" containerID="cri-o://b5fb70bfd48a47f34ee8ca76c936b3ff222f4f5b05eea41b2155572028e73d3e" gracePeriod=30 Oct 07 08:37:20 crc kubenswrapper[5025]: I1007 08:37:20.708257 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8a86e702-ade6-409c-9a35-0d83af92bc0f" containerName="proxy-httpd" containerID="cri-o://3a1ad418c1e3c760b3ef7f630ae38bae3b3964dd45f8137b63da7d56ef57d0f0" gracePeriod=30 Oct 07 08:37:20 crc kubenswrapper[5025]: I1007 08:37:20.708329 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8a86e702-ade6-409c-9a35-0d83af92bc0f" containerName="sg-core" containerID="cri-o://9277cc65770f3b0514a117fca283ee251bf8211ac117dbb110fa52c360e087ee" gracePeriod=30 Oct 07 08:37:20 crc kubenswrapper[5025]: I1007 08:37:20.708397 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8a86e702-ade6-409c-9a35-0d83af92bc0f" containerName="ceilometer-notification-agent" containerID="cri-o://873858a4d219abbe637fb5e258f1c37c153d1865b09ea7c3160bbfaf2f2b050f" gracePeriod=30 Oct 07 08:37:20 crc kubenswrapper[5025]: I1007 08:37:20.712167 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c2316326-41f9-441a-8cd3-d986cc8b669d","Type":"ContainerStarted","Data":"66fbf2b416f28317a522fd4a6301917eb8481b17c997782224ede84d8d9f45da"} Oct 07 08:37:20 crc kubenswrapper[5025]: I1007 08:37:20.712207 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"c2316326-41f9-441a-8cd3-d986cc8b669d","Type":"ContainerStarted","Data":"d3954bc93fd6db00cda848534619f62c91661928f7794ebe85d31fd07864759b"} Oct 07 08:37:20 crc kubenswrapper[5025]: I1007 08:37:20.745658 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-g9bxn" podStartSLOduration=2.745636419 podStartE2EDuration="2.745636419s" podCreationTimestamp="2025-10-07 08:37:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:37:20.723868072 +0000 UTC m=+1247.533182216" watchObservedRunningTime="2025-10-07 08:37:20.745636419 +0000 UTC m=+1247.554950563" Oct 07 08:37:20 crc kubenswrapper[5025]: I1007 08:37:20.754921 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.7549003020000002 podStartE2EDuration="2.754900302s" podCreationTimestamp="2025-10-07 08:37:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:37:20.753621771 +0000 UTC m=+1247.562935925" watchObservedRunningTime="2025-10-07 08:37:20.754900302 +0000 UTC m=+1247.564214456" Oct 07 08:37:20 crc kubenswrapper[5025]: I1007 08:37:20.781300 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.968372862 podStartE2EDuration="6.781279583s" podCreationTimestamp="2025-10-07 08:37:14 +0000 UTC" firstStartedPulling="2025-10-07 08:37:15.784793067 +0000 UTC m=+1242.594107211" lastFinishedPulling="2025-10-07 08:37:19.597699788 +0000 UTC m=+1246.407013932" observedRunningTime="2025-10-07 08:37:20.771140883 +0000 UTC m=+1247.580455057" watchObservedRunningTime="2025-10-07 08:37:20.781279583 +0000 UTC m=+1247.590593737" Oct 07 08:37:21 crc kubenswrapper[5025]: I1007 08:37:21.088802 5025 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c7b6c5df9-lmq8k" Oct 07 08:37:21 crc kubenswrapper[5025]: I1007 08:37:21.161155 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-g44d8"] Oct 07 08:37:21 crc kubenswrapper[5025]: I1007 08:37:21.161387 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-865f5d856f-g44d8" podUID="4add832c-94fb-40d1-bdd5-4a3671c38fd7" containerName="dnsmasq-dns" containerID="cri-o://d072ea46983630205128a8130a48e5efca404fd437688e8a5533d5beb56b9077" gracePeriod=10 Oct 07 08:37:21 crc kubenswrapper[5025]: I1007 08:37:21.698451 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-g44d8" Oct 07 08:37:21 crc kubenswrapper[5025]: I1007 08:37:21.726799 5025 generic.go:334] "Generic (PLEG): container finished" podID="8a86e702-ade6-409c-9a35-0d83af92bc0f" containerID="3a1ad418c1e3c760b3ef7f630ae38bae3b3964dd45f8137b63da7d56ef57d0f0" exitCode=0 Oct 07 08:37:21 crc kubenswrapper[5025]: I1007 08:37:21.726865 5025 generic.go:334] "Generic (PLEG): container finished" podID="8a86e702-ade6-409c-9a35-0d83af92bc0f" containerID="9277cc65770f3b0514a117fca283ee251bf8211ac117dbb110fa52c360e087ee" exitCode=2 Oct 07 08:37:21 crc kubenswrapper[5025]: I1007 08:37:21.726892 5025 generic.go:334] "Generic (PLEG): container finished" podID="8a86e702-ade6-409c-9a35-0d83af92bc0f" containerID="873858a4d219abbe637fb5e258f1c37c153d1865b09ea7c3160bbfaf2f2b050f" exitCode=0 Oct 07 08:37:21 crc kubenswrapper[5025]: I1007 08:37:21.726820 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a86e702-ade6-409c-9a35-0d83af92bc0f","Type":"ContainerDied","Data":"3a1ad418c1e3c760b3ef7f630ae38bae3b3964dd45f8137b63da7d56ef57d0f0"} Oct 07 08:37:21 crc kubenswrapper[5025]: I1007 08:37:21.726932 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"8a86e702-ade6-409c-9a35-0d83af92bc0f","Type":"ContainerDied","Data":"9277cc65770f3b0514a117fca283ee251bf8211ac117dbb110fa52c360e087ee"} Oct 07 08:37:21 crc kubenswrapper[5025]: I1007 08:37:21.726944 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a86e702-ade6-409c-9a35-0d83af92bc0f","Type":"ContainerDied","Data":"873858a4d219abbe637fb5e258f1c37c153d1865b09ea7c3160bbfaf2f2b050f"} Oct 07 08:37:21 crc kubenswrapper[5025]: I1007 08:37:21.730609 5025 generic.go:334] "Generic (PLEG): container finished" podID="4add832c-94fb-40d1-bdd5-4a3671c38fd7" containerID="d072ea46983630205128a8130a48e5efca404fd437688e8a5533d5beb56b9077" exitCode=0 Oct 07 08:37:21 crc kubenswrapper[5025]: I1007 08:37:21.730677 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-g44d8" Oct 07 08:37:21 crc kubenswrapper[5025]: I1007 08:37:21.730715 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-g44d8" event={"ID":"4add832c-94fb-40d1-bdd5-4a3671c38fd7","Type":"ContainerDied","Data":"d072ea46983630205128a8130a48e5efca404fd437688e8a5533d5beb56b9077"} Oct 07 08:37:21 crc kubenswrapper[5025]: I1007 08:37:21.730764 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-g44d8" event={"ID":"4add832c-94fb-40d1-bdd5-4a3671c38fd7","Type":"ContainerDied","Data":"b8cf9bc8794f945dee5aa4d0c2a0faf9a35e8720693d50cd217ce0a00f073379"} Oct 07 08:37:21 crc kubenswrapper[5025]: I1007 08:37:21.730788 5025 scope.go:117] "RemoveContainer" containerID="d072ea46983630205128a8130a48e5efca404fd437688e8a5533d5beb56b9077" Oct 07 08:37:21 crc kubenswrapper[5025]: I1007 08:37:21.758870 5025 scope.go:117] "RemoveContainer" containerID="a07b757c901b240140f8f54276da9021159a479cb82823d30c7ebd74b3455cef" Oct 07 08:37:21 crc kubenswrapper[5025]: I1007 08:37:21.774294 5025 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4add832c-94fb-40d1-bdd5-4a3671c38fd7-dns-swift-storage-0\") pod \"4add832c-94fb-40d1-bdd5-4a3671c38fd7\" (UID: \"4add832c-94fb-40d1-bdd5-4a3671c38fd7\") " Oct 07 08:37:21 crc kubenswrapper[5025]: I1007 08:37:21.774462 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4add832c-94fb-40d1-bdd5-4a3671c38fd7-config\") pod \"4add832c-94fb-40d1-bdd5-4a3671c38fd7\" (UID: \"4add832c-94fb-40d1-bdd5-4a3671c38fd7\") " Oct 07 08:37:21 crc kubenswrapper[5025]: I1007 08:37:21.774510 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4add832c-94fb-40d1-bdd5-4a3671c38fd7-dns-svc\") pod \"4add832c-94fb-40d1-bdd5-4a3671c38fd7\" (UID: \"4add832c-94fb-40d1-bdd5-4a3671c38fd7\") " Oct 07 08:37:21 crc kubenswrapper[5025]: I1007 08:37:21.774597 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4add832c-94fb-40d1-bdd5-4a3671c38fd7-ovsdbserver-sb\") pod \"4add832c-94fb-40d1-bdd5-4a3671c38fd7\" (UID: \"4add832c-94fb-40d1-bdd5-4a3671c38fd7\") " Oct 07 08:37:21 crc kubenswrapper[5025]: I1007 08:37:21.774739 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckfps\" (UniqueName: \"kubernetes.io/projected/4add832c-94fb-40d1-bdd5-4a3671c38fd7-kube-api-access-ckfps\") pod \"4add832c-94fb-40d1-bdd5-4a3671c38fd7\" (UID: \"4add832c-94fb-40d1-bdd5-4a3671c38fd7\") " Oct 07 08:37:21 crc kubenswrapper[5025]: I1007 08:37:21.774795 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4add832c-94fb-40d1-bdd5-4a3671c38fd7-ovsdbserver-nb\") pod \"4add832c-94fb-40d1-bdd5-4a3671c38fd7\" (UID: 
\"4add832c-94fb-40d1-bdd5-4a3671c38fd7\") " Oct 07 08:37:21 crc kubenswrapper[5025]: I1007 08:37:21.799463 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4add832c-94fb-40d1-bdd5-4a3671c38fd7-kube-api-access-ckfps" (OuterVolumeSpecName: "kube-api-access-ckfps") pod "4add832c-94fb-40d1-bdd5-4a3671c38fd7" (UID: "4add832c-94fb-40d1-bdd5-4a3671c38fd7"). InnerVolumeSpecName "kube-api-access-ckfps". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:37:21 crc kubenswrapper[5025]: I1007 08:37:21.815178 5025 scope.go:117] "RemoveContainer" containerID="d072ea46983630205128a8130a48e5efca404fd437688e8a5533d5beb56b9077" Oct 07 08:37:21 crc kubenswrapper[5025]: E1007 08:37:21.815601 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d072ea46983630205128a8130a48e5efca404fd437688e8a5533d5beb56b9077\": container with ID starting with d072ea46983630205128a8130a48e5efca404fd437688e8a5533d5beb56b9077 not found: ID does not exist" containerID="d072ea46983630205128a8130a48e5efca404fd437688e8a5533d5beb56b9077" Oct 07 08:37:21 crc kubenswrapper[5025]: I1007 08:37:21.815634 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d072ea46983630205128a8130a48e5efca404fd437688e8a5533d5beb56b9077"} err="failed to get container status \"d072ea46983630205128a8130a48e5efca404fd437688e8a5533d5beb56b9077\": rpc error: code = NotFound desc = could not find container \"d072ea46983630205128a8130a48e5efca404fd437688e8a5533d5beb56b9077\": container with ID starting with d072ea46983630205128a8130a48e5efca404fd437688e8a5533d5beb56b9077 not found: ID does not exist" Oct 07 08:37:21 crc kubenswrapper[5025]: I1007 08:37:21.815654 5025 scope.go:117] "RemoveContainer" containerID="a07b757c901b240140f8f54276da9021159a479cb82823d30c7ebd74b3455cef" Oct 07 08:37:21 crc kubenswrapper[5025]: E1007 08:37:21.816047 5025 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a07b757c901b240140f8f54276da9021159a479cb82823d30c7ebd74b3455cef\": container with ID starting with a07b757c901b240140f8f54276da9021159a479cb82823d30c7ebd74b3455cef not found: ID does not exist" containerID="a07b757c901b240140f8f54276da9021159a479cb82823d30c7ebd74b3455cef" Oct 07 08:37:21 crc kubenswrapper[5025]: I1007 08:37:21.816117 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a07b757c901b240140f8f54276da9021159a479cb82823d30c7ebd74b3455cef"} err="failed to get container status \"a07b757c901b240140f8f54276da9021159a479cb82823d30c7ebd74b3455cef\": rpc error: code = NotFound desc = could not find container \"a07b757c901b240140f8f54276da9021159a479cb82823d30c7ebd74b3455cef\": container with ID starting with a07b757c901b240140f8f54276da9021159a479cb82823d30c7ebd74b3455cef not found: ID does not exist" Oct 07 08:37:21 crc kubenswrapper[5025]: I1007 08:37:21.830970 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4add832c-94fb-40d1-bdd5-4a3671c38fd7-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4add832c-94fb-40d1-bdd5-4a3671c38fd7" (UID: "4add832c-94fb-40d1-bdd5-4a3671c38fd7"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:37:21 crc kubenswrapper[5025]: I1007 08:37:21.833956 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4add832c-94fb-40d1-bdd5-4a3671c38fd7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4add832c-94fb-40d1-bdd5-4a3671c38fd7" (UID: "4add832c-94fb-40d1-bdd5-4a3671c38fd7"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:37:21 crc kubenswrapper[5025]: I1007 08:37:21.839924 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4add832c-94fb-40d1-bdd5-4a3671c38fd7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4add832c-94fb-40d1-bdd5-4a3671c38fd7" (UID: "4add832c-94fb-40d1-bdd5-4a3671c38fd7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:37:21 crc kubenswrapper[5025]: I1007 08:37:21.841266 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4add832c-94fb-40d1-bdd5-4a3671c38fd7-config" (OuterVolumeSpecName: "config") pod "4add832c-94fb-40d1-bdd5-4a3671c38fd7" (UID: "4add832c-94fb-40d1-bdd5-4a3671c38fd7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:37:21 crc kubenswrapper[5025]: I1007 08:37:21.864162 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4add832c-94fb-40d1-bdd5-4a3671c38fd7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4add832c-94fb-40d1-bdd5-4a3671c38fd7" (UID: "4add832c-94fb-40d1-bdd5-4a3671c38fd7"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:37:21 crc kubenswrapper[5025]: I1007 08:37:21.877362 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckfps\" (UniqueName: \"kubernetes.io/projected/4add832c-94fb-40d1-bdd5-4a3671c38fd7-kube-api-access-ckfps\") on node \"crc\" DevicePath \"\"" Oct 07 08:37:21 crc kubenswrapper[5025]: I1007 08:37:21.877402 5025 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4add832c-94fb-40d1-bdd5-4a3671c38fd7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 08:37:21 crc kubenswrapper[5025]: I1007 08:37:21.877414 5025 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4add832c-94fb-40d1-bdd5-4a3671c38fd7-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 08:37:21 crc kubenswrapper[5025]: I1007 08:37:21.877427 5025 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4add832c-94fb-40d1-bdd5-4a3671c38fd7-config\") on node \"crc\" DevicePath \"\"" Oct 07 08:37:21 crc kubenswrapper[5025]: I1007 08:37:21.877440 5025 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4add832c-94fb-40d1-bdd5-4a3671c38fd7-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 08:37:21 crc kubenswrapper[5025]: I1007 08:37:21.877451 5025 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4add832c-94fb-40d1-bdd5-4a3671c38fd7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.098226 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-g44d8"] Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.105885 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-g44d8"] Oct 07 
08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.479197 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.593759 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v24qn\" (UniqueName: \"kubernetes.io/projected/8a86e702-ade6-409c-9a35-0d83af92bc0f-kube-api-access-v24qn\") pod \"8a86e702-ade6-409c-9a35-0d83af92bc0f\" (UID: \"8a86e702-ade6-409c-9a35-0d83af92bc0f\") " Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.593819 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a86e702-ade6-409c-9a35-0d83af92bc0f-config-data\") pod \"8a86e702-ade6-409c-9a35-0d83af92bc0f\" (UID: \"8a86e702-ade6-409c-9a35-0d83af92bc0f\") " Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.593948 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a86e702-ade6-409c-9a35-0d83af92bc0f-combined-ca-bundle\") pod \"8a86e702-ade6-409c-9a35-0d83af92bc0f\" (UID: \"8a86e702-ade6-409c-9a35-0d83af92bc0f\") " Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.593984 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a86e702-ade6-409c-9a35-0d83af92bc0f-scripts\") pod \"8a86e702-ade6-409c-9a35-0d83af92bc0f\" (UID: \"8a86e702-ade6-409c-9a35-0d83af92bc0f\") " Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.594010 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a86e702-ade6-409c-9a35-0d83af92bc0f-sg-core-conf-yaml\") pod \"8a86e702-ade6-409c-9a35-0d83af92bc0f\" (UID: \"8a86e702-ade6-409c-9a35-0d83af92bc0f\") " Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.594039 
5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a86e702-ade6-409c-9a35-0d83af92bc0f-ceilometer-tls-certs\") pod \"8a86e702-ade6-409c-9a35-0d83af92bc0f\" (UID: \"8a86e702-ade6-409c-9a35-0d83af92bc0f\") " Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.594068 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a86e702-ade6-409c-9a35-0d83af92bc0f-log-httpd\") pod \"8a86e702-ade6-409c-9a35-0d83af92bc0f\" (UID: \"8a86e702-ade6-409c-9a35-0d83af92bc0f\") " Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.594201 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a86e702-ade6-409c-9a35-0d83af92bc0f-run-httpd\") pod \"8a86e702-ade6-409c-9a35-0d83af92bc0f\" (UID: \"8a86e702-ade6-409c-9a35-0d83af92bc0f\") " Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.594821 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a86e702-ade6-409c-9a35-0d83af92bc0f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8a86e702-ade6-409c-9a35-0d83af92bc0f" (UID: "8a86e702-ade6-409c-9a35-0d83af92bc0f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.595004 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a86e702-ade6-409c-9a35-0d83af92bc0f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8a86e702-ade6-409c-9a35-0d83af92bc0f" (UID: "8a86e702-ade6-409c-9a35-0d83af92bc0f"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.599252 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a86e702-ade6-409c-9a35-0d83af92bc0f-kube-api-access-v24qn" (OuterVolumeSpecName: "kube-api-access-v24qn") pod "8a86e702-ade6-409c-9a35-0d83af92bc0f" (UID: "8a86e702-ade6-409c-9a35-0d83af92bc0f"). InnerVolumeSpecName "kube-api-access-v24qn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.601989 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a86e702-ade6-409c-9a35-0d83af92bc0f-scripts" (OuterVolumeSpecName: "scripts") pod "8a86e702-ade6-409c-9a35-0d83af92bc0f" (UID: "8a86e702-ade6-409c-9a35-0d83af92bc0f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.626219 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a86e702-ade6-409c-9a35-0d83af92bc0f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8a86e702-ade6-409c-9a35-0d83af92bc0f" (UID: "8a86e702-ade6-409c-9a35-0d83af92bc0f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.653778 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a86e702-ade6-409c-9a35-0d83af92bc0f-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "8a86e702-ade6-409c-9a35-0d83af92bc0f" (UID: "8a86e702-ade6-409c-9a35-0d83af92bc0f"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.680506 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a86e702-ade6-409c-9a35-0d83af92bc0f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a86e702-ade6-409c-9a35-0d83af92bc0f" (UID: "8a86e702-ade6-409c-9a35-0d83af92bc0f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.698459 5025 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a86e702-ade6-409c-9a35-0d83af92bc0f-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.698499 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v24qn\" (UniqueName: \"kubernetes.io/projected/8a86e702-ade6-409c-9a35-0d83af92bc0f-kube-api-access-v24qn\") on node \"crc\" DevicePath \"\"" Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.698512 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a86e702-ade6-409c-9a35-0d83af92bc0f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.698526 5025 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a86e702-ade6-409c-9a35-0d83af92bc0f-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.698537 5025 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a86e702-ade6-409c-9a35-0d83af92bc0f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.698551 5025 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8a86e702-ade6-409c-9a35-0d83af92bc0f-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.698576 5025 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a86e702-ade6-409c-9a35-0d83af92bc0f-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.701749 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a86e702-ade6-409c-9a35-0d83af92bc0f-config-data" (OuterVolumeSpecName: "config-data") pod "8a86e702-ade6-409c-9a35-0d83af92bc0f" (UID: "8a86e702-ade6-409c-9a35-0d83af92bc0f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.760596 5025 generic.go:334] "Generic (PLEG): container finished" podID="8a86e702-ade6-409c-9a35-0d83af92bc0f" containerID="b5fb70bfd48a47f34ee8ca76c936b3ff222f4f5b05eea41b2155572028e73d3e" exitCode=0 Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.760651 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a86e702-ade6-409c-9a35-0d83af92bc0f","Type":"ContainerDied","Data":"b5fb70bfd48a47f34ee8ca76c936b3ff222f4f5b05eea41b2155572028e73d3e"} Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.760677 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a86e702-ade6-409c-9a35-0d83af92bc0f","Type":"ContainerDied","Data":"bc32ec9ce1e8fb57b0706a8faf16dfec761c09bd28c17830a9a2bb9306f34b31"} Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.760695 5025 scope.go:117] "RemoveContainer" containerID="3a1ad418c1e3c760b3ef7f630ae38bae3b3964dd45f8137b63da7d56ef57d0f0" Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.760806 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.795743 5025 scope.go:117] "RemoveContainer" containerID="9277cc65770f3b0514a117fca283ee251bf8211ac117dbb110fa52c360e087ee" Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.800639 5025 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a86e702-ade6-409c-9a35-0d83af92bc0f-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.878812 5025 scope.go:117] "RemoveContainer" containerID="873858a4d219abbe637fb5e258f1c37c153d1865b09ea7c3160bbfaf2f2b050f" Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.888662 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.925486 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.939972 5025 scope.go:117] "RemoveContainer" containerID="b5fb70bfd48a47f34ee8ca76c936b3ff222f4f5b05eea41b2155572028e73d3e" Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.945403 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 07 08:37:22 crc kubenswrapper[5025]: E1007 08:37:22.946214 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a86e702-ade6-409c-9a35-0d83af92bc0f" containerName="ceilometer-central-agent" Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.946240 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a86e702-ade6-409c-9a35-0d83af92bc0f" containerName="ceilometer-central-agent" Oct 07 08:37:22 crc kubenswrapper[5025]: E1007 08:37:22.946260 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a86e702-ade6-409c-9a35-0d83af92bc0f" containerName="ceilometer-notification-agent" Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.946268 5025 
state_mem.go:107] "Deleted CPUSet assignment" podUID="8a86e702-ade6-409c-9a35-0d83af92bc0f" containerName="ceilometer-notification-agent" Oct 07 08:37:22 crc kubenswrapper[5025]: E1007 08:37:22.946309 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4add832c-94fb-40d1-bdd5-4a3671c38fd7" containerName="dnsmasq-dns" Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.946318 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="4add832c-94fb-40d1-bdd5-4a3671c38fd7" containerName="dnsmasq-dns" Oct 07 08:37:22 crc kubenswrapper[5025]: E1007 08:37:22.946335 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a86e702-ade6-409c-9a35-0d83af92bc0f" containerName="proxy-httpd" Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.946342 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a86e702-ade6-409c-9a35-0d83af92bc0f" containerName="proxy-httpd" Oct 07 08:37:22 crc kubenswrapper[5025]: E1007 08:37:22.946357 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4add832c-94fb-40d1-bdd5-4a3671c38fd7" containerName="init" Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.946365 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="4add832c-94fb-40d1-bdd5-4a3671c38fd7" containerName="init" Oct 07 08:37:22 crc kubenswrapper[5025]: E1007 08:37:22.946388 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a86e702-ade6-409c-9a35-0d83af92bc0f" containerName="sg-core" Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.946397 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a86e702-ade6-409c-9a35-0d83af92bc0f" containerName="sg-core" Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.946639 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a86e702-ade6-409c-9a35-0d83af92bc0f" containerName="sg-core" Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.946668 5025 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="8a86e702-ade6-409c-9a35-0d83af92bc0f" containerName="proxy-httpd" Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.946685 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a86e702-ade6-409c-9a35-0d83af92bc0f" containerName="ceilometer-notification-agent" Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.946699 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a86e702-ade6-409c-9a35-0d83af92bc0f" containerName="ceilometer-central-agent" Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.946720 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="4add832c-94fb-40d1-bdd5-4a3671c38fd7" containerName="dnsmasq-dns" Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.948930 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.953497 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.954112 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.955326 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.972888 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.982422 5025 scope.go:117] "RemoveContainer" containerID="3a1ad418c1e3c760b3ef7f630ae38bae3b3964dd45f8137b63da7d56ef57d0f0" Oct 07 08:37:22 crc kubenswrapper[5025]: E1007 08:37:22.985833 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a1ad418c1e3c760b3ef7f630ae38bae3b3964dd45f8137b63da7d56ef57d0f0\": container with ID starting with 
3a1ad418c1e3c760b3ef7f630ae38bae3b3964dd45f8137b63da7d56ef57d0f0 not found: ID does not exist" containerID="3a1ad418c1e3c760b3ef7f630ae38bae3b3964dd45f8137b63da7d56ef57d0f0" Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.985864 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a1ad418c1e3c760b3ef7f630ae38bae3b3964dd45f8137b63da7d56ef57d0f0"} err="failed to get container status \"3a1ad418c1e3c760b3ef7f630ae38bae3b3964dd45f8137b63da7d56ef57d0f0\": rpc error: code = NotFound desc = could not find container \"3a1ad418c1e3c760b3ef7f630ae38bae3b3964dd45f8137b63da7d56ef57d0f0\": container with ID starting with 3a1ad418c1e3c760b3ef7f630ae38bae3b3964dd45f8137b63da7d56ef57d0f0 not found: ID does not exist" Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.985885 5025 scope.go:117] "RemoveContainer" containerID="9277cc65770f3b0514a117fca283ee251bf8211ac117dbb110fa52c360e087ee" Oct 07 08:37:22 crc kubenswrapper[5025]: E1007 08:37:22.986190 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9277cc65770f3b0514a117fca283ee251bf8211ac117dbb110fa52c360e087ee\": container with ID starting with 9277cc65770f3b0514a117fca283ee251bf8211ac117dbb110fa52c360e087ee not found: ID does not exist" containerID="9277cc65770f3b0514a117fca283ee251bf8211ac117dbb110fa52c360e087ee" Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.986244 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9277cc65770f3b0514a117fca283ee251bf8211ac117dbb110fa52c360e087ee"} err="failed to get container status \"9277cc65770f3b0514a117fca283ee251bf8211ac117dbb110fa52c360e087ee\": rpc error: code = NotFound desc = could not find container \"9277cc65770f3b0514a117fca283ee251bf8211ac117dbb110fa52c360e087ee\": container with ID starting with 9277cc65770f3b0514a117fca283ee251bf8211ac117dbb110fa52c360e087ee not found: ID does not 
exist" Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.986260 5025 scope.go:117] "RemoveContainer" containerID="873858a4d219abbe637fb5e258f1c37c153d1865b09ea7c3160bbfaf2f2b050f" Oct 07 08:37:22 crc kubenswrapper[5025]: E1007 08:37:22.986498 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"873858a4d219abbe637fb5e258f1c37c153d1865b09ea7c3160bbfaf2f2b050f\": container with ID starting with 873858a4d219abbe637fb5e258f1c37c153d1865b09ea7c3160bbfaf2f2b050f not found: ID does not exist" containerID="873858a4d219abbe637fb5e258f1c37c153d1865b09ea7c3160bbfaf2f2b050f" Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.986602 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"873858a4d219abbe637fb5e258f1c37c153d1865b09ea7c3160bbfaf2f2b050f"} err="failed to get container status \"873858a4d219abbe637fb5e258f1c37c153d1865b09ea7c3160bbfaf2f2b050f\": rpc error: code = NotFound desc = could not find container \"873858a4d219abbe637fb5e258f1c37c153d1865b09ea7c3160bbfaf2f2b050f\": container with ID starting with 873858a4d219abbe637fb5e258f1c37c153d1865b09ea7c3160bbfaf2f2b050f not found: ID does not exist" Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.986741 5025 scope.go:117] "RemoveContainer" containerID="b5fb70bfd48a47f34ee8ca76c936b3ff222f4f5b05eea41b2155572028e73d3e" Oct 07 08:37:22 crc kubenswrapper[5025]: E1007 08:37:22.987063 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5fb70bfd48a47f34ee8ca76c936b3ff222f4f5b05eea41b2155572028e73d3e\": container with ID starting with b5fb70bfd48a47f34ee8ca76c936b3ff222f4f5b05eea41b2155572028e73d3e not found: ID does not exist" containerID="b5fb70bfd48a47f34ee8ca76c936b3ff222f4f5b05eea41b2155572028e73d3e" Oct 07 08:37:22 crc kubenswrapper[5025]: I1007 08:37:22.987127 5025 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5fb70bfd48a47f34ee8ca76c936b3ff222f4f5b05eea41b2155572028e73d3e"} err="failed to get container status \"b5fb70bfd48a47f34ee8ca76c936b3ff222f4f5b05eea41b2155572028e73d3e\": rpc error: code = NotFound desc = could not find container \"b5fb70bfd48a47f34ee8ca76c936b3ff222f4f5b05eea41b2155572028e73d3e\": container with ID starting with b5fb70bfd48a47f34ee8ca76c936b3ff222f4f5b05eea41b2155572028e73d3e not found: ID does not exist" Oct 07 08:37:23 crc kubenswrapper[5025]: I1007 08:37:23.026642 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba0f744b-7d32-4337-be25-f4f6326aaec2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ba0f744b-7d32-4337-be25-f4f6326aaec2\") " pod="openstack/ceilometer-0" Oct 07 08:37:23 crc kubenswrapper[5025]: I1007 08:37:23.026777 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba0f744b-7d32-4337-be25-f4f6326aaec2-scripts\") pod \"ceilometer-0\" (UID: \"ba0f744b-7d32-4337-be25-f4f6326aaec2\") " pod="openstack/ceilometer-0" Oct 07 08:37:23 crc kubenswrapper[5025]: I1007 08:37:23.027669 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba0f744b-7d32-4337-be25-f4f6326aaec2-run-httpd\") pod \"ceilometer-0\" (UID: \"ba0f744b-7d32-4337-be25-f4f6326aaec2\") " pod="openstack/ceilometer-0" Oct 07 08:37:23 crc kubenswrapper[5025]: I1007 08:37:23.028256 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bd7br\" (UniqueName: \"kubernetes.io/projected/ba0f744b-7d32-4337-be25-f4f6326aaec2-kube-api-access-bd7br\") pod \"ceilometer-0\" (UID: \"ba0f744b-7d32-4337-be25-f4f6326aaec2\") " pod="openstack/ceilometer-0" Oct 
07 08:37:23 crc kubenswrapper[5025]: I1007 08:37:23.028623 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ba0f744b-7d32-4337-be25-f4f6326aaec2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ba0f744b-7d32-4337-be25-f4f6326aaec2\") " pod="openstack/ceilometer-0" Oct 07 08:37:23 crc kubenswrapper[5025]: I1007 08:37:23.028743 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba0f744b-7d32-4337-be25-f4f6326aaec2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ba0f744b-7d32-4337-be25-f4f6326aaec2\") " pod="openstack/ceilometer-0" Oct 07 08:37:23 crc kubenswrapper[5025]: I1007 08:37:23.028806 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba0f744b-7d32-4337-be25-f4f6326aaec2-config-data\") pod \"ceilometer-0\" (UID: \"ba0f744b-7d32-4337-be25-f4f6326aaec2\") " pod="openstack/ceilometer-0" Oct 07 08:37:23 crc kubenswrapper[5025]: I1007 08:37:23.028948 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba0f744b-7d32-4337-be25-f4f6326aaec2-log-httpd\") pod \"ceilometer-0\" (UID: \"ba0f744b-7d32-4337-be25-f4f6326aaec2\") " pod="openstack/ceilometer-0" Oct 07 08:37:23 crc kubenswrapper[5025]: I1007 08:37:23.130994 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba0f744b-7d32-4337-be25-f4f6326aaec2-log-httpd\") pod \"ceilometer-0\" (UID: \"ba0f744b-7d32-4337-be25-f4f6326aaec2\") " pod="openstack/ceilometer-0" Oct 07 08:37:23 crc kubenswrapper[5025]: I1007 08:37:23.131364 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba0f744b-7d32-4337-be25-f4f6326aaec2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ba0f744b-7d32-4337-be25-f4f6326aaec2\") " pod="openstack/ceilometer-0" Oct 07 08:37:23 crc kubenswrapper[5025]: I1007 08:37:23.131423 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba0f744b-7d32-4337-be25-f4f6326aaec2-scripts\") pod \"ceilometer-0\" (UID: \"ba0f744b-7d32-4337-be25-f4f6326aaec2\") " pod="openstack/ceilometer-0" Oct 07 08:37:23 crc kubenswrapper[5025]: I1007 08:37:23.131437 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba0f744b-7d32-4337-be25-f4f6326aaec2-run-httpd\") pod \"ceilometer-0\" (UID: \"ba0f744b-7d32-4337-be25-f4f6326aaec2\") " pod="openstack/ceilometer-0" Oct 07 08:37:23 crc kubenswrapper[5025]: I1007 08:37:23.131460 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba0f744b-7d32-4337-be25-f4f6326aaec2-log-httpd\") pod \"ceilometer-0\" (UID: \"ba0f744b-7d32-4337-be25-f4f6326aaec2\") " pod="openstack/ceilometer-0" Oct 07 08:37:23 crc kubenswrapper[5025]: I1007 08:37:23.131481 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bd7br\" (UniqueName: \"kubernetes.io/projected/ba0f744b-7d32-4337-be25-f4f6326aaec2-kube-api-access-bd7br\") pod \"ceilometer-0\" (UID: \"ba0f744b-7d32-4337-be25-f4f6326aaec2\") " pod="openstack/ceilometer-0" Oct 07 08:37:23 crc kubenswrapper[5025]: I1007 08:37:23.131597 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ba0f744b-7d32-4337-be25-f4f6326aaec2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ba0f744b-7d32-4337-be25-f4f6326aaec2\") " pod="openstack/ceilometer-0" Oct 07 08:37:23 crc 
kubenswrapper[5025]: I1007 08:37:23.131640 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba0f744b-7d32-4337-be25-f4f6326aaec2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ba0f744b-7d32-4337-be25-f4f6326aaec2\") " pod="openstack/ceilometer-0" Oct 07 08:37:23 crc kubenswrapper[5025]: I1007 08:37:23.131678 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba0f744b-7d32-4337-be25-f4f6326aaec2-config-data\") pod \"ceilometer-0\" (UID: \"ba0f744b-7d32-4337-be25-f4f6326aaec2\") " pod="openstack/ceilometer-0" Oct 07 08:37:23 crc kubenswrapper[5025]: I1007 08:37:23.131932 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba0f744b-7d32-4337-be25-f4f6326aaec2-run-httpd\") pod \"ceilometer-0\" (UID: \"ba0f744b-7d32-4337-be25-f4f6326aaec2\") " pod="openstack/ceilometer-0" Oct 07 08:37:23 crc kubenswrapper[5025]: I1007 08:37:23.135123 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba0f744b-7d32-4337-be25-f4f6326aaec2-scripts\") pod \"ceilometer-0\" (UID: \"ba0f744b-7d32-4337-be25-f4f6326aaec2\") " pod="openstack/ceilometer-0" Oct 07 08:37:23 crc kubenswrapper[5025]: I1007 08:37:23.135287 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba0f744b-7d32-4337-be25-f4f6326aaec2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ba0f744b-7d32-4337-be25-f4f6326aaec2\") " pod="openstack/ceilometer-0" Oct 07 08:37:23 crc kubenswrapper[5025]: I1007 08:37:23.135474 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ba0f744b-7d32-4337-be25-f4f6326aaec2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"ba0f744b-7d32-4337-be25-f4f6326aaec2\") " pod="openstack/ceilometer-0" Oct 07 08:37:23 crc kubenswrapper[5025]: I1007 08:37:23.136228 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba0f744b-7d32-4337-be25-f4f6326aaec2-config-data\") pod \"ceilometer-0\" (UID: \"ba0f744b-7d32-4337-be25-f4f6326aaec2\") " pod="openstack/ceilometer-0" Oct 07 08:37:23 crc kubenswrapper[5025]: I1007 08:37:23.142191 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba0f744b-7d32-4337-be25-f4f6326aaec2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ba0f744b-7d32-4337-be25-f4f6326aaec2\") " pod="openstack/ceilometer-0" Oct 07 08:37:23 crc kubenswrapper[5025]: I1007 08:37:23.147122 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bd7br\" (UniqueName: \"kubernetes.io/projected/ba0f744b-7d32-4337-be25-f4f6326aaec2-kube-api-access-bd7br\") pod \"ceilometer-0\" (UID: \"ba0f744b-7d32-4337-be25-f4f6326aaec2\") " pod="openstack/ceilometer-0" Oct 07 08:37:23 crc kubenswrapper[5025]: I1007 08:37:23.272411 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 08:37:23 crc kubenswrapper[5025]: I1007 08:37:23.708246 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 08:37:23 crc kubenswrapper[5025]: W1007 08:37:23.715401 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba0f744b_7d32_4337_be25_f4f6326aaec2.slice/crio-69c77d646d60cef40121eee839e1d3d2a631574a1f61722adde93696addfd418 WatchSource:0}: Error finding container 69c77d646d60cef40121eee839e1d3d2a631574a1f61722adde93696addfd418: Status 404 returned error can't find the container with id 69c77d646d60cef40121eee839e1d3d2a631574a1f61722adde93696addfd418 Oct 07 08:37:23 crc kubenswrapper[5025]: I1007 08:37:23.776826 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba0f744b-7d32-4337-be25-f4f6326aaec2","Type":"ContainerStarted","Data":"69c77d646d60cef40121eee839e1d3d2a631574a1f61722adde93696addfd418"} Oct 07 08:37:23 crc kubenswrapper[5025]: I1007 08:37:23.929197 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4add832c-94fb-40d1-bdd5-4a3671c38fd7" path="/var/lib/kubelet/pods/4add832c-94fb-40d1-bdd5-4a3671c38fd7/volumes" Oct 07 08:37:23 crc kubenswrapper[5025]: I1007 08:37:23.930045 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a86e702-ade6-409c-9a35-0d83af92bc0f" path="/var/lib/kubelet/pods/8a86e702-ade6-409c-9a35-0d83af92bc0f/volumes" Oct 07 08:37:24 crc kubenswrapper[5025]: I1007 08:37:24.787718 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba0f744b-7d32-4337-be25-f4f6326aaec2","Type":"ContainerStarted","Data":"0d4721c409ed563195fffda2f9874618fd3f3cc7746c05dc5decaf962347f14b"} Oct 07 08:37:24 crc kubenswrapper[5025]: I1007 08:37:24.789601 5025 generic.go:334] "Generic (PLEG): container finished" 
podID="40cd73d1-37d6-4334-9962-209f68f8e283" containerID="314b1fef7c16ec7df031546fcac4dc5d738a574b43a61d84da7fecc9a9489dbb" exitCode=0 Oct 07 08:37:24 crc kubenswrapper[5025]: I1007 08:37:24.789647 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-g9bxn" event={"ID":"40cd73d1-37d6-4334-9962-209f68f8e283","Type":"ContainerDied","Data":"314b1fef7c16ec7df031546fcac4dc5d738a574b43a61d84da7fecc9a9489dbb"} Oct 07 08:37:25 crc kubenswrapper[5025]: I1007 08:37:25.817711 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba0f744b-7d32-4337-be25-f4f6326aaec2","Type":"ContainerStarted","Data":"a4937f814a2ec01bb17a146ab80071941530367ba22e6ebb3a6adb3bf515dd0a"} Oct 07 08:37:26 crc kubenswrapper[5025]: I1007 08:37:25.934980 5025 patch_prober.go:28] interesting pod/machine-config-daemon-2dj2t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 08:37:26 crc kubenswrapper[5025]: I1007 08:37:25.935035 5025 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 08:37:26 crc kubenswrapper[5025]: I1007 08:37:25.935077 5025 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" Oct 07 08:37:26 crc kubenswrapper[5025]: I1007 08:37:25.935812 5025 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1b24ca81daef9a1bfdd6aba62c02272733ea9a1c7e3d4e5bdd90b9e74defc3eb"} 
pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 08:37:26 crc kubenswrapper[5025]: I1007 08:37:25.935863 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" containerID="cri-o://1b24ca81daef9a1bfdd6aba62c02272733ea9a1c7e3d4e5bdd90b9e74defc3eb" gracePeriod=600 Oct 07 08:37:26 crc kubenswrapper[5025]: I1007 08:37:26.110708 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-g9bxn" Oct 07 08:37:26 crc kubenswrapper[5025]: I1007 08:37:26.187857 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40cd73d1-37d6-4334-9962-209f68f8e283-combined-ca-bundle\") pod \"40cd73d1-37d6-4334-9962-209f68f8e283\" (UID: \"40cd73d1-37d6-4334-9962-209f68f8e283\") " Oct 07 08:37:26 crc kubenswrapper[5025]: I1007 08:37:26.187901 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40cd73d1-37d6-4334-9962-209f68f8e283-scripts\") pod \"40cd73d1-37d6-4334-9962-209f68f8e283\" (UID: \"40cd73d1-37d6-4334-9962-209f68f8e283\") " Oct 07 08:37:26 crc kubenswrapper[5025]: I1007 08:37:26.188021 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40cd73d1-37d6-4334-9962-209f68f8e283-config-data\") pod \"40cd73d1-37d6-4334-9962-209f68f8e283\" (UID: \"40cd73d1-37d6-4334-9962-209f68f8e283\") " Oct 07 08:37:26 crc kubenswrapper[5025]: I1007 08:37:26.188083 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dq7q\" (UniqueName: 
\"kubernetes.io/projected/40cd73d1-37d6-4334-9962-209f68f8e283-kube-api-access-4dq7q\") pod \"40cd73d1-37d6-4334-9962-209f68f8e283\" (UID: \"40cd73d1-37d6-4334-9962-209f68f8e283\") " Oct 07 08:37:26 crc kubenswrapper[5025]: I1007 08:37:26.194328 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40cd73d1-37d6-4334-9962-209f68f8e283-scripts" (OuterVolumeSpecName: "scripts") pod "40cd73d1-37d6-4334-9962-209f68f8e283" (UID: "40cd73d1-37d6-4334-9962-209f68f8e283"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:37:26 crc kubenswrapper[5025]: I1007 08:37:26.195677 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40cd73d1-37d6-4334-9962-209f68f8e283-kube-api-access-4dq7q" (OuterVolumeSpecName: "kube-api-access-4dq7q") pod "40cd73d1-37d6-4334-9962-209f68f8e283" (UID: "40cd73d1-37d6-4334-9962-209f68f8e283"). InnerVolumeSpecName "kube-api-access-4dq7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:37:26 crc kubenswrapper[5025]: I1007 08:37:26.234996 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40cd73d1-37d6-4334-9962-209f68f8e283-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "40cd73d1-37d6-4334-9962-209f68f8e283" (UID: "40cd73d1-37d6-4334-9962-209f68f8e283"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:37:26 crc kubenswrapper[5025]: I1007 08:37:26.252921 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40cd73d1-37d6-4334-9962-209f68f8e283-config-data" (OuterVolumeSpecName: "config-data") pod "40cd73d1-37d6-4334-9962-209f68f8e283" (UID: "40cd73d1-37d6-4334-9962-209f68f8e283"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:37:26 crc kubenswrapper[5025]: I1007 08:37:26.290157 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40cd73d1-37d6-4334-9962-209f68f8e283-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:37:26 crc kubenswrapper[5025]: I1007 08:37:26.290182 5025 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40cd73d1-37d6-4334-9962-209f68f8e283-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 08:37:26 crc kubenswrapper[5025]: I1007 08:37:26.290192 5025 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40cd73d1-37d6-4334-9962-209f68f8e283-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 08:37:26 crc kubenswrapper[5025]: I1007 08:37:26.290201 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dq7q\" (UniqueName: \"kubernetes.io/projected/40cd73d1-37d6-4334-9962-209f68f8e283-kube-api-access-4dq7q\") on node \"crc\" DevicePath \"\"" Oct 07 08:37:26 crc kubenswrapper[5025]: I1007 08:37:26.840952 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba0f744b-7d32-4337-be25-f4f6326aaec2","Type":"ContainerStarted","Data":"7b52de27036b56d8d85f64dd500bd9547ec47d9718e92cc57aa2c682ce41fe24"} Oct 07 08:37:26 crc kubenswrapper[5025]: I1007 08:37:26.846531 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-g9bxn" Oct 07 08:37:26 crc kubenswrapper[5025]: I1007 08:37:26.846779 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-g9bxn" event={"ID":"40cd73d1-37d6-4334-9962-209f68f8e283","Type":"ContainerDied","Data":"bc391d058c8f768fd4ae95045a79d104bfc173e9c1f875463116e73f65b89c67"} Oct 07 08:37:26 crc kubenswrapper[5025]: I1007 08:37:26.846910 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc391d058c8f768fd4ae95045a79d104bfc173e9c1f875463116e73f65b89c67" Oct 07 08:37:26 crc kubenswrapper[5025]: I1007 08:37:26.850606 5025 generic.go:334] "Generic (PLEG): container finished" podID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerID="1b24ca81daef9a1bfdd6aba62c02272733ea9a1c7e3d4e5bdd90b9e74defc3eb" exitCode=0 Oct 07 08:37:26 crc kubenswrapper[5025]: I1007 08:37:26.850660 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" event={"ID":"b4849c41-22e1-400e-8e11-096da49ef1b2","Type":"ContainerDied","Data":"1b24ca81daef9a1bfdd6aba62c02272733ea9a1c7e3d4e5bdd90b9e74defc3eb"} Oct 07 08:37:26 crc kubenswrapper[5025]: I1007 08:37:26.850833 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" event={"ID":"b4849c41-22e1-400e-8e11-096da49ef1b2","Type":"ContainerStarted","Data":"13140e5c73fc6f443f1a7e1d646b5e6ed1f83f36c3c46081a22a0e60d0f1c23e"} Oct 07 08:37:26 crc kubenswrapper[5025]: I1007 08:37:26.850904 5025 scope.go:117] "RemoveContainer" containerID="c91f90218018dc6579e30a1d0a59af84fe0152b7bc9f23fa2787bc6ffa336d31" Oct 07 08:37:26 crc kubenswrapper[5025]: I1007 08:37:26.996526 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 07 08:37:26 crc kubenswrapper[5025]: I1007 08:37:26.996869 5025 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-api-0" podUID="c2316326-41f9-441a-8cd3-d986cc8b669d" containerName="nova-api-log" containerID="cri-o://d3954bc93fd6db00cda848534619f62c91661928f7794ebe85d31fd07864759b" gracePeriod=30 Oct 07 08:37:26 crc kubenswrapper[5025]: I1007 08:37:26.997015 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c2316326-41f9-441a-8cd3-d986cc8b669d" containerName="nova-api-api" containerID="cri-o://66fbf2b416f28317a522fd4a6301917eb8481b17c997782224ede84d8d9f45da" gracePeriod=30 Oct 07 08:37:27 crc kubenswrapper[5025]: I1007 08:37:27.012742 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 08:37:27 crc kubenswrapper[5025]: I1007 08:37:27.013279 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="039af421-8cdd-4a32-bccc-f58f90b64fad" containerName="nova-scheduler-scheduler" containerID="cri-o://a338657f8f197d0f0f88660e3685ef3ce6e0e74f42231157db1d2b33a939958a" gracePeriod=30 Oct 07 08:37:27 crc kubenswrapper[5025]: I1007 08:37:27.034511 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 08:37:27 crc kubenswrapper[5025]: I1007 08:37:27.035220 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="fcd471e2-f72d-4cef-af5c-b5624faaae95" containerName="nova-metadata-log" containerID="cri-o://d4794fecf5365ce5fef677e52eb3242e5364383743171e19431b5709890212dd" gracePeriod=30 Oct 07 08:37:27 crc kubenswrapper[5025]: I1007 08:37:27.035466 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="fcd471e2-f72d-4cef-af5c-b5624faaae95" containerName="nova-metadata-metadata" containerID="cri-o://68ddf4f70dba9f38c339070fcd4cc2e430e2674bfdce889789582e525fe6fb4c" gracePeriod=30 Oct 07 08:37:27 crc kubenswrapper[5025]: I1007 08:37:27.764496 5025 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 08:37:27 crc kubenswrapper[5025]: I1007 08:37:27.819858 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2316326-41f9-441a-8cd3-d986cc8b669d-internal-tls-certs\") pod \"c2316326-41f9-441a-8cd3-d986cc8b669d\" (UID: \"c2316326-41f9-441a-8cd3-d986cc8b669d\") " Oct 07 08:37:27 crc kubenswrapper[5025]: I1007 08:37:27.819915 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49zpv\" (UniqueName: \"kubernetes.io/projected/c2316326-41f9-441a-8cd3-d986cc8b669d-kube-api-access-49zpv\") pod \"c2316326-41f9-441a-8cd3-d986cc8b669d\" (UID: \"c2316326-41f9-441a-8cd3-d986cc8b669d\") " Oct 07 08:37:27 crc kubenswrapper[5025]: I1007 08:37:27.820044 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2316326-41f9-441a-8cd3-d986cc8b669d-config-data\") pod \"c2316326-41f9-441a-8cd3-d986cc8b669d\" (UID: \"c2316326-41f9-441a-8cd3-d986cc8b669d\") " Oct 07 08:37:27 crc kubenswrapper[5025]: I1007 08:37:27.820089 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2316326-41f9-441a-8cd3-d986cc8b669d-logs\") pod \"c2316326-41f9-441a-8cd3-d986cc8b669d\" (UID: \"c2316326-41f9-441a-8cd3-d986cc8b669d\") " Oct 07 08:37:27 crc kubenswrapper[5025]: I1007 08:37:27.820153 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2316326-41f9-441a-8cd3-d986cc8b669d-public-tls-certs\") pod \"c2316326-41f9-441a-8cd3-d986cc8b669d\" (UID: \"c2316326-41f9-441a-8cd3-d986cc8b669d\") " Oct 07 08:37:27 crc kubenswrapper[5025]: I1007 08:37:27.820232 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2316326-41f9-441a-8cd3-d986cc8b669d-combined-ca-bundle\") pod \"c2316326-41f9-441a-8cd3-d986cc8b669d\" (UID: \"c2316326-41f9-441a-8cd3-d986cc8b669d\") " Oct 07 08:37:27 crc kubenswrapper[5025]: I1007 08:37:27.820935 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2316326-41f9-441a-8cd3-d986cc8b669d-logs" (OuterVolumeSpecName: "logs") pod "c2316326-41f9-441a-8cd3-d986cc8b669d" (UID: "c2316326-41f9-441a-8cd3-d986cc8b669d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:37:27 crc kubenswrapper[5025]: I1007 08:37:27.825783 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2316326-41f9-441a-8cd3-d986cc8b669d-kube-api-access-49zpv" (OuterVolumeSpecName: "kube-api-access-49zpv") pod "c2316326-41f9-441a-8cd3-d986cc8b669d" (UID: "c2316326-41f9-441a-8cd3-d986cc8b669d"). InnerVolumeSpecName "kube-api-access-49zpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:37:27 crc kubenswrapper[5025]: I1007 08:37:27.833803 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49zpv\" (UniqueName: \"kubernetes.io/projected/c2316326-41f9-441a-8cd3-d986cc8b669d-kube-api-access-49zpv\") on node \"crc\" DevicePath \"\"" Oct 07 08:37:27 crc kubenswrapper[5025]: I1007 08:37:27.833832 5025 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2316326-41f9-441a-8cd3-d986cc8b669d-logs\") on node \"crc\" DevicePath \"\"" Oct 07 08:37:27 crc kubenswrapper[5025]: I1007 08:37:27.853504 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2316326-41f9-441a-8cd3-d986cc8b669d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c2316326-41f9-441a-8cd3-d986cc8b669d" (UID: "c2316326-41f9-441a-8cd3-d986cc8b669d"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:37:27 crc kubenswrapper[5025]: I1007 08:37:27.865062 5025 generic.go:334] "Generic (PLEG): container finished" podID="039af421-8cdd-4a32-bccc-f58f90b64fad" containerID="a338657f8f197d0f0f88660e3685ef3ce6e0e74f42231157db1d2b33a939958a" exitCode=0 Oct 07 08:37:27 crc kubenswrapper[5025]: I1007 08:37:27.865132 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"039af421-8cdd-4a32-bccc-f58f90b64fad","Type":"ContainerDied","Data":"a338657f8f197d0f0f88660e3685ef3ce6e0e74f42231157db1d2b33a939958a"} Oct 07 08:37:27 crc kubenswrapper[5025]: I1007 08:37:27.869063 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2316326-41f9-441a-8cd3-d986cc8b669d-config-data" (OuterVolumeSpecName: "config-data") pod "c2316326-41f9-441a-8cd3-d986cc8b669d" (UID: "c2316326-41f9-441a-8cd3-d986cc8b669d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:37:27 crc kubenswrapper[5025]: I1007 08:37:27.874931 5025 generic.go:334] "Generic (PLEG): container finished" podID="c2316326-41f9-441a-8cd3-d986cc8b669d" containerID="66fbf2b416f28317a522fd4a6301917eb8481b17c997782224ede84d8d9f45da" exitCode=0 Oct 07 08:37:27 crc kubenswrapper[5025]: I1007 08:37:27.874974 5025 generic.go:334] "Generic (PLEG): container finished" podID="c2316326-41f9-441a-8cd3-d986cc8b669d" containerID="d3954bc93fd6db00cda848534619f62c91661928f7794ebe85d31fd07864759b" exitCode=143 Oct 07 08:37:27 crc kubenswrapper[5025]: I1007 08:37:27.875065 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c2316326-41f9-441a-8cd3-d986cc8b669d","Type":"ContainerDied","Data":"66fbf2b416f28317a522fd4a6301917eb8481b17c997782224ede84d8d9f45da"} Oct 07 08:37:27 crc kubenswrapper[5025]: I1007 08:37:27.875105 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c2316326-41f9-441a-8cd3-d986cc8b669d","Type":"ContainerDied","Data":"d3954bc93fd6db00cda848534619f62c91661928f7794ebe85d31fd07864759b"} Oct 07 08:37:27 crc kubenswrapper[5025]: I1007 08:37:27.875119 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c2316326-41f9-441a-8cd3-d986cc8b669d","Type":"ContainerDied","Data":"f3a8d431136ff5b93ab9507e79348334c5d9819749d0fb1d284f941c666b87a0"} Oct 07 08:37:27 crc kubenswrapper[5025]: I1007 08:37:27.875139 5025 scope.go:117] "RemoveContainer" containerID="66fbf2b416f28317a522fd4a6301917eb8481b17c997782224ede84d8d9f45da" Oct 07 08:37:27 crc kubenswrapper[5025]: I1007 08:37:27.875262 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 07 08:37:27 crc kubenswrapper[5025]: I1007 08:37:27.877845 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2316326-41f9-441a-8cd3-d986cc8b669d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c2316326-41f9-441a-8cd3-d986cc8b669d" (UID: "c2316326-41f9-441a-8cd3-d986cc8b669d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:37:27 crc kubenswrapper[5025]: I1007 08:37:27.878888 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba0f744b-7d32-4337-be25-f4f6326aaec2","Type":"ContainerStarted","Data":"a1a35e09f930c853a6483e0e5e192b2b3f584932be09beeb5bd496d456b573a1"} Oct 07 08:37:27 crc kubenswrapper[5025]: I1007 08:37:27.879053 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 07 08:37:27 crc kubenswrapper[5025]: I1007 08:37:27.879821 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2316326-41f9-441a-8cd3-d986cc8b669d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c2316326-41f9-441a-8cd3-d986cc8b669d" (UID: "c2316326-41f9-441a-8cd3-d986cc8b669d"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:37:27 crc kubenswrapper[5025]: I1007 08:37:27.882869 5025 generic.go:334] "Generic (PLEG): container finished" podID="fcd471e2-f72d-4cef-af5c-b5624faaae95" containerID="d4794fecf5365ce5fef677e52eb3242e5364383743171e19431b5709890212dd" exitCode=143 Oct 07 08:37:27 crc kubenswrapper[5025]: I1007 08:37:27.882940 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fcd471e2-f72d-4cef-af5c-b5624faaae95","Type":"ContainerDied","Data":"d4794fecf5365ce5fef677e52eb3242e5364383743171e19431b5709890212dd"} Oct 07 08:37:27 crc kubenswrapper[5025]: I1007 08:37:27.907205 5025 scope.go:117] "RemoveContainer" containerID="d3954bc93fd6db00cda848534619f62c91661928f7794ebe85d31fd07864759b" Oct 07 08:37:27 crc kubenswrapper[5025]: I1007 08:37:27.909769 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.435643955 podStartE2EDuration="5.907164301s" podCreationTimestamp="2025-10-07 08:37:22 +0000 UTC" firstStartedPulling="2025-10-07 08:37:23.720222724 +0000 UTC m=+1250.529536868" lastFinishedPulling="2025-10-07 08:37:27.19174306 +0000 UTC m=+1254.001057214" observedRunningTime="2025-10-07 08:37:27.899866561 +0000 UTC m=+1254.709180715" watchObservedRunningTime="2025-10-07 08:37:27.907164301 +0000 UTC m=+1254.716478455" Oct 07 08:37:27 crc kubenswrapper[5025]: I1007 08:37:27.934355 5025 scope.go:117] "RemoveContainer" containerID="66fbf2b416f28317a522fd4a6301917eb8481b17c997782224ede84d8d9f45da" Oct 07 08:37:27 crc kubenswrapper[5025]: E1007 08:37:27.934880 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66fbf2b416f28317a522fd4a6301917eb8481b17c997782224ede84d8d9f45da\": container with ID starting with 66fbf2b416f28317a522fd4a6301917eb8481b17c997782224ede84d8d9f45da not found: ID does not exist" 
containerID="66fbf2b416f28317a522fd4a6301917eb8481b17c997782224ede84d8d9f45da" Oct 07 08:37:27 crc kubenswrapper[5025]: I1007 08:37:27.934920 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66fbf2b416f28317a522fd4a6301917eb8481b17c997782224ede84d8d9f45da"} err="failed to get container status \"66fbf2b416f28317a522fd4a6301917eb8481b17c997782224ede84d8d9f45da\": rpc error: code = NotFound desc = could not find container \"66fbf2b416f28317a522fd4a6301917eb8481b17c997782224ede84d8d9f45da\": container with ID starting with 66fbf2b416f28317a522fd4a6301917eb8481b17c997782224ede84d8d9f45da not found: ID does not exist" Oct 07 08:37:27 crc kubenswrapper[5025]: I1007 08:37:27.934944 5025 scope.go:117] "RemoveContainer" containerID="d3954bc93fd6db00cda848534619f62c91661928f7794ebe85d31fd07864759b" Oct 07 08:37:27 crc kubenswrapper[5025]: E1007 08:37:27.935241 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3954bc93fd6db00cda848534619f62c91661928f7794ebe85d31fd07864759b\": container with ID starting with d3954bc93fd6db00cda848534619f62c91661928f7794ebe85d31fd07864759b not found: ID does not exist" containerID="d3954bc93fd6db00cda848534619f62c91661928f7794ebe85d31fd07864759b" Oct 07 08:37:27 crc kubenswrapper[5025]: I1007 08:37:27.935332 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3954bc93fd6db00cda848534619f62c91661928f7794ebe85d31fd07864759b"} err="failed to get container status \"d3954bc93fd6db00cda848534619f62c91661928f7794ebe85d31fd07864759b\": rpc error: code = NotFound desc = could not find container \"d3954bc93fd6db00cda848534619f62c91661928f7794ebe85d31fd07864759b\": container with ID starting with d3954bc93fd6db00cda848534619f62c91661928f7794ebe85d31fd07864759b not found: ID does not exist" Oct 07 08:37:27 crc kubenswrapper[5025]: I1007 08:37:27.935409 5025 scope.go:117] 
"RemoveContainer" containerID="66fbf2b416f28317a522fd4a6301917eb8481b17c997782224ede84d8d9f45da" Oct 07 08:37:27 crc kubenswrapper[5025]: I1007 08:37:27.935279 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2316326-41f9-441a-8cd3-d986cc8b669d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:37:27 crc kubenswrapper[5025]: I1007 08:37:27.935604 5025 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2316326-41f9-441a-8cd3-d986cc8b669d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 08:37:27 crc kubenswrapper[5025]: I1007 08:37:27.935836 5025 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2316326-41f9-441a-8cd3-d986cc8b669d-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 08:37:27 crc kubenswrapper[5025]: I1007 08:37:27.935943 5025 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2316326-41f9-441a-8cd3-d986cc8b669d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 08:37:27 crc kubenswrapper[5025]: I1007 08:37:27.935851 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66fbf2b416f28317a522fd4a6301917eb8481b17c997782224ede84d8d9f45da"} err="failed to get container status \"66fbf2b416f28317a522fd4a6301917eb8481b17c997782224ede84d8d9f45da\": rpc error: code = NotFound desc = could not find container \"66fbf2b416f28317a522fd4a6301917eb8481b17c997782224ede84d8d9f45da\": container with ID starting with 66fbf2b416f28317a522fd4a6301917eb8481b17c997782224ede84d8d9f45da not found: ID does not exist" Oct 07 08:37:27 crc kubenswrapper[5025]: I1007 08:37:27.936040 5025 scope.go:117] "RemoveContainer" containerID="d3954bc93fd6db00cda848534619f62c91661928f7794ebe85d31fd07864759b" Oct 07 08:37:27 crc kubenswrapper[5025]: I1007 
08:37:27.936273 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3954bc93fd6db00cda848534619f62c91661928f7794ebe85d31fd07864759b"} err="failed to get container status \"d3954bc93fd6db00cda848534619f62c91661928f7794ebe85d31fd07864759b\": rpc error: code = NotFound desc = could not find container \"d3954bc93fd6db00cda848534619f62c91661928f7794ebe85d31fd07864759b\": container with ID starting with d3954bc93fd6db00cda848534619f62c91661928f7794ebe85d31fd07864759b not found: ID does not exist" Oct 07 08:37:28 crc kubenswrapper[5025]: I1007 08:37:28.202940 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 07 08:37:28 crc kubenswrapper[5025]: I1007 08:37:28.212909 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 07 08:37:28 crc kubenswrapper[5025]: I1007 08:37:28.239314 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 07 08:37:28 crc kubenswrapper[5025]: E1007 08:37:28.239835 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2316326-41f9-441a-8cd3-d986cc8b669d" containerName="nova-api-log" Oct 07 08:37:28 crc kubenswrapper[5025]: I1007 08:37:28.239859 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2316326-41f9-441a-8cd3-d986cc8b669d" containerName="nova-api-log" Oct 07 08:37:28 crc kubenswrapper[5025]: E1007 08:37:28.239899 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40cd73d1-37d6-4334-9962-209f68f8e283" containerName="nova-manage" Oct 07 08:37:28 crc kubenswrapper[5025]: I1007 08:37:28.239908 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="40cd73d1-37d6-4334-9962-209f68f8e283" containerName="nova-manage" Oct 07 08:37:28 crc kubenswrapper[5025]: E1007 08:37:28.239930 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2316326-41f9-441a-8cd3-d986cc8b669d" containerName="nova-api-api" Oct 07 08:37:28 crc 
kubenswrapper[5025]: I1007 08:37:28.239938 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2316326-41f9-441a-8cd3-d986cc8b669d" containerName="nova-api-api" Oct 07 08:37:28 crc kubenswrapper[5025]: I1007 08:37:28.240171 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2316326-41f9-441a-8cd3-d986cc8b669d" containerName="nova-api-api" Oct 07 08:37:28 crc kubenswrapper[5025]: I1007 08:37:28.240191 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="40cd73d1-37d6-4334-9962-209f68f8e283" containerName="nova-manage" Oct 07 08:37:28 crc kubenswrapper[5025]: I1007 08:37:28.240221 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2316326-41f9-441a-8cd3-d986cc8b669d" containerName="nova-api-log" Oct 07 08:37:28 crc kubenswrapper[5025]: I1007 08:37:28.241434 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 08:37:28 crc kubenswrapper[5025]: I1007 08:37:28.248021 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 07 08:37:28 crc kubenswrapper[5025]: I1007 08:37:28.248324 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 07 08:37:28 crc kubenswrapper[5025]: I1007 08:37:28.248551 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 07 08:37:28 crc kubenswrapper[5025]: I1007 08:37:28.253214 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 08:37:28 crc kubenswrapper[5025]: I1007 08:37:28.345573 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72a63b89-4f79-41a7-9e9a-e3c464d015a4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"72a63b89-4f79-41a7-9e9a-e3c464d015a4\") " pod="openstack/nova-api-0" Oct 07 08:37:28 crc kubenswrapper[5025]: 
I1007 08:37:28.345637 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qn27\" (UniqueName: \"kubernetes.io/projected/72a63b89-4f79-41a7-9e9a-e3c464d015a4-kube-api-access-9qn27\") pod \"nova-api-0\" (UID: \"72a63b89-4f79-41a7-9e9a-e3c464d015a4\") " pod="openstack/nova-api-0" Oct 07 08:37:28 crc kubenswrapper[5025]: I1007 08:37:28.345686 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/72a63b89-4f79-41a7-9e9a-e3c464d015a4-internal-tls-certs\") pod \"nova-api-0\" (UID: \"72a63b89-4f79-41a7-9e9a-e3c464d015a4\") " pod="openstack/nova-api-0" Oct 07 08:37:28 crc kubenswrapper[5025]: I1007 08:37:28.345915 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/72a63b89-4f79-41a7-9e9a-e3c464d015a4-public-tls-certs\") pod \"nova-api-0\" (UID: \"72a63b89-4f79-41a7-9e9a-e3c464d015a4\") " pod="openstack/nova-api-0" Oct 07 08:37:28 crc kubenswrapper[5025]: I1007 08:37:28.346037 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72a63b89-4f79-41a7-9e9a-e3c464d015a4-logs\") pod \"nova-api-0\" (UID: \"72a63b89-4f79-41a7-9e9a-e3c464d015a4\") " pod="openstack/nova-api-0" Oct 07 08:37:28 crc kubenswrapper[5025]: I1007 08:37:28.346164 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72a63b89-4f79-41a7-9e9a-e3c464d015a4-config-data\") pod \"nova-api-0\" (UID: \"72a63b89-4f79-41a7-9e9a-e3c464d015a4\") " pod="openstack/nova-api-0" Oct 07 08:37:28 crc kubenswrapper[5025]: I1007 08:37:28.396651 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 08:37:28 crc kubenswrapper[5025]: I1007 08:37:28.451215 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sln42\" (UniqueName: \"kubernetes.io/projected/039af421-8cdd-4a32-bccc-f58f90b64fad-kube-api-access-sln42\") pod \"039af421-8cdd-4a32-bccc-f58f90b64fad\" (UID: \"039af421-8cdd-4a32-bccc-f58f90b64fad\") " Oct 07 08:37:28 crc kubenswrapper[5025]: I1007 08:37:28.451372 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/039af421-8cdd-4a32-bccc-f58f90b64fad-combined-ca-bundle\") pod \"039af421-8cdd-4a32-bccc-f58f90b64fad\" (UID: \"039af421-8cdd-4a32-bccc-f58f90b64fad\") " Oct 07 08:37:28 crc kubenswrapper[5025]: I1007 08:37:28.451396 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/039af421-8cdd-4a32-bccc-f58f90b64fad-config-data\") pod \"039af421-8cdd-4a32-bccc-f58f90b64fad\" (UID: \"039af421-8cdd-4a32-bccc-f58f90b64fad\") " Oct 07 08:37:28 crc kubenswrapper[5025]: I1007 08:37:28.451658 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qn27\" (UniqueName: \"kubernetes.io/projected/72a63b89-4f79-41a7-9e9a-e3c464d015a4-kube-api-access-9qn27\") pod \"nova-api-0\" (UID: \"72a63b89-4f79-41a7-9e9a-e3c464d015a4\") " pod="openstack/nova-api-0" Oct 07 08:37:28 crc kubenswrapper[5025]: I1007 08:37:28.451722 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/72a63b89-4f79-41a7-9e9a-e3c464d015a4-internal-tls-certs\") pod \"nova-api-0\" (UID: \"72a63b89-4f79-41a7-9e9a-e3c464d015a4\") " pod="openstack/nova-api-0" Oct 07 08:37:28 crc kubenswrapper[5025]: I1007 08:37:28.451765 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/72a63b89-4f79-41a7-9e9a-e3c464d015a4-public-tls-certs\") pod \"nova-api-0\" (UID: \"72a63b89-4f79-41a7-9e9a-e3c464d015a4\") " pod="openstack/nova-api-0" Oct 07 08:37:28 crc kubenswrapper[5025]: I1007 08:37:28.451803 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72a63b89-4f79-41a7-9e9a-e3c464d015a4-logs\") pod \"nova-api-0\" (UID: \"72a63b89-4f79-41a7-9e9a-e3c464d015a4\") " pod="openstack/nova-api-0" Oct 07 08:37:28 crc kubenswrapper[5025]: I1007 08:37:28.451845 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72a63b89-4f79-41a7-9e9a-e3c464d015a4-config-data\") pod \"nova-api-0\" (UID: \"72a63b89-4f79-41a7-9e9a-e3c464d015a4\") " pod="openstack/nova-api-0" Oct 07 08:37:28 crc kubenswrapper[5025]: I1007 08:37:28.451883 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72a63b89-4f79-41a7-9e9a-e3c464d015a4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"72a63b89-4f79-41a7-9e9a-e3c464d015a4\") " pod="openstack/nova-api-0" Oct 07 08:37:28 crc kubenswrapper[5025]: I1007 08:37:28.452760 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72a63b89-4f79-41a7-9e9a-e3c464d015a4-logs\") pod \"nova-api-0\" (UID: \"72a63b89-4f79-41a7-9e9a-e3c464d015a4\") " pod="openstack/nova-api-0" Oct 07 08:37:28 crc kubenswrapper[5025]: I1007 08:37:28.460864 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72a63b89-4f79-41a7-9e9a-e3c464d015a4-config-data\") pod \"nova-api-0\" (UID: \"72a63b89-4f79-41a7-9e9a-e3c464d015a4\") " pod="openstack/nova-api-0" Oct 07 08:37:28 crc kubenswrapper[5025]: I1007 08:37:28.460915 5025 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72a63b89-4f79-41a7-9e9a-e3c464d015a4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"72a63b89-4f79-41a7-9e9a-e3c464d015a4\") " pod="openstack/nova-api-0" Oct 07 08:37:28 crc kubenswrapper[5025]: I1007 08:37:28.464872 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/72a63b89-4f79-41a7-9e9a-e3c464d015a4-internal-tls-certs\") pod \"nova-api-0\" (UID: \"72a63b89-4f79-41a7-9e9a-e3c464d015a4\") " pod="openstack/nova-api-0" Oct 07 08:37:28 crc kubenswrapper[5025]: I1007 08:37:28.469620 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/72a63b89-4f79-41a7-9e9a-e3c464d015a4-public-tls-certs\") pod \"nova-api-0\" (UID: \"72a63b89-4f79-41a7-9e9a-e3c464d015a4\") " pod="openstack/nova-api-0" Oct 07 08:37:28 crc kubenswrapper[5025]: I1007 08:37:28.473882 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/039af421-8cdd-4a32-bccc-f58f90b64fad-kube-api-access-sln42" (OuterVolumeSpecName: "kube-api-access-sln42") pod "039af421-8cdd-4a32-bccc-f58f90b64fad" (UID: "039af421-8cdd-4a32-bccc-f58f90b64fad"). InnerVolumeSpecName "kube-api-access-sln42". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:37:28 crc kubenswrapper[5025]: I1007 08:37:28.475129 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qn27\" (UniqueName: \"kubernetes.io/projected/72a63b89-4f79-41a7-9e9a-e3c464d015a4-kube-api-access-9qn27\") pod \"nova-api-0\" (UID: \"72a63b89-4f79-41a7-9e9a-e3c464d015a4\") " pod="openstack/nova-api-0" Oct 07 08:37:28 crc kubenswrapper[5025]: I1007 08:37:28.492646 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/039af421-8cdd-4a32-bccc-f58f90b64fad-config-data" (OuterVolumeSpecName: "config-data") pod "039af421-8cdd-4a32-bccc-f58f90b64fad" (UID: "039af421-8cdd-4a32-bccc-f58f90b64fad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:37:28 crc kubenswrapper[5025]: I1007 08:37:28.502161 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/039af421-8cdd-4a32-bccc-f58f90b64fad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "039af421-8cdd-4a32-bccc-f58f90b64fad" (UID: "039af421-8cdd-4a32-bccc-f58f90b64fad"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:37:28 crc kubenswrapper[5025]: I1007 08:37:28.553310 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/039af421-8cdd-4a32-bccc-f58f90b64fad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:37:28 crc kubenswrapper[5025]: I1007 08:37:28.553348 5025 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/039af421-8cdd-4a32-bccc-f58f90b64fad-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 08:37:28 crc kubenswrapper[5025]: I1007 08:37:28.553359 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sln42\" (UniqueName: \"kubernetes.io/projected/039af421-8cdd-4a32-bccc-f58f90b64fad-kube-api-access-sln42\") on node \"crc\" DevicePath \"\"" Oct 07 08:37:28 crc kubenswrapper[5025]: I1007 08:37:28.564648 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 08:37:28 crc kubenswrapper[5025]: I1007 08:37:28.906351 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"039af421-8cdd-4a32-bccc-f58f90b64fad","Type":"ContainerDied","Data":"a0aff233a1acb54b1c008f8526b94eefa0e5ddd8b8d72844ee1f7d63f16db7f6"} Oct 07 08:37:28 crc kubenswrapper[5025]: I1007 08:37:28.906418 5025 scope.go:117] "RemoveContainer" containerID="a338657f8f197d0f0f88660e3685ef3ce6e0e74f42231157db1d2b33a939958a" Oct 07 08:37:28 crc kubenswrapper[5025]: I1007 08:37:28.906590 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 08:37:28 crc kubenswrapper[5025]: I1007 08:37:28.946073 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 08:37:28 crc kubenswrapper[5025]: I1007 08:37:28.964100 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 08:37:28 crc kubenswrapper[5025]: I1007 08:37:28.984723 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 08:37:28 crc kubenswrapper[5025]: E1007 08:37:28.985169 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="039af421-8cdd-4a32-bccc-f58f90b64fad" containerName="nova-scheduler-scheduler" Oct 07 08:37:28 crc kubenswrapper[5025]: I1007 08:37:28.985182 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="039af421-8cdd-4a32-bccc-f58f90b64fad" containerName="nova-scheduler-scheduler" Oct 07 08:37:28 crc kubenswrapper[5025]: I1007 08:37:28.985360 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="039af421-8cdd-4a32-bccc-f58f90b64fad" containerName="nova-scheduler-scheduler" Oct 07 08:37:28 crc kubenswrapper[5025]: I1007 08:37:28.986021 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 08:37:28 crc kubenswrapper[5025]: I1007 08:37:28.993167 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 08:37:28 crc kubenswrapper[5025]: I1007 08:37:28.996142 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 07 08:37:29 crc kubenswrapper[5025]: I1007 08:37:29.014076 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 08:37:29 crc kubenswrapper[5025]: W1007 08:37:29.019123 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72a63b89_4f79_41a7_9e9a_e3c464d015a4.slice/crio-8c50e11b253e4d9a2871b39f640867158387c82511f814263844b17932593ce1 WatchSource:0}: Error finding container 8c50e11b253e4d9a2871b39f640867158387c82511f814263844b17932593ce1: Status 404 returned error can't find the container with id 8c50e11b253e4d9a2871b39f640867158387c82511f814263844b17932593ce1 Oct 07 08:37:29 crc kubenswrapper[5025]: I1007 08:37:29.063950 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8691f972-205f-470b-b300-40c32106704b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8691f972-205f-470b-b300-40c32106704b\") " pod="openstack/nova-scheduler-0" Oct 07 08:37:29 crc kubenswrapper[5025]: I1007 08:37:29.064126 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2p2m\" (UniqueName: \"kubernetes.io/projected/8691f972-205f-470b-b300-40c32106704b-kube-api-access-v2p2m\") pod \"nova-scheduler-0\" (UID: \"8691f972-205f-470b-b300-40c32106704b\") " pod="openstack/nova-scheduler-0" Oct 07 08:37:29 crc kubenswrapper[5025]: I1007 08:37:29.064255 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8691f972-205f-470b-b300-40c32106704b-config-data\") pod \"nova-scheduler-0\" (UID: \"8691f972-205f-470b-b300-40c32106704b\") " pod="openstack/nova-scheduler-0" Oct 07 08:37:29 crc kubenswrapper[5025]: I1007 08:37:29.172110 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2p2m\" (UniqueName: \"kubernetes.io/projected/8691f972-205f-470b-b300-40c32106704b-kube-api-access-v2p2m\") pod \"nova-scheduler-0\" (UID: \"8691f972-205f-470b-b300-40c32106704b\") " pod="openstack/nova-scheduler-0" Oct 07 08:37:29 crc kubenswrapper[5025]: I1007 08:37:29.172196 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8691f972-205f-470b-b300-40c32106704b-config-data\") pod \"nova-scheduler-0\" (UID: \"8691f972-205f-470b-b300-40c32106704b\") " pod="openstack/nova-scheduler-0" Oct 07 08:37:29 crc kubenswrapper[5025]: I1007 08:37:29.172480 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8691f972-205f-470b-b300-40c32106704b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8691f972-205f-470b-b300-40c32106704b\") " pod="openstack/nova-scheduler-0" Oct 07 08:37:29 crc kubenswrapper[5025]: I1007 08:37:29.178633 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8691f972-205f-470b-b300-40c32106704b-config-data\") pod \"nova-scheduler-0\" (UID: \"8691f972-205f-470b-b300-40c32106704b\") " pod="openstack/nova-scheduler-0" Oct 07 08:37:29 crc kubenswrapper[5025]: I1007 08:37:29.179116 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8691f972-205f-470b-b300-40c32106704b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"8691f972-205f-470b-b300-40c32106704b\") " pod="openstack/nova-scheduler-0" Oct 07 08:37:29 crc kubenswrapper[5025]: I1007 08:37:29.189941 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2p2m\" (UniqueName: \"kubernetes.io/projected/8691f972-205f-470b-b300-40c32106704b-kube-api-access-v2p2m\") pod \"nova-scheduler-0\" (UID: \"8691f972-205f-470b-b300-40c32106704b\") " pod="openstack/nova-scheduler-0" Oct 07 08:37:29 crc kubenswrapper[5025]: I1007 08:37:29.309363 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 08:37:29 crc kubenswrapper[5025]: I1007 08:37:29.731799 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 08:37:29 crc kubenswrapper[5025]: W1007 08:37:29.732600 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8691f972_205f_470b_b300_40c32106704b.slice/crio-861bed56c80e581c2932c0abae9710153d8f70c1568b37c7027b67d001030917 WatchSource:0}: Error finding container 861bed56c80e581c2932c0abae9710153d8f70c1568b37c7027b67d001030917: Status 404 returned error can't find the container with id 861bed56c80e581c2932c0abae9710153d8f70c1568b37c7027b67d001030917 Oct 07 08:37:29 crc kubenswrapper[5025]: I1007 08:37:29.934154 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="039af421-8cdd-4a32-bccc-f58f90b64fad" path="/var/lib/kubelet/pods/039af421-8cdd-4a32-bccc-f58f90b64fad/volumes" Oct 07 08:37:29 crc kubenswrapper[5025]: I1007 08:37:29.935655 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2316326-41f9-441a-8cd3-d986cc8b669d" path="/var/lib/kubelet/pods/c2316326-41f9-441a-8cd3-d986cc8b669d/volumes" Oct 07 08:37:29 crc kubenswrapper[5025]: I1007 08:37:29.936864 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"8691f972-205f-470b-b300-40c32106704b","Type":"ContainerStarted","Data":"861bed56c80e581c2932c0abae9710153d8f70c1568b37c7027b67d001030917"} Oct 07 08:37:29 crc kubenswrapper[5025]: I1007 08:37:29.938999 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"72a63b89-4f79-41a7-9e9a-e3c464d015a4","Type":"ContainerStarted","Data":"049eca553f3f87ee8bd4b31b64602c569fc7bd0383ab4e766e48144b4bd7fa38"} Oct 07 08:37:29 crc kubenswrapper[5025]: I1007 08:37:29.939134 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"72a63b89-4f79-41a7-9e9a-e3c464d015a4","Type":"ContainerStarted","Data":"231456d1858e1cef2aee2ee0a6203c2580d725cb74b9ea172142e35834648263"} Oct 07 08:37:29 crc kubenswrapper[5025]: I1007 08:37:29.939215 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"72a63b89-4f79-41a7-9e9a-e3c464d015a4","Type":"ContainerStarted","Data":"8c50e11b253e4d9a2871b39f640867158387c82511f814263844b17932593ce1"} Oct 07 08:37:30 crc kubenswrapper[5025]: I1007 08:37:30.181762 5025 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="fcd471e2-f72d-4cef-af5c-b5624faaae95" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": read tcp 10.217.0.2:35400->10.217.0.192:8775: read: connection reset by peer" Oct 07 08:37:30 crc kubenswrapper[5025]: I1007 08:37:30.181872 5025 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="fcd471e2-f72d-4cef-af5c-b5624faaae95" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": read tcp 10.217.0.2:35416->10.217.0.192:8775: read: connection reset by peer" Oct 07 08:37:30 crc kubenswrapper[5025]: I1007 08:37:30.637387 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 08:37:30 crc kubenswrapper[5025]: I1007 08:37:30.658630 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.6586054089999998 podStartE2EDuration="2.658605409s" podCreationTimestamp="2025-10-07 08:37:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:37:29.944138638 +0000 UTC m=+1256.753452802" watchObservedRunningTime="2025-10-07 08:37:30.658605409 +0000 UTC m=+1257.467919563" Oct 07 08:37:30 crc kubenswrapper[5025]: I1007 08:37:30.709628 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcd471e2-f72d-4cef-af5c-b5624faaae95-config-data\") pod \"fcd471e2-f72d-4cef-af5c-b5624faaae95\" (UID: \"fcd471e2-f72d-4cef-af5c-b5624faaae95\") " Oct 07 08:37:30 crc kubenswrapper[5025]: I1007 08:37:30.709777 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcd471e2-f72d-4cef-af5c-b5624faaae95-combined-ca-bundle\") pod \"fcd471e2-f72d-4cef-af5c-b5624faaae95\" (UID: \"fcd471e2-f72d-4cef-af5c-b5624faaae95\") " Oct 07 08:37:30 crc kubenswrapper[5025]: I1007 08:37:30.710888 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xml5f\" (UniqueName: \"kubernetes.io/projected/fcd471e2-f72d-4cef-af5c-b5624faaae95-kube-api-access-xml5f\") pod \"fcd471e2-f72d-4cef-af5c-b5624faaae95\" (UID: \"fcd471e2-f72d-4cef-af5c-b5624faaae95\") " Oct 07 08:37:30 crc kubenswrapper[5025]: I1007 08:37:30.711164 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcd471e2-f72d-4cef-af5c-b5624faaae95-nova-metadata-tls-certs\") pod 
\"fcd471e2-f72d-4cef-af5c-b5624faaae95\" (UID: \"fcd471e2-f72d-4cef-af5c-b5624faaae95\") " Oct 07 08:37:30 crc kubenswrapper[5025]: I1007 08:37:30.711624 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fcd471e2-f72d-4cef-af5c-b5624faaae95-logs\") pod \"fcd471e2-f72d-4cef-af5c-b5624faaae95\" (UID: \"fcd471e2-f72d-4cef-af5c-b5624faaae95\") " Oct 07 08:37:30 crc kubenswrapper[5025]: I1007 08:37:30.712393 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcd471e2-f72d-4cef-af5c-b5624faaae95-logs" (OuterVolumeSpecName: "logs") pod "fcd471e2-f72d-4cef-af5c-b5624faaae95" (UID: "fcd471e2-f72d-4cef-af5c-b5624faaae95"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:37:30 crc kubenswrapper[5025]: I1007 08:37:30.734000 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcd471e2-f72d-4cef-af5c-b5624faaae95-kube-api-access-xml5f" (OuterVolumeSpecName: "kube-api-access-xml5f") pod "fcd471e2-f72d-4cef-af5c-b5624faaae95" (UID: "fcd471e2-f72d-4cef-af5c-b5624faaae95"). InnerVolumeSpecName "kube-api-access-xml5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:37:30 crc kubenswrapper[5025]: I1007 08:37:30.744989 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcd471e2-f72d-4cef-af5c-b5624faaae95-config-data" (OuterVolumeSpecName: "config-data") pod "fcd471e2-f72d-4cef-af5c-b5624faaae95" (UID: "fcd471e2-f72d-4cef-af5c-b5624faaae95"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:37:30 crc kubenswrapper[5025]: I1007 08:37:30.759059 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcd471e2-f72d-4cef-af5c-b5624faaae95-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fcd471e2-f72d-4cef-af5c-b5624faaae95" (UID: "fcd471e2-f72d-4cef-af5c-b5624faaae95"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:37:30 crc kubenswrapper[5025]: I1007 08:37:30.792183 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcd471e2-f72d-4cef-af5c-b5624faaae95-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "fcd471e2-f72d-4cef-af5c-b5624faaae95" (UID: "fcd471e2-f72d-4cef-af5c-b5624faaae95"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:37:30 crc kubenswrapper[5025]: I1007 08:37:30.813568 5025 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcd471e2-f72d-4cef-af5c-b5624faaae95-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 08:37:30 crc kubenswrapper[5025]: I1007 08:37:30.813601 5025 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fcd471e2-f72d-4cef-af5c-b5624faaae95-logs\") on node \"crc\" DevicePath \"\"" Oct 07 08:37:30 crc kubenswrapper[5025]: I1007 08:37:30.813613 5025 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcd471e2-f72d-4cef-af5c-b5624faaae95-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 08:37:30 crc kubenswrapper[5025]: I1007 08:37:30.813621 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcd471e2-f72d-4cef-af5c-b5624faaae95-combined-ca-bundle\") on 
node \"crc\" DevicePath \"\"" Oct 07 08:37:30 crc kubenswrapper[5025]: I1007 08:37:30.813629 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xml5f\" (UniqueName: \"kubernetes.io/projected/fcd471e2-f72d-4cef-af5c-b5624faaae95-kube-api-access-xml5f\") on node \"crc\" DevicePath \"\"" Oct 07 08:37:30 crc kubenswrapper[5025]: I1007 08:37:30.935643 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8691f972-205f-470b-b300-40c32106704b","Type":"ContainerStarted","Data":"8b00f65fe97b0c9eded824f24e8ddfa13e3e54ab7d2347fbe976f327251fe700"} Oct 07 08:37:30 crc kubenswrapper[5025]: I1007 08:37:30.940321 5025 generic.go:334] "Generic (PLEG): container finished" podID="fcd471e2-f72d-4cef-af5c-b5624faaae95" containerID="68ddf4f70dba9f38c339070fcd4cc2e430e2674bfdce889789582e525fe6fb4c" exitCode=0 Oct 07 08:37:30 crc kubenswrapper[5025]: I1007 08:37:30.940943 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 08:37:30 crc kubenswrapper[5025]: I1007 08:37:30.946121 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fcd471e2-f72d-4cef-af5c-b5624faaae95","Type":"ContainerDied","Data":"68ddf4f70dba9f38c339070fcd4cc2e430e2674bfdce889789582e525fe6fb4c"} Oct 07 08:37:30 crc kubenswrapper[5025]: I1007 08:37:30.946155 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fcd471e2-f72d-4cef-af5c-b5624faaae95","Type":"ContainerDied","Data":"bf300f47187f1d625c0fcce2082883114cad261aa9cc11730d93608e360b6a58"} Oct 07 08:37:30 crc kubenswrapper[5025]: I1007 08:37:30.946176 5025 scope.go:117] "RemoveContainer" containerID="68ddf4f70dba9f38c339070fcd4cc2e430e2674bfdce889789582e525fe6fb4c" Oct 07 08:37:30 crc kubenswrapper[5025]: I1007 08:37:30.965030 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" 
podStartSLOduration=2.965009021 podStartE2EDuration="2.965009021s" podCreationTimestamp="2025-10-07 08:37:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:37:30.956024528 +0000 UTC m=+1257.765338672" watchObservedRunningTime="2025-10-07 08:37:30.965009021 +0000 UTC m=+1257.774323165" Oct 07 08:37:30 crc kubenswrapper[5025]: I1007 08:37:30.977724 5025 scope.go:117] "RemoveContainer" containerID="d4794fecf5365ce5fef677e52eb3242e5364383743171e19431b5709890212dd" Oct 07 08:37:30 crc kubenswrapper[5025]: I1007 08:37:30.978058 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 08:37:30 crc kubenswrapper[5025]: I1007 08:37:30.986207 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 08:37:30 crc kubenswrapper[5025]: I1007 08:37:30.998525 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 07 08:37:30 crc kubenswrapper[5025]: E1007 08:37:30.998910 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcd471e2-f72d-4cef-af5c-b5624faaae95" containerName="nova-metadata-metadata" Oct 07 08:37:30 crc kubenswrapper[5025]: I1007 08:37:30.998924 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcd471e2-f72d-4cef-af5c-b5624faaae95" containerName="nova-metadata-metadata" Oct 07 08:37:30 crc kubenswrapper[5025]: E1007 08:37:30.998947 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcd471e2-f72d-4cef-af5c-b5624faaae95" containerName="nova-metadata-log" Oct 07 08:37:30 crc kubenswrapper[5025]: I1007 08:37:30.998954 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcd471e2-f72d-4cef-af5c-b5624faaae95" containerName="nova-metadata-log" Oct 07 08:37:30 crc kubenswrapper[5025]: I1007 08:37:30.999337 5025 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="fcd471e2-f72d-4cef-af5c-b5624faaae95" containerName="nova-metadata-metadata" Oct 07 08:37:30 crc kubenswrapper[5025]: I1007 08:37:30.999371 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcd471e2-f72d-4cef-af5c-b5624faaae95" containerName="nova-metadata-log" Oct 07 08:37:31 crc kubenswrapper[5025]: I1007 08:37:31.000499 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 08:37:31 crc kubenswrapper[5025]: I1007 08:37:31.007280 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 07 08:37:31 crc kubenswrapper[5025]: I1007 08:37:31.007292 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 07 08:37:31 crc kubenswrapper[5025]: I1007 08:37:31.021893 5025 scope.go:117] "RemoveContainer" containerID="68ddf4f70dba9f38c339070fcd4cc2e430e2674bfdce889789582e525fe6fb4c" Oct 07 08:37:31 crc kubenswrapper[5025]: E1007 08:37:31.022983 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68ddf4f70dba9f38c339070fcd4cc2e430e2674bfdce889789582e525fe6fb4c\": container with ID starting with 68ddf4f70dba9f38c339070fcd4cc2e430e2674bfdce889789582e525fe6fb4c not found: ID does not exist" containerID="68ddf4f70dba9f38c339070fcd4cc2e430e2674bfdce889789582e525fe6fb4c" Oct 07 08:37:31 crc kubenswrapper[5025]: I1007 08:37:31.023010 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68ddf4f70dba9f38c339070fcd4cc2e430e2674bfdce889789582e525fe6fb4c"} err="failed to get container status \"68ddf4f70dba9f38c339070fcd4cc2e430e2674bfdce889789582e525fe6fb4c\": rpc error: code = NotFound desc = could not find container \"68ddf4f70dba9f38c339070fcd4cc2e430e2674bfdce889789582e525fe6fb4c\": container with ID starting with 
68ddf4f70dba9f38c339070fcd4cc2e430e2674bfdce889789582e525fe6fb4c not found: ID does not exist" Oct 07 08:37:31 crc kubenswrapper[5025]: I1007 08:37:31.023029 5025 scope.go:117] "RemoveContainer" containerID="d4794fecf5365ce5fef677e52eb3242e5364383743171e19431b5709890212dd" Oct 07 08:37:31 crc kubenswrapper[5025]: E1007 08:37:31.024421 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4794fecf5365ce5fef677e52eb3242e5364383743171e19431b5709890212dd\": container with ID starting with d4794fecf5365ce5fef677e52eb3242e5364383743171e19431b5709890212dd not found: ID does not exist" containerID="d4794fecf5365ce5fef677e52eb3242e5364383743171e19431b5709890212dd" Oct 07 08:37:31 crc kubenswrapper[5025]: I1007 08:37:31.024472 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4794fecf5365ce5fef677e52eb3242e5364383743171e19431b5709890212dd"} err="failed to get container status \"d4794fecf5365ce5fef677e52eb3242e5364383743171e19431b5709890212dd\": rpc error: code = NotFound desc = could not find container \"d4794fecf5365ce5fef677e52eb3242e5364383743171e19431b5709890212dd\": container with ID starting with d4794fecf5365ce5fef677e52eb3242e5364383743171e19431b5709890212dd not found: ID does not exist" Oct 07 08:37:31 crc kubenswrapper[5025]: I1007 08:37:31.033787 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 08:37:31 crc kubenswrapper[5025]: I1007 08:37:31.117604 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a892055-924e-4e4f-a625-b15b8a39c4bf-config-data\") pod \"nova-metadata-0\" (UID: \"4a892055-924e-4e4f-a625-b15b8a39c4bf\") " pod="openstack/nova-metadata-0" Oct 07 08:37:31 crc kubenswrapper[5025]: I1007 08:37:31.117939 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a892055-924e-4e4f-a625-b15b8a39c4bf-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4a892055-924e-4e4f-a625-b15b8a39c4bf\") " pod="openstack/nova-metadata-0" Oct 07 08:37:31 crc kubenswrapper[5025]: I1007 08:37:31.117987 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a892055-924e-4e4f-a625-b15b8a39c4bf-logs\") pod \"nova-metadata-0\" (UID: \"4a892055-924e-4e4f-a625-b15b8a39c4bf\") " pod="openstack/nova-metadata-0" Oct 07 08:37:31 crc kubenswrapper[5025]: I1007 08:37:31.118063 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a892055-924e-4e4f-a625-b15b8a39c4bf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4a892055-924e-4e4f-a625-b15b8a39c4bf\") " pod="openstack/nova-metadata-0" Oct 07 08:37:31 crc kubenswrapper[5025]: I1007 08:37:31.118184 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99rzk\" (UniqueName: \"kubernetes.io/projected/4a892055-924e-4e4f-a625-b15b8a39c4bf-kube-api-access-99rzk\") pod \"nova-metadata-0\" (UID: \"4a892055-924e-4e4f-a625-b15b8a39c4bf\") " pod="openstack/nova-metadata-0" Oct 07 08:37:31 crc kubenswrapper[5025]: I1007 08:37:31.221761 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a892055-924e-4e4f-a625-b15b8a39c4bf-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4a892055-924e-4e4f-a625-b15b8a39c4bf\") " pod="openstack/nova-metadata-0" Oct 07 08:37:31 crc kubenswrapper[5025]: I1007 08:37:31.221848 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/4a892055-924e-4e4f-a625-b15b8a39c4bf-logs\") pod \"nova-metadata-0\" (UID: \"4a892055-924e-4e4f-a625-b15b8a39c4bf\") " pod="openstack/nova-metadata-0" Oct 07 08:37:31 crc kubenswrapper[5025]: I1007 08:37:31.221918 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a892055-924e-4e4f-a625-b15b8a39c4bf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4a892055-924e-4e4f-a625-b15b8a39c4bf\") " pod="openstack/nova-metadata-0" Oct 07 08:37:31 crc kubenswrapper[5025]: I1007 08:37:31.222016 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99rzk\" (UniqueName: \"kubernetes.io/projected/4a892055-924e-4e4f-a625-b15b8a39c4bf-kube-api-access-99rzk\") pod \"nova-metadata-0\" (UID: \"4a892055-924e-4e4f-a625-b15b8a39c4bf\") " pod="openstack/nova-metadata-0" Oct 07 08:37:31 crc kubenswrapper[5025]: I1007 08:37:31.222078 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a892055-924e-4e4f-a625-b15b8a39c4bf-config-data\") pod \"nova-metadata-0\" (UID: \"4a892055-924e-4e4f-a625-b15b8a39c4bf\") " pod="openstack/nova-metadata-0" Oct 07 08:37:31 crc kubenswrapper[5025]: I1007 08:37:31.222616 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a892055-924e-4e4f-a625-b15b8a39c4bf-logs\") pod \"nova-metadata-0\" (UID: \"4a892055-924e-4e4f-a625-b15b8a39c4bf\") " pod="openstack/nova-metadata-0" Oct 07 08:37:31 crc kubenswrapper[5025]: I1007 08:37:31.227048 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a892055-924e-4e4f-a625-b15b8a39c4bf-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4a892055-924e-4e4f-a625-b15b8a39c4bf\") " pod="openstack/nova-metadata-0" Oct 07 08:37:31 
crc kubenswrapper[5025]: I1007 08:37:31.227151 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a892055-924e-4e4f-a625-b15b8a39c4bf-config-data\") pod \"nova-metadata-0\" (UID: \"4a892055-924e-4e4f-a625-b15b8a39c4bf\") " pod="openstack/nova-metadata-0" Oct 07 08:37:31 crc kubenswrapper[5025]: I1007 08:37:31.231073 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a892055-924e-4e4f-a625-b15b8a39c4bf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4a892055-924e-4e4f-a625-b15b8a39c4bf\") " pod="openstack/nova-metadata-0" Oct 07 08:37:31 crc kubenswrapper[5025]: I1007 08:37:31.238957 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99rzk\" (UniqueName: \"kubernetes.io/projected/4a892055-924e-4e4f-a625-b15b8a39c4bf-kube-api-access-99rzk\") pod \"nova-metadata-0\" (UID: \"4a892055-924e-4e4f-a625-b15b8a39c4bf\") " pod="openstack/nova-metadata-0" Oct 07 08:37:31 crc kubenswrapper[5025]: I1007 08:37:31.327505 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 08:37:31 crc kubenswrapper[5025]: W1007 08:37:31.861026 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a892055_924e_4e4f_a625_b15b8a39c4bf.slice/crio-31a1770d51ee4ed552c5bf90847456f2b0e0c8dd3bcd6e05a7d9136c8e1ef3ff WatchSource:0}: Error finding container 31a1770d51ee4ed552c5bf90847456f2b0e0c8dd3bcd6e05a7d9136c8e1ef3ff: Status 404 returned error can't find the container with id 31a1770d51ee4ed552c5bf90847456f2b0e0c8dd3bcd6e05a7d9136c8e1ef3ff Oct 07 08:37:31 crc kubenswrapper[5025]: I1007 08:37:31.865870 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 08:37:31 crc kubenswrapper[5025]: I1007 08:37:31.927272 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcd471e2-f72d-4cef-af5c-b5624faaae95" path="/var/lib/kubelet/pods/fcd471e2-f72d-4cef-af5c-b5624faaae95/volumes" Oct 07 08:37:31 crc kubenswrapper[5025]: I1007 08:37:31.958904 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4a892055-924e-4e4f-a625-b15b8a39c4bf","Type":"ContainerStarted","Data":"31a1770d51ee4ed552c5bf90847456f2b0e0c8dd3bcd6e05a7d9136c8e1ef3ff"} Oct 07 08:37:32 crc kubenswrapper[5025]: I1007 08:37:32.970523 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4a892055-924e-4e4f-a625-b15b8a39c4bf","Type":"ContainerStarted","Data":"c9071377ba26d8b28dd8da25bf7997e2b2662f1fc5537d0f47a5f29c16759ff5"} Oct 07 08:37:32 crc kubenswrapper[5025]: I1007 08:37:32.970856 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4a892055-924e-4e4f-a625-b15b8a39c4bf","Type":"ContainerStarted","Data":"793b55f252562c9ec4f26d6283e230540751d4e2f67aa0c95732eac37fff4ef0"} Oct 07 08:37:34 crc kubenswrapper[5025]: I1007 08:37:34.310462 5025 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 07 08:37:36 crc kubenswrapper[5025]: I1007 08:37:36.328156 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 07 08:37:36 crc kubenswrapper[5025]: I1007 08:37:36.328825 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 07 08:37:38 crc kubenswrapper[5025]: I1007 08:37:38.565137 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 07 08:37:38 crc kubenswrapper[5025]: I1007 08:37:38.565457 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 07 08:37:39 crc kubenswrapper[5025]: I1007 08:37:39.310273 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 07 08:37:39 crc kubenswrapper[5025]: I1007 08:37:39.338227 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 07 08:37:39 crc kubenswrapper[5025]: I1007 08:37:39.359949 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=9.359934409 podStartE2EDuration="9.359934409s" podCreationTimestamp="2025-10-07 08:37:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 08:37:32.991921191 +0000 UTC m=+1259.801235355" watchObservedRunningTime="2025-10-07 08:37:39.359934409 +0000 UTC m=+1266.169248553" Oct 07 08:37:39 crc kubenswrapper[5025]: I1007 08:37:39.578707 5025 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="72a63b89-4f79-41a7-9e9a-e3c464d015a4" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.203:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 07 
08:37:39 crc kubenswrapper[5025]: I1007 08:37:39.579028 5025 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="72a63b89-4f79-41a7-9e9a-e3c464d015a4" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.203:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 07 08:37:40 crc kubenswrapper[5025]: I1007 08:37:40.082026 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 07 08:37:41 crc kubenswrapper[5025]: I1007 08:37:41.328353 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 07 08:37:41 crc kubenswrapper[5025]: I1007 08:37:41.328759 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 07 08:37:42 crc kubenswrapper[5025]: I1007 08:37:42.340814 5025 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="4a892055-924e-4e4f-a625-b15b8a39c4bf" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.205:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 07 08:37:42 crc kubenswrapper[5025]: I1007 08:37:42.340821 5025 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="4a892055-924e-4e4f-a625-b15b8a39c4bf" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.205:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 08:37:48 crc kubenswrapper[5025]: I1007 08:37:48.571188 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 07 08:37:48 crc kubenswrapper[5025]: I1007 08:37:48.574351 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 07 08:37:48 crc kubenswrapper[5025]: I1007 
08:37:48.574927 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 07 08:37:48 crc kubenswrapper[5025]: I1007 08:37:48.580811 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 07 08:37:49 crc kubenswrapper[5025]: I1007 08:37:49.149926 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 07 08:37:49 crc kubenswrapper[5025]: I1007 08:37:49.157065 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 07 08:37:51 crc kubenswrapper[5025]: I1007 08:37:51.342324 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 07 08:37:51 crc kubenswrapper[5025]: I1007 08:37:51.342921 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 07 08:37:51 crc kubenswrapper[5025]: I1007 08:37:51.352143 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 07 08:37:51 crc kubenswrapper[5025]: I1007 08:37:51.352453 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 07 08:37:53 crc kubenswrapper[5025]: I1007 08:37:53.284016 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 07 08:38:12 crc kubenswrapper[5025]: I1007 08:38:12.303023 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 07 08:38:12 crc kubenswrapper[5025]: I1007 08:38:12.303962 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="d13fcf27-6664-430e-b9ac-81ff65769a0c" containerName="openstackclient" containerID="cri-o://39701f3461ec57f87ded44047ecffc35edf679b1b33f29a9dedea8b73829426d" gracePeriod=2 Oct 07 08:38:12 crc kubenswrapper[5025]: I1007 
08:38:12.324525 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 07 08:38:12 crc kubenswrapper[5025]: I1007 08:38:12.676492 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Oct 07 08:38:12 crc kubenswrapper[5025]: I1007 08:38:12.676773 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="dd481497-9604-41b6-918a-a8ec9fd6ad92" containerName="ovn-northd" containerID="cri-o://ec2f854063009f0fa5535dfd83f2d60df934fb8c26cc1fa50754cfaf068846d7" gracePeriod=30 Oct 07 08:38:12 crc kubenswrapper[5025]: I1007 08:38:12.677191 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="dd481497-9604-41b6-918a-a8ec9fd6ad92" containerName="openstack-network-exporter" containerID="cri-o://abdf43c41da1eadb0bcc52432e2289bb3f13803711e150dbffe1f1e1034c1ede" gracePeriod=30 Oct 07 08:38:12 crc kubenswrapper[5025]: I1007 08:38:12.700664 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinderb478-account-delete-p6rj5"] Oct 07 08:38:12 crc kubenswrapper[5025]: E1007 08:38:12.701211 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d13fcf27-6664-430e-b9ac-81ff65769a0c" containerName="openstackclient" Oct 07 08:38:12 crc kubenswrapper[5025]: I1007 08:38:12.701224 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="d13fcf27-6664-430e-b9ac-81ff65769a0c" containerName="openstackclient" Oct 07 08:38:12 crc kubenswrapper[5025]: I1007 08:38:12.701450 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="d13fcf27-6664-430e-b9ac-81ff65769a0c" containerName="openstackclient" Oct 07 08:38:12 crc kubenswrapper[5025]: I1007 08:38:12.702311 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinderb478-account-delete-p6rj5" Oct 07 08:38:12 crc kubenswrapper[5025]: I1007 08:38:12.715604 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 08:38:12 crc kubenswrapper[5025]: I1007 08:38:12.721628 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinderb478-account-delete-p6rj5"] Oct 07 08:38:12 crc kubenswrapper[5025]: I1007 08:38:12.731879 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance4544-account-delete-qm7lc"] Oct 07 08:38:12 crc kubenswrapper[5025]: I1007 08:38:12.733477 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance4544-account-delete-qm7lc" Oct 07 08:38:12 crc kubenswrapper[5025]: I1007 08:38:12.766982 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance4544-account-delete-qm7lc"] Oct 07 08:38:12 crc kubenswrapper[5025]: I1007 08:38:12.779091 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnpd7\" (UniqueName: \"kubernetes.io/projected/c87571c0-a1b9-4187-a60c-dd66d9fbb301-kube-api-access-tnpd7\") pod \"cinderb478-account-delete-p6rj5\" (UID: \"c87571c0-a1b9-4187-a60c-dd66d9fbb301\") " pod="openstack/cinderb478-account-delete-p6rj5" Oct 07 08:38:12 crc kubenswrapper[5025]: I1007 08:38:12.881680 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsl9p\" (UniqueName: \"kubernetes.io/projected/28643c48-ee4c-4880-8901-9e3231c70b70-kube-api-access-zsl9p\") pod \"glance4544-account-delete-qm7lc\" (UID: \"28643c48-ee4c-4880-8901-9e3231c70b70\") " pod="openstack/glance4544-account-delete-qm7lc" Oct 07 08:38:12 crc kubenswrapper[5025]: I1007 08:38:12.881850 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnpd7\" (UniqueName: 
\"kubernetes.io/projected/c87571c0-a1b9-4187-a60c-dd66d9fbb301-kube-api-access-tnpd7\") pod \"cinderb478-account-delete-p6rj5\" (UID: \"c87571c0-a1b9-4187-a60c-dd66d9fbb301\") " pod="openstack/cinderb478-account-delete-p6rj5" Oct 07 08:38:12 crc kubenswrapper[5025]: E1007 08:38:12.883363 5025 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 07 08:38:12 crc kubenswrapper[5025]: E1007 08:38:12.883421 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-config-data podName:e5a4ae5a-0b64-481f-a54d-2263de0eae8e nodeName:}" failed. No retries permitted until 2025-10-07 08:38:13.383396645 +0000 UTC m=+1300.192710789 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-config-data") pod "rabbitmq-cell1-server-0" (UID: "e5a4ae5a-0b64-481f-a54d-2263de0eae8e") : configmap "rabbitmq-cell1-config-data" not found Oct 07 08:38:12 crc kubenswrapper[5025]: I1007 08:38:12.926059 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnpd7\" (UniqueName: \"kubernetes.io/projected/c87571c0-a1b9-4187-a60c-dd66d9fbb301-kube-api-access-tnpd7\") pod \"cinderb478-account-delete-p6rj5\" (UID: \"c87571c0-a1b9-4187-a60c-dd66d9fbb301\") " pod="openstack/cinderb478-account-delete-p6rj5" Oct 07 08:38:12 crc kubenswrapper[5025]: I1007 08:38:12.931370 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement5528-account-delete-wnxjq"] Oct 07 08:38:12 crc kubenswrapper[5025]: I1007 08:38:12.932678 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement5528-account-delete-wnxjq" Oct 07 08:38:12 crc kubenswrapper[5025]: I1007 08:38:12.965613 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement5528-account-delete-wnxjq"] Oct 07 08:38:12 crc kubenswrapper[5025]: I1007 08:38:12.983878 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsl9p\" (UniqueName: \"kubernetes.io/projected/28643c48-ee4c-4880-8901-9e3231c70b70-kube-api-access-zsl9p\") pod \"glance4544-account-delete-qm7lc\" (UID: \"28643c48-ee4c-4880-8901-9e3231c70b70\") " pod="openstack/glance4544-account-delete-qm7lc" Oct 07 08:38:13 crc kubenswrapper[5025]: I1007 08:38:13.002728 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-2xx6h"] Oct 07 08:38:13 crc kubenswrapper[5025]: I1007 08:38:13.023170 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsl9p\" (UniqueName: \"kubernetes.io/projected/28643c48-ee4c-4880-8901-9e3231c70b70-kube-api-access-zsl9p\") pod \"glance4544-account-delete-qm7lc\" (UID: \"28643c48-ee4c-4880-8901-9e3231c70b70\") " pod="openstack/glance4544-account-delete-qm7lc" Oct 07 08:38:13 crc kubenswrapper[5025]: I1007 08:38:13.063295 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-2xx6h"] Oct 07 08:38:13 crc kubenswrapper[5025]: I1007 08:38:13.080677 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinderb478-account-delete-p6rj5" Oct 07 08:38:13 crc kubenswrapper[5025]: I1007 08:38:13.089754 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bjhn\" (UniqueName: \"kubernetes.io/projected/9740b484-8952-4253-9158-17164236ffcc-kube-api-access-6bjhn\") pod \"placement5528-account-delete-wnxjq\" (UID: \"9740b484-8952-4253-9158-17164236ffcc\") " pod="openstack/placement5528-account-delete-wnxjq" Oct 07 08:38:13 crc kubenswrapper[5025]: I1007 08:38:13.099917 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-lms8w"] Oct 07 08:38:13 crc kubenswrapper[5025]: I1007 08:38:13.107603 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-bxmbm"] Oct 07 08:38:13 crc kubenswrapper[5025]: I1007 08:38:13.134305 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-cb69l"] Oct 07 08:38:13 crc kubenswrapper[5025]: I1007 08:38:13.134527 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-cb69l" podUID="7f76b8e5-9257-4a0f-8067-ac36ccbe6711" containerName="openstack-network-exporter" containerID="cri-o://c6b426dcd9f533a44c9fe8cd3f13790da46148d4c171d8d1980f367d0b388d20" gracePeriod=30 Oct 07 08:38:13 crc kubenswrapper[5025]: I1007 08:38:13.177706 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron0ee4-account-delete-8cznp"] Oct 07 08:38:13 crc kubenswrapper[5025]: I1007 08:38:13.178843 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron0ee4-account-delete-8cznp" Oct 07 08:38:13 crc kubenswrapper[5025]: I1007 08:38:13.201958 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron0ee4-account-delete-8cznp"] Oct 07 08:38:13 crc kubenswrapper[5025]: I1007 08:38:13.202737 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bjhn\" (UniqueName: \"kubernetes.io/projected/9740b484-8952-4253-9158-17164236ffcc-kube-api-access-6bjhn\") pod \"placement5528-account-delete-wnxjq\" (UID: \"9740b484-8952-4253-9158-17164236ffcc\") " pod="openstack/placement5528-account-delete-wnxjq" Oct 07 08:38:13 crc kubenswrapper[5025]: I1007 08:38:13.220572 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 08:38:13 crc kubenswrapper[5025]: I1007 08:38:13.225938 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance4544-account-delete-qm7lc" Oct 07 08:38:13 crc kubenswrapper[5025]: I1007 08:38:13.262027 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bjhn\" (UniqueName: \"kubernetes.io/projected/9740b484-8952-4253-9158-17164236ffcc-kube-api-access-6bjhn\") pod \"placement5528-account-delete-wnxjq\" (UID: \"9740b484-8952-4253-9158-17164236ffcc\") " pod="openstack/placement5528-account-delete-wnxjq" Oct 07 08:38:13 crc kubenswrapper[5025]: I1007 08:38:13.303946 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjh7m\" (UniqueName: \"kubernetes.io/projected/b956a819-1c8f-471c-a738-fe4d78fbbb98-kube-api-access-gjh7m\") pod \"neutron0ee4-account-delete-8cznp\" (UID: \"b956a819-1c8f-471c-a738-fe4d78fbbb98\") " pod="openstack/neutron0ee4-account-delete-8cznp" Oct 07 08:38:13 crc kubenswrapper[5025]: I1007 08:38:13.319465 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement5528-account-delete-wnxjq" Oct 07 08:38:13 crc kubenswrapper[5025]: I1007 08:38:13.411075 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjh7m\" (UniqueName: \"kubernetes.io/projected/b956a819-1c8f-471c-a738-fe4d78fbbb98-kube-api-access-gjh7m\") pod \"neutron0ee4-account-delete-8cznp\" (UID: \"b956a819-1c8f-471c-a738-fe4d78fbbb98\") " pod="openstack/neutron0ee4-account-delete-8cznp" Oct 07 08:38:13 crc kubenswrapper[5025]: E1007 08:38:13.411460 5025 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 07 08:38:13 crc kubenswrapper[5025]: E1007 08:38:13.411501 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d46577dd-b38b-4b80-ad57-577629e648b8-config-data podName:d46577dd-b38b-4b80-ad57-577629e648b8 nodeName:}" failed. No retries permitted until 2025-10-07 08:38:13.911486408 +0000 UTC m=+1300.720800552 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/d46577dd-b38b-4b80-ad57-577629e648b8-config-data") pod "rabbitmq-server-0" (UID: "d46577dd-b38b-4b80-ad57-577629e648b8") : configmap "rabbitmq-config-data" not found Oct 07 08:38:13 crc kubenswrapper[5025]: E1007 08:38:13.412037 5025 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 07 08:38:13 crc kubenswrapper[5025]: E1007 08:38:13.412084 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-config-data podName:e5a4ae5a-0b64-481f-a54d-2263de0eae8e nodeName:}" failed. No retries permitted until 2025-10-07 08:38:14.412067896 +0000 UTC m=+1301.221382040 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-config-data") pod "rabbitmq-cell1-server-0" (UID: "e5a4ae5a-0b64-481f-a54d-2263de0eae8e") : configmap "rabbitmq-cell1-config-data" not found Oct 07 08:38:13 crc kubenswrapper[5025]: I1007 08:38:13.451389 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjh7m\" (UniqueName: \"kubernetes.io/projected/b956a819-1c8f-471c-a738-fe4d78fbbb98-kube-api-access-gjh7m\") pod \"neutron0ee4-account-delete-8cznp\" (UID: \"b956a819-1c8f-471c-a738-fe4d78fbbb98\") " pod="openstack/neutron0ee4-account-delete-8cznp" Oct 07 08:38:13 crc kubenswrapper[5025]: I1007 08:38:13.476222 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-gdh6c"] Oct 07 08:38:13 crc kubenswrapper[5025]: I1007 08:38:13.477200 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-gdh6c"] Oct 07 08:38:13 crc kubenswrapper[5025]: I1007 08:38:13.583595 5025 generic.go:334] "Generic (PLEG): container finished" podID="dd481497-9604-41b6-918a-a8ec9fd6ad92" containerID="abdf43c41da1eadb0bcc52432e2289bb3f13803711e150dbffe1f1e1034c1ede" exitCode=2 Oct 07 08:38:13 crc kubenswrapper[5025]: I1007 08:38:13.583637 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"dd481497-9604-41b6-918a-a8ec9fd6ad92","Type":"ContainerDied","Data":"abdf43c41da1eadb0bcc52432e2289bb3f13803711e150dbffe1f1e1034c1ede"} Oct 07 08:38:13 crc kubenswrapper[5025]: I1007 08:38:13.601756 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-d2nc2"] Oct 07 08:38:13 crc kubenswrapper[5025]: I1007 08:38:13.659027 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron0ee4-account-delete-8cznp" Oct 07 08:38:13 crc kubenswrapper[5025]: I1007 08:38:13.680602 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-d2nc2"] Oct 07 08:38:13 crc kubenswrapper[5025]: I1007 08:38:13.713359 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novacell150ed-account-delete-mqxng"] Oct 07 08:38:13 crc kubenswrapper[5025]: I1007 08:38:13.715039 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell150ed-account-delete-mqxng" Oct 07 08:38:13 crc kubenswrapper[5025]: I1007 08:38:13.785216 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell150ed-account-delete-mqxng"] Oct 07 08:38:13 crc kubenswrapper[5025]: I1007 08:38:13.845826 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzndz\" (UniqueName: \"kubernetes.io/projected/041675fd-b2c1-4e9c-9b05-4a2aef6d329f-kube-api-access-pzndz\") pod \"novacell150ed-account-delete-mqxng\" (UID: \"041675fd-b2c1-4e9c-9b05-4a2aef6d329f\") " pod="openstack/novacell150ed-account-delete-mqxng" Oct 07 08:38:13 crc kubenswrapper[5025]: I1007 08:38:13.891042 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-z4bnz"] Oct 07 08:38:13 crc kubenswrapper[5025]: I1007 08:38:13.950091 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzndz\" (UniqueName: \"kubernetes.io/projected/041675fd-b2c1-4e9c-9b05-4a2aef6d329f-kube-api-access-pzndz\") pod \"novacell150ed-account-delete-mqxng\" (UID: \"041675fd-b2c1-4e9c-9b05-4a2aef6d329f\") " pod="openstack/novacell150ed-account-delete-mqxng" Oct 07 08:38:13 crc kubenswrapper[5025]: E1007 08:38:13.950401 5025 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 07 08:38:13 crc kubenswrapper[5025]: E1007 
08:38:13.950464 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d46577dd-b38b-4b80-ad57-577629e648b8-config-data podName:d46577dd-b38b-4b80-ad57-577629e648b8 nodeName:}" failed. No retries permitted until 2025-10-07 08:38:14.950446354 +0000 UTC m=+1301.759760498 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/d46577dd-b38b-4b80-ad57-577629e648b8-config-data") pod "rabbitmq-server-0" (UID: "d46577dd-b38b-4b80-ad57-577629e648b8") : configmap "rabbitmq-config-data" not found Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.001751 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzndz\" (UniqueName: \"kubernetes.io/projected/041675fd-b2c1-4e9c-9b05-4a2aef6d329f-kube-api-access-pzndz\") pod \"novacell150ed-account-delete-mqxng\" (UID: \"041675fd-b2c1-4e9c-9b05-4a2aef6d329f\") " pod="openstack/novacell150ed-account-delete-mqxng" Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.014958 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06e1a5d0-ce28-4495-937a-1aaccbcbb644" path="/var/lib/kubelet/pods/06e1a5d0-ce28-4495-937a-1aaccbcbb644/volumes" Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.017616 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e980fa1-54dd-4f48-9a25-0b6090709927" path="/var/lib/kubelet/pods/5e980fa1-54dd-4f48-9a25-0b6090709927/volumes" Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.063794 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2c43be6-62b6-4546-af1a-c3e1835da494" path="/var/lib/kubelet/pods/f2c43be6-62b6-4546-af1a-c3e1835da494/volumes" Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.077611 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-jl8nn"] Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.077847 5025 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/neutron-db-sync-z4bnz"] Oct 07 08:38:14 crc kubenswrapper[5025]: E1007 08:38:14.081083 5025 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-bxmbm" message=< Oct 07 08:38:14 crc kubenswrapper[5025]: Exiting ovn-controller (1) [ OK ] Oct 07 08:38:14 crc kubenswrapper[5025]: > Oct 07 08:38:14 crc kubenswrapper[5025]: E1007 08:38:14.081125 5025 kuberuntime_container.go:691] "PreStop hook failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " pod="openstack/ovn-controller-bxmbm" podUID="d6889a88-68ef-4bf3-9e7e-78c6d84785ae" containerName="ovn-controller" containerID="cri-o://628399cf9d75d695aa72a1837a9b4aa8d0c453aef85214b36b537198afb78e37" Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.081160 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-bxmbm" podUID="d6889a88-68ef-4bf3-9e7e-78c6d84785ae" containerName="ovn-controller" containerID="cri-o://628399cf9d75d695aa72a1837a9b4aa8d0c453aef85214b36b537198afb78e37" gracePeriod=30 Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.095892 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-jl8nn"] Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.122377 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell150ed-account-delete-mqxng" Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.129109 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-lmq8k"] Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.129322 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c7b6c5df9-lmq8k" podUID="219d3031-42b0-40df-98fc-46fe548a8f39" containerName="dnsmasq-dns" containerID="cri-o://22e9ac307d801dfcf026320bb8d6652c6f32dae3bd30c0d3a5d41209d16ed0b1" gracePeriod=10 Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.179737 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.179995 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="9fc34125-4736-4467-bf0e-ec3b211e6d13" containerName="cinder-scheduler" containerID="cri-o://37a99151a2993186ee570500792634aa31edd00f179d56fe2993e992303f092f" gracePeriod=30 Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.180407 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="9fc34125-4736-4467-bf0e-ec3b211e6d13" containerName="probe" containerID="cri-o://1e3e97ecbee1eb5fa2d4e0c99b1a8d7a3e6632904a37d6f0cbcf4e9c9f40fc26" gracePeriod=30 Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.243186 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-4mq6r"] Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.259656 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-4mq6r"] Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.273403 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-g9bxn"] Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 
08:38:14.284948 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-g9bxn"] Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.340611 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-czj54"] Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.348808 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.349142 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78" containerName="cinder-api-log" containerID="cri-o://f3589738c71b80a2b66f3651e465e35c090e223c8027cb49f147e9e055629164" gracePeriod=30 Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.349258 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78" containerName="cinder-api" containerID="cri-o://50b0aff959782d5056be05dce7539fc1d1b6491f125fcc23557b5dc2b17bcb0a" gracePeriod=30 Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.384115 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-czj54"] Oct 07 08:38:14 crc kubenswrapper[5025]: E1007 08:38:14.399890 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 628399cf9d75d695aa72a1837a9b4aa8d0c453aef85214b36b537198afb78e37 is running failed: container process not found" containerID="628399cf9d75d695aa72a1837a9b4aa8d0c453aef85214b36b537198afb78e37" cmd=["/usr/local/bin/container-scripts/ovn_controller_readiness.sh"] Oct 07 08:38:14 crc kubenswrapper[5025]: E1007 08:38:14.402286 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
628399cf9d75d695aa72a1837a9b4aa8d0c453aef85214b36b537198afb78e37 is running failed: container process not found" containerID="628399cf9d75d695aa72a1837a9b4aa8d0c453aef85214b36b537198afb78e37" cmd=["/usr/local/bin/container-scripts/ovn_controller_readiness.sh"] Oct 07 08:38:14 crc kubenswrapper[5025]: E1007 08:38:14.404897 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 628399cf9d75d695aa72a1837a9b4aa8d0c453aef85214b36b537198afb78e37 is running failed: container process not found" containerID="628399cf9d75d695aa72a1837a9b4aa8d0c453aef85214b36b537198afb78e37" cmd=["/usr/local/bin/container-scripts/ovn_controller_readiness.sh"] Oct 07 08:38:14 crc kubenswrapper[5025]: E1007 08:38:14.404948 5025 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 628399cf9d75d695aa72a1837a9b4aa8d0c453aef85214b36b537198afb78e37 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-bxmbm" podUID="d6889a88-68ef-4bf3-9e7e-78c6d84785ae" containerName="ovn-controller" Oct 07 08:38:14 crc kubenswrapper[5025]: E1007 08:38:14.427802 5025 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 07 08:38:14 crc kubenswrapper[5025]: E1007 08:38:14.427883 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-config-data podName:e5a4ae5a-0b64-481f-a54d-2263de0eae8e nodeName:}" failed. No retries permitted until 2025-10-07 08:38:16.427861879 +0000 UTC m=+1303.237176023 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-config-data") pod "rabbitmq-cell1-server-0" (UID: "e5a4ae5a-0b64-481f-a54d-2263de0eae8e") : configmap "rabbitmq-cell1-config-data" not found Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.471275 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.473348 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="8e01de5b-7e9d-4f70-91d9-d55165a8eb32" containerName="openstack-network-exporter" containerID="cri-o://ed4525e997650f350f16b568eb27e0ffc3ffcbd1f1e7cb94fc621cd10eb9a382" gracePeriod=300 Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.557832 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance4544-account-delete-qm7lc"] Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.582490 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="8e01de5b-7e9d-4f70-91d9-d55165a8eb32" containerName="ovsdbserver-nb" containerID="cri-o://ec3a511c8f64d316ae9313f73bf8241c98c7dfe67b2e3fe3b4a4adad87d90535" gracePeriod=300 Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.593659 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.594226 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="e8ac1818-6553-400f-91be-19b032aae626" containerName="openstack-network-exporter" containerID="cri-o://e09b94886fdcccfceea04d4af844384d8eb419aec0bda3ba596e47528bb18abd" gracePeriod=300 Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.602379 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 
08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.602606 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="988f0f87-7a01-4aac-a874-8e885b2f83cb" containerName="glance-log" containerID="cri-o://6679ef0ed68a662d16557ba47e79ca743311dae66548a6f30fab1c7702863417" gracePeriod=30 Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.602983 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="988f0f87-7a01-4aac-a874-8e885b2f83cb" containerName="glance-httpd" containerID="cri-o://4055606214f6b158a47c0d68c52aea2b1ad07643ccaa8ad47dd496cc1fca1a1e" gracePeriod=30 Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.616022 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.620156 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="39a4256e-db59-4623-aefb-bad0c1412bf1" containerName="glance-log" containerID="cri-o://577b784734dd3e48c70fc10ff3124e3ae64cd95fd597512882e35ecf5b2ba94f" gracePeriod=30 Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.620840 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="39a4256e-db59-4623-aefb-bad0c1412bf1" containerName="glance-httpd" containerID="cri-o://40bc944b02028fd2ab780bbc8f7785ef5771a301ce586feafccde56e9109c707" gracePeriod=30 Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.652655 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5899768569-tz2vj"] Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.652989 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5899768569-tz2vj" podUID="fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0" 
containerName="neutron-api" containerID="cri-o://d0cadcb14b122e4544f81e647b5313ffc159d240c42a8c28574e6aa18ad57dbf" gracePeriod=30 Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.653584 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5899768569-tz2vj" podUID="fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0" containerName="neutron-httpd" containerID="cri-o://76af13f38bf57b12ea10ac6d1ec8e591e15b727a96e03aeaffeda063e26481de" gracePeriod=30 Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.671089 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance4544-account-delete-qm7lc" event={"ID":"28643c48-ee4c-4880-8901-9e3231c70b70","Type":"ContainerStarted","Data":"2ec633ad29ddbd1247f9ed3f3272e3565e6ac3ce0a7cc1b0057d1aa17070054b"} Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.671400 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-cb69l_7f76b8e5-9257-4a0f-8067-ac36ccbe6711/openstack-network-exporter/0.log" Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.671474 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-cb69l" Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.682731 5025 generic.go:334] "Generic (PLEG): container finished" podID="d13fcf27-6664-430e-b9ac-81ff65769a0c" containerID="39701f3461ec57f87ded44047ecffc35edf679b1b33f29a9dedea8b73829426d" exitCode=137 Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.701913 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-684bd87b6d-w58z5"] Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.702207 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-684bd87b6d-w58z5" podUID="f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8" containerName="placement-log" containerID="cri-o://9226a61a229bb35a76b9b1993c0b6dacb2bab79faa138dc6bf20e3d356c1fca2" gracePeriod=30 Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.702826 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-684bd87b6d-w58z5" podUID="f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8" containerName="placement-api" containerID="cri-o://91ba013dc33b4d2f92059c4d4c66f039c2cb58683016a2882a32f59cc78bb607" gracePeriod=30 Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.713468 5025 generic.go:334] "Generic (PLEG): container finished" podID="219d3031-42b0-40df-98fc-46fe548a8f39" containerID="22e9ac307d801dfcf026320bb8d6652c6f32dae3bd30c0d3a5d41209d16ed0b1" exitCode=0 Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.713578 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-lmq8k" event={"ID":"219d3031-42b0-40df-98fc-46fe548a8f39","Type":"ContainerDied","Data":"22e9ac307d801dfcf026320bb8d6652c6f32dae3bd30c0d3a5d41209d16ed0b1"} Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.732256 5025 generic.go:334] "Generic (PLEG): container finished" podID="fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78" 
containerID="f3589738c71b80a2b66f3651e465e35c090e223c8027cb49f147e9e055629164" exitCode=143 Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.732435 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.732501 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78","Type":"ContainerDied","Data":"f3589738c71b80a2b66f3651e465e35c090e223c8027cb49f147e9e055629164"} Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.733141 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="account-server" containerID="cri-o://4fd99f99ec9c844da88a9f393a6696daa800bfa88f16e03ae533ceadefd9d877" gracePeriod=30 Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.738069 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="swift-recon-cron" containerID="cri-o://2af861f7f90c2cae8a8ce332baaee8d44d3ff01b9e44974ad1dfe79434f2684b" gracePeriod=30 Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.738371 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="rsync" containerID="cri-o://9d3a9df9333411c34fcd1b1b33c246c5dc54f969d8f7b94b07b7feea250b14fb" gracePeriod=30 Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.738439 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="object-expirer" containerID="cri-o://a89cc0bbd617b24a619834b36ce895c374cd0e5df0288f882c9728d9252cd609" gracePeriod=30 Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.739203 5025 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="object-updater" containerID="cri-o://17e6f1244bd23756c57aa4df7196524e152e5a068f7ec5464a9d47135f222dc7" gracePeriod=30 Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.739314 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="object-auditor" containerID="cri-o://99ce4f2eebf24b560f31691c8f1a2da1a5df6b8c8ed3d6626051ed204aea839a" gracePeriod=30 Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.739528 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="object-replicator" containerID="cri-o://aff21227028210b4f08d1a87f2fe84e1d162230abc5f64cc544641bfc6120b15" gracePeriod=30 Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.740450 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="object-server" containerID="cri-o://5f05c90b85b312fbeb24954f3c57716c7cd5932e4f7ceedff1daaffd861ae631" gracePeriod=30 Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.740528 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="container-updater" containerID="cri-o://76ba63d82ad08b0153c89278b305db830f608f9bed7c534bb409b72c53c3a06c" gracePeriod=30 Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.741061 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="container-auditor" 
containerID="cri-o://266ae7663ff2f3c8ca4fdece1db841dc278426b4fca4509d0d0233a384d325dd" gracePeriod=30 Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.741357 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="container-replicator" containerID="cri-o://49443cb71b40bc6abe49e9f54690d77bd06b9d3970bc17a852e8a4e30f9ba6b7" gracePeriod=30 Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.741428 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="container-server" containerID="cri-o://c93808ebcf734eeb951e224e6dd45872e982d63f46ec041bfc4d33193be2b705" gracePeriod=30 Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.741525 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="account-reaper" containerID="cri-o://15d39914545a4a069a9ea2c3664be41c34cbfd52c1e69df82087b4f1a1940cdd" gracePeriod=30 Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.741622 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="account-auditor" containerID="cri-o://8c0c209e0771ee6bccd63b38c1d1aaf2d92866662d731f3cc5eccf0667f3306a" gracePeriod=30 Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.742418 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="account-replicator" containerID="cri-o://9640e93cbab8bc8d914777065ae812b65e462076e84f96dabb2f52c7a65f973a" gracePeriod=30 Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.742705 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f76b8e5-9257-4a0f-8067-ac36ccbe6711-config\") pod \"7f76b8e5-9257-4a0f-8067-ac36ccbe6711\" (UID: \"7f76b8e5-9257-4a0f-8067-ac36ccbe6711\") " Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.742843 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/7f76b8e5-9257-4a0f-8067-ac36ccbe6711-ovn-rundir\") pod \"7f76b8e5-9257-4a0f-8067-ac36ccbe6711\" (UID: \"7f76b8e5-9257-4a0f-8067-ac36ccbe6711\") " Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.742884 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/7f76b8e5-9257-4a0f-8067-ac36ccbe6711-ovs-rundir\") pod \"7f76b8e5-9257-4a0f-8067-ac36ccbe6711\" (UID: \"7f76b8e5-9257-4a0f-8067-ac36ccbe6711\") " Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.742947 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f76b8e5-9257-4a0f-8067-ac36ccbe6711-combined-ca-bundle\") pod \"7f76b8e5-9257-4a0f-8067-ac36ccbe6711\" (UID: \"7f76b8e5-9257-4a0f-8067-ac36ccbe6711\") " Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.743055 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zh8ww\" (UniqueName: \"kubernetes.io/projected/7f76b8e5-9257-4a0f-8067-ac36ccbe6711-kube-api-access-zh8ww\") pod \"7f76b8e5-9257-4a0f-8067-ac36ccbe6711\" (UID: \"7f76b8e5-9257-4a0f-8067-ac36ccbe6711\") " Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.743089 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f76b8e5-9257-4a0f-8067-ac36ccbe6711-metrics-certs-tls-certs\") pod \"7f76b8e5-9257-4a0f-8067-ac36ccbe6711\" (UID: \"7f76b8e5-9257-4a0f-8067-ac36ccbe6711\") " Oct 07 
08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.743700 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7f76b8e5-9257-4a0f-8067-ac36ccbe6711-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "7f76b8e5-9257-4a0f-8067-ac36ccbe6711" (UID: "7f76b8e5-9257-4a0f-8067-ac36ccbe6711"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.744764 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-lms8w" podUID="76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b" containerName="ovs-vswitchd" containerID="cri-o://0929ddc4d9ba4610db985a4406a4161fc27bb318c68977125005e9b777d1e0ac" gracePeriod=29 Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.745758 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7f76b8e5-9257-4a0f-8067-ac36ccbe6711-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "7f76b8e5-9257-4a0f-8067-ac36ccbe6711" (UID: "7f76b8e5-9257-4a0f-8067-ac36ccbe6711"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.747048 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f76b8e5-9257-4a0f-8067-ac36ccbe6711-config" (OuterVolumeSpecName: "config") pod "7f76b8e5-9257-4a0f-8067-ac36ccbe6711" (UID: "7f76b8e5-9257-4a0f-8067-ac36ccbe6711"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.758483 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-cb69l_7f76b8e5-9257-4a0f-8067-ac36ccbe6711/openstack-network-exporter/0.log" Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.758568 5025 generic.go:334] "Generic (PLEG): container finished" podID="7f76b8e5-9257-4a0f-8067-ac36ccbe6711" containerID="c6b426dcd9f533a44c9fe8cd3f13790da46148d4c171d8d1980f367d0b388d20" exitCode=2 Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.758633 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-cb69l" event={"ID":"7f76b8e5-9257-4a0f-8067-ac36ccbe6711","Type":"ContainerDied","Data":"c6b426dcd9f533a44c9fe8cd3f13790da46148d4c171d8d1980f367d0b388d20"} Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.758671 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-cb69l" event={"ID":"7f76b8e5-9257-4a0f-8067-ac36ccbe6711","Type":"ContainerDied","Data":"6d30afa32207026021934fd9f8733ffe56b970bdddee85e44462c5208c9fbeed"} Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.758692 5025 scope.go:117] "RemoveContainer" containerID="c6b426dcd9f533a44c9fe8cd3f13790da46148d4c171d8d1980f367d0b388d20" Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.758845 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-cb69l" Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.784310 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f76b8e5-9257-4a0f-8067-ac36ccbe6711-kube-api-access-zh8ww" (OuterVolumeSpecName: "kube-api-access-zh8ww") pod "7f76b8e5-9257-4a0f-8067-ac36ccbe6711" (UID: "7f76b8e5-9257-4a0f-8067-ac36ccbe6711"). InnerVolumeSpecName "kube-api-access-zh8ww". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.784920 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="e8ac1818-6553-400f-91be-19b032aae626" containerName="ovsdbserver-sb" containerID="cri-o://bc8d2241ccc0693c6d61ae5731ed3c9ec865ccbf9da2480484d4167078e69af8" gracePeriod=300 Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.799328 5025 generic.go:334] "Generic (PLEG): container finished" podID="d6889a88-68ef-4bf3-9e7e-78c6d84785ae" containerID="628399cf9d75d695aa72a1837a9b4aa8d0c453aef85214b36b537198afb78e37" exitCode=0 Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.799396 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-bxmbm" event={"ID":"d6889a88-68ef-4bf3-9e7e-78c6d84785ae","Type":"ContainerDied","Data":"628399cf9d75d695aa72a1837a9b4aa8d0c453aef85214b36b537198afb78e37"} Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.807296 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.835015 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-9cc1-account-create-kh69v"] Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.849104 5025 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f76b8e5-9257-4a0f-8067-ac36ccbe6711-config\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.849127 5025 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/7f76b8e5-9257-4a0f-8067-ac36ccbe6711-ovn-rundir\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.849143 5025 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: 
\"kubernetes.io/host-path/7f76b8e5-9257-4a0f-8067-ac36ccbe6711-ovs-rundir\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.849157 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zh8ww\" (UniqueName: \"kubernetes.io/projected/7f76b8e5-9257-4a0f-8067-ac36ccbe6711-kube-api-access-zh8ww\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.850650 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-9cc1-account-create-kh69v"] Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.868072 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-lcwqv"] Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.882887 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-cbqnb"] Oct 07 08:38:14 crc kubenswrapper[5025]: E1007 08:38:14.900900 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bc8d2241ccc0693c6d61ae5731ed3c9ec865ccbf9da2480484d4167078e69af8" cmd=["/usr/bin/pidof","ovsdb-server"] Oct 07 08:38:14 crc kubenswrapper[5025]: E1007 08:38:14.901775 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of bc8d2241ccc0693c6d61ae5731ed3c9ec865ccbf9da2480484d4167078e69af8 is running failed: container process not found" containerID="bc8d2241ccc0693c6d61ae5731ed3c9ec865ccbf9da2480484d4167078e69af8" cmd=["/usr/bin/pidof","ovsdb-server"] Oct 07 08:38:14 crc kubenswrapper[5025]: E1007 08:38:14.902308 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of bc8d2241ccc0693c6d61ae5731ed3c9ec865ccbf9da2480484d4167078e69af8 is running 
failed: container process not found" containerID="bc8d2241ccc0693c6d61ae5731ed3c9ec865ccbf9da2480484d4167078e69af8" cmd=["/usr/bin/pidof","ovsdb-server"] Oct 07 08:38:14 crc kubenswrapper[5025]: E1007 08:38:14.902334 5025 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of bc8d2241ccc0693c6d61ae5731ed3c9ec865ccbf9da2480484d4167078e69af8 is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-sb-0" podUID="e8ac1818-6553-400f-91be-19b032aae626" containerName="ovsdbserver-sb" Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.904781 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance4544-account-delete-qm7lc"] Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.927406 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-cbqnb"] Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.940501 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-lcwqv"] Oct 07 08:38:14 crc kubenswrapper[5025]: E1007 08:38:14.952212 5025 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 07 08:38:14 crc kubenswrapper[5025]: E1007 08:38:14.952272 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d46577dd-b38b-4b80-ad57-577629e648b8-config-data podName:d46577dd-b38b-4b80-ad57-577629e648b8 nodeName:}" failed. No retries permitted until 2025-10-07 08:38:16.952254486 +0000 UTC m=+1303.761568630 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/d46577dd-b38b-4b80-ad57-577629e648b8-config-data") pod "rabbitmq-server-0" (UID: "d46577dd-b38b-4b80-ad57-577629e648b8") : configmap "rabbitmq-config-data" not found Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.953809 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-4544-account-create-5pwmj"] Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.962734 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-4544-account-create-5pwmj"] Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.980014 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.980249 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4a892055-924e-4e4f-a625-b15b8a39c4bf" containerName="nova-metadata-log" containerID="cri-o://793b55f252562c9ec4f26d6283e230540751d4e2f67aa0c95732eac37fff4ef0" gracePeriod=30 Oct 07 08:38:14 crc kubenswrapper[5025]: I1007 08:38:14.980829 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4a892055-924e-4e4f-a625-b15b8a39c4bf" containerName="nova-metadata-metadata" containerID="cri-o://c9071377ba26d8b28dd8da25bf7997e2b2662f1fc5537d0f47a5f29c16759ff5" gracePeriod=30 Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.003190 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f76b8e5-9257-4a0f-8067-ac36ccbe6711-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f76b8e5-9257-4a0f-8067-ac36ccbe6711" (UID: "7f76b8e5-9257-4a0f-8067-ac36ccbe6711"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.006954 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-znsxf"] Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.017640 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-znsxf"] Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.024461 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5528-account-create-nswld"] Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.035881 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-5528-account-create-nswld"] Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.049563 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement5528-account-delete-wnxjq"] Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.058554 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="e5a4ae5a-0b64-481f-a54d-2263de0eae8e" containerName="rabbitmq" containerID="cri-o://36236520dcf8fa30b3e1930eff40a80e74047bab4d5ce7c7754f7977352da9ef" gracePeriod=604800 Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.096094 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f76b8e5-9257-4a0f-8067-ac36ccbe6711-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:15 crc kubenswrapper[5025]: E1007 08:38:15.096157 5025 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Oct 07 08:38:15 crc kubenswrapper[5025]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Oct 07 08:38:15 crc kubenswrapper[5025]: + source /usr/local/bin/container-scripts/functions Oct 07 08:38:15 crc kubenswrapper[5025]: ++ OVNBridge=br-int Oct 
07 08:38:15 crc kubenswrapper[5025]: ++ OVNRemote=tcp:localhost:6642 Oct 07 08:38:15 crc kubenswrapper[5025]: ++ OVNEncapType=geneve Oct 07 08:38:15 crc kubenswrapper[5025]: ++ OVNAvailabilityZones= Oct 07 08:38:15 crc kubenswrapper[5025]: ++ EnableChassisAsGateway=true Oct 07 08:38:15 crc kubenswrapper[5025]: ++ PhysicalNetworks= Oct 07 08:38:15 crc kubenswrapper[5025]: ++ OVNHostName= Oct 07 08:38:15 crc kubenswrapper[5025]: ++ DB_FILE=/etc/openvswitch/conf.db Oct 07 08:38:15 crc kubenswrapper[5025]: ++ ovs_dir=/var/lib/openvswitch Oct 07 08:38:15 crc kubenswrapper[5025]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Oct 07 08:38:15 crc kubenswrapper[5025]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Oct 07 08:38:15 crc kubenswrapper[5025]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 07 08:38:15 crc kubenswrapper[5025]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 07 08:38:15 crc kubenswrapper[5025]: + sleep 0.5 Oct 07 08:38:15 crc kubenswrapper[5025]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 07 08:38:15 crc kubenswrapper[5025]: + sleep 0.5 Oct 07 08:38:15 crc kubenswrapper[5025]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 07 08:38:15 crc kubenswrapper[5025]: + sleep 0.5 Oct 07 08:38:15 crc kubenswrapper[5025]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 07 08:38:15 crc kubenswrapper[5025]: + cleanup_ovsdb_server_semaphore Oct 07 08:38:15 crc kubenswrapper[5025]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 07 08:38:15 crc kubenswrapper[5025]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Oct 07 08:38:15 crc kubenswrapper[5025]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-lms8w" message=< Oct 07 08:38:15 crc kubenswrapper[5025]: Exiting ovsdb-server (5) [ OK ] Oct 07 08:38:15 crc kubenswrapper[5025]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Oct 07 08:38:15 crc kubenswrapper[5025]: + source /usr/local/bin/container-scripts/functions Oct 07 08:38:15 crc kubenswrapper[5025]: ++ OVNBridge=br-int Oct 07 08:38:15 crc kubenswrapper[5025]: ++ OVNRemote=tcp:localhost:6642 Oct 07 08:38:15 crc kubenswrapper[5025]: ++ OVNEncapType=geneve Oct 07 08:38:15 crc kubenswrapper[5025]: ++ OVNAvailabilityZones= Oct 07 08:38:15 crc kubenswrapper[5025]: ++ EnableChassisAsGateway=true Oct 07 08:38:15 crc kubenswrapper[5025]: ++ PhysicalNetworks= Oct 07 08:38:15 crc kubenswrapper[5025]: ++ OVNHostName= Oct 07 08:38:15 crc kubenswrapper[5025]: ++ DB_FILE=/etc/openvswitch/conf.db Oct 07 08:38:15 crc kubenswrapper[5025]: ++ ovs_dir=/var/lib/openvswitch Oct 07 08:38:15 crc kubenswrapper[5025]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Oct 07 08:38:15 crc kubenswrapper[5025]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Oct 07 08:38:15 crc kubenswrapper[5025]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 07 08:38:15 crc kubenswrapper[5025]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 07 08:38:15 crc kubenswrapper[5025]: + sleep 0.5 Oct 07 08:38:15 crc kubenswrapper[5025]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 07 08:38:15 crc kubenswrapper[5025]: + sleep 0.5 Oct 07 08:38:15 crc kubenswrapper[5025]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 07 08:38:15 crc kubenswrapper[5025]: + sleep 0.5 Oct 07 08:38:15 crc kubenswrapper[5025]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 07 08:38:15 crc kubenswrapper[5025]: + cleanup_ovsdb_server_semaphore Oct 07 08:38:15 crc kubenswrapper[5025]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 07 08:38:15 crc kubenswrapper[5025]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Oct 07 08:38:15 crc kubenswrapper[5025]: > Oct 07 08:38:15 crc kubenswrapper[5025]: E1007 08:38:15.106281 5025 kuberuntime_container.go:691] "PreStop hook failed" err=< Oct 07 08:38:15 crc kubenswrapper[5025]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Oct 07 08:38:15 crc kubenswrapper[5025]: + source /usr/local/bin/container-scripts/functions Oct 07 08:38:15 crc kubenswrapper[5025]: ++ OVNBridge=br-int Oct 07 08:38:15 crc kubenswrapper[5025]: ++ OVNRemote=tcp:localhost:6642 Oct 07 08:38:15 crc kubenswrapper[5025]: ++ OVNEncapType=geneve Oct 07 08:38:15 crc kubenswrapper[5025]: ++ OVNAvailabilityZones= Oct 07 08:38:15 crc kubenswrapper[5025]: ++ EnableChassisAsGateway=true Oct 07 08:38:15 crc kubenswrapper[5025]: ++ PhysicalNetworks= Oct 07 08:38:15 crc kubenswrapper[5025]: ++ OVNHostName= Oct 07 08:38:15 crc kubenswrapper[5025]: ++ DB_FILE=/etc/openvswitch/conf.db Oct 07 08:38:15 crc kubenswrapper[5025]: ++ ovs_dir=/var/lib/openvswitch Oct 07 08:38:15 crc kubenswrapper[5025]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Oct 07 08:38:15 crc kubenswrapper[5025]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Oct 07 08:38:15 crc kubenswrapper[5025]: ++ 
SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 07 08:38:15 crc kubenswrapper[5025]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 07 08:38:15 crc kubenswrapper[5025]: + sleep 0.5 Oct 07 08:38:15 crc kubenswrapper[5025]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 07 08:38:15 crc kubenswrapper[5025]: + sleep 0.5 Oct 07 08:38:15 crc kubenswrapper[5025]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 07 08:38:15 crc kubenswrapper[5025]: + sleep 0.5 Oct 07 08:38:15 crc kubenswrapper[5025]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 07 08:38:15 crc kubenswrapper[5025]: + cleanup_ovsdb_server_semaphore Oct 07 08:38:15 crc kubenswrapper[5025]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 07 08:38:15 crc kubenswrapper[5025]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Oct 07 08:38:15 crc kubenswrapper[5025]: > pod="openstack/ovn-controller-ovs-lms8w" podUID="76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b" containerName="ovsdb-server" containerID="cri-o://b49c2c24aaa3c06dfaa088d160cbeffb506d46373bed58aa7224f155e0405e49" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.107578 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-lms8w" podUID="76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b" containerName="ovsdb-server" containerID="cri-o://b49c2c24aaa3c06dfaa088d160cbeffb506d46373bed58aa7224f155e0405e49" gracePeriod=28 Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.126835 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-pmhbz"] Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.143821 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-pmhbz"] Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.160804 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/swift-proxy-6c4bb756cf-6b9fc"] Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.161016 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-6c4bb756cf-6b9fc" podUID="5a34d85b-3d52-4c22-bb89-ca22d9f45eeb" containerName="proxy-httpd" containerID="cri-o://e424505ab43fcd0b5722f275f78973479bdce125b1c35d3c687a07c264ff792e" gracePeriod=30 Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.161435 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-6c4bb756cf-6b9fc" podUID="5a34d85b-3d52-4c22-bb89-ca22d9f45eeb" containerName="proxy-server" containerID="cri-o://fea511675388812860a58e922460e16718d212bb001ceaba429f1ddc5d7e5559" gracePeriod=30 Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.176982 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-0ee4-account-create-wtml5"] Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.177284 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f76b8e5-9257-4a0f-8067-ac36ccbe6711-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "7f76b8e5-9257-4a0f-8067-ac36ccbe6711" (UID: "7f76b8e5-9257-4a0f-8067-ac36ccbe6711"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.185965 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-0ee4-account-create-wtml5"] Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.197041 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron0ee4-account-delete-8cznp"] Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.204634 5025 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f76b8e5-9257-4a0f-8067-ac36ccbe6711-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.206058 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.223728 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-dbbfh"] Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.233814 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-bxmbm" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.240945 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-dbbfh"] Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.256065 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.256331 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="72a63b89-4f79-41a7-9e9a-e3c464d015a4" containerName="nova-api-log" containerID="cri-o://231456d1858e1cef2aee2ee0a6203c2580d725cb74b9ea172142e35834648263" gracePeriod=30 Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.257768 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="72a63b89-4f79-41a7-9e9a-e3c464d015a4" containerName="nova-api-api" containerID="cri-o://049eca553f3f87ee8bd4b31b64602c569fc7bd0383ab4e766e48144b4bd7fa38" gracePeriod=30 Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.271236 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-696a-account-create-ht4mf"] Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.301840 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-696a-account-create-ht4mf"] Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.305993 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d6889a88-68ef-4bf3-9e7e-78c6d84785ae-var-run-ovn\") pod \"d6889a88-68ef-4bf3-9e7e-78c6d84785ae\" (UID: \"d6889a88-68ef-4bf3-9e7e-78c6d84785ae\") " Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.306133 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knrhf\" (UniqueName: 
\"kubernetes.io/projected/d6889a88-68ef-4bf3-9e7e-78c6d84785ae-kube-api-access-knrhf\") pod \"d6889a88-68ef-4bf3-9e7e-78c6d84785ae\" (UID: \"d6889a88-68ef-4bf3-9e7e-78c6d84785ae\") " Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.306206 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d6889a88-68ef-4bf3-9e7e-78c6d84785ae-var-run\") pod \"d6889a88-68ef-4bf3-9e7e-78c6d84785ae\" (UID: \"d6889a88-68ef-4bf3-9e7e-78c6d84785ae\") " Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.306228 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6889a88-68ef-4bf3-9e7e-78c6d84785ae-ovn-controller-tls-certs\") pod \"d6889a88-68ef-4bf3-9e7e-78c6d84785ae\" (UID: \"d6889a88-68ef-4bf3-9e7e-78c6d84785ae\") " Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.306303 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d6889a88-68ef-4bf3-9e7e-78c6d84785ae-var-log-ovn\") pod \"d6889a88-68ef-4bf3-9e7e-78c6d84785ae\" (UID: \"d6889a88-68ef-4bf3-9e7e-78c6d84785ae\") " Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.306392 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6889a88-68ef-4bf3-9e7e-78c6d84785ae-scripts\") pod \"d6889a88-68ef-4bf3-9e7e-78c6d84785ae\" (UID: \"d6889a88-68ef-4bf3-9e7e-78c6d84785ae\") " Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.306454 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6889a88-68ef-4bf3-9e7e-78c6d84785ae-combined-ca-bundle\") pod \"d6889a88-68ef-4bf3-9e7e-78c6d84785ae\" (UID: \"d6889a88-68ef-4bf3-9e7e-78c6d84785ae\") " Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 
08:38:15.307338 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d6889a88-68ef-4bf3-9e7e-78c6d84785ae-var-run" (OuterVolumeSpecName: "var-run") pod "d6889a88-68ef-4bf3-9e7e-78c6d84785ae" (UID: "d6889a88-68ef-4bf3-9e7e-78c6d84785ae"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.307380 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d6889a88-68ef-4bf3-9e7e-78c6d84785ae-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "d6889a88-68ef-4bf3-9e7e-78c6d84785ae" (UID: "d6889a88-68ef-4bf3-9e7e-78c6d84785ae"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.307424 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d6889a88-68ef-4bf3-9e7e-78c6d84785ae-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "d6889a88-68ef-4bf3-9e7e-78c6d84785ae" (UID: "d6889a88-68ef-4bf3-9e7e-78c6d84785ae"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.309395 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6889a88-68ef-4bf3-9e7e-78c6d84785ae-scripts" (OuterVolumeSpecName: "scripts") pod "d6889a88-68ef-4bf3-9e7e-78c6d84785ae" (UID: "d6889a88-68ef-4bf3-9e7e-78c6d84785ae"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.321398 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6889a88-68ef-4bf3-9e7e-78c6d84785ae-kube-api-access-knrhf" (OuterVolumeSpecName: "kube-api-access-knrhf") pod "d6889a88-68ef-4bf3-9e7e-78c6d84785ae" (UID: "d6889a88-68ef-4bf3-9e7e-78c6d84785ae"). InnerVolumeSpecName "kube-api-access-knrhf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.352335 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6889a88-68ef-4bf3-9e7e-78c6d84785ae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d6889a88-68ef-4bf3-9e7e-78c6d84785ae" (UID: "d6889a88-68ef-4bf3-9e7e-78c6d84785ae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.359960 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-w5ggj"] Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.386252 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-w5ggj"] Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.409102 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knrhf\" (UniqueName: \"kubernetes.io/projected/d6889a88-68ef-4bf3-9e7e-78c6d84785ae-kube-api-access-knrhf\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.409137 5025 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d6889a88-68ef-4bf3-9e7e-78c6d84785ae-var-run\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.409150 5025 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/d6889a88-68ef-4bf3-9e7e-78c6d84785ae-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.409162 5025 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6889a88-68ef-4bf3-9e7e-78c6d84785ae-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.409173 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6889a88-68ef-4bf3-9e7e-78c6d84785ae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.409185 5025 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d6889a88-68ef-4bf3-9e7e-78c6d84785ae-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.414138 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.422960 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-7d54-account-create-5kmps"] Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.430364 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-7d54-account-create-5kmps"] Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.433737 5025 scope.go:117] "RemoveContainer" containerID="c6b426dcd9f533a44c9fe8cd3f13790da46148d4c171d8d1980f367d0b388d20" Oct 07 08:38:15 crc kubenswrapper[5025]: E1007 08:38:15.434691 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6b426dcd9f533a44c9fe8cd3f13790da46148d4c171d8d1980f367d0b388d20\": container with ID starting with c6b426dcd9f533a44c9fe8cd3f13790da46148d4c171d8d1980f367d0b388d20 not found: ID does not exist" containerID="c6b426dcd9f533a44c9fe8cd3f13790da46148d4c171d8d1980f367d0b388d20" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.434720 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6b426dcd9f533a44c9fe8cd3f13790da46148d4c171d8d1980f367d0b388d20"} err="failed to get container status \"c6b426dcd9f533a44c9fe8cd3f13790da46148d4c171d8d1980f367d0b388d20\": rpc error: code = NotFound desc = could not find container \"c6b426dcd9f533a44c9fe8cd3f13790da46148d4c171d8d1980f367d0b388d20\": container with ID starting with c6b426dcd9f533a44c9fe8cd3f13790da46148d4c171d8d1980f367d0b388d20 not found: ID does not exist" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.436470 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-pc6np"] Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.437157 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-lmq8k" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.443561 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-pc6np"] Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.450601 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinderb478-account-delete-p6rj5"] Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.459186 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7c8774d5b7-qj8g5"] Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.459433 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-7c8774d5b7-qj8g5" podUID="6a8356fc-f4bd-4853-a3f3-0d44ab20612b" containerName="barbican-worker-log" containerID="cri-o://878aa4d7a336162ac22f34d19599bf58c73742f602e586a96453e05637fc3934" gracePeriod=30 Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.460734 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-7c8774d5b7-qj8g5" podUID="6a8356fc-f4bd-4853-a3f3-0d44ab20612b" containerName="barbican-worker" containerID="cri-o://1d466d120650f5aa29db6aa35c48e75103701a48199f9d6ad44171efa028e65a" gracePeriod=30 Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.470199 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell150ed-account-delete-mqxng"] Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.480262 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-50ed-account-create-dhzfq"] Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.480992 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6889a88-68ef-4bf3-9e7e-78c6d84785ae-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "d6889a88-68ef-4bf3-9e7e-78c6d84785ae" (UID: 
"d6889a88-68ef-4bf3-9e7e-78c6d84785ae"). InnerVolumeSpecName "ovn-controller-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.483425 5025 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="e5a4ae5a-0b64-481f-a54d-2263de0eae8e" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.104:5671: connect: connection refused" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.486277 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-5495b78bc8-wbf5t"] Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.486621 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-5495b78bc8-wbf5t" podUID="51f36c18-61cd-43d1-98a6-b569197c9382" containerName="barbican-keystone-listener-log" containerID="cri-o://b7e10267acefb906d0890b494854e97353644178df770bee745cfcda0052c6b4" gracePeriod=30 Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.487029 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-5495b78bc8-wbf5t" podUID="51f36c18-61cd-43d1-98a6-b569197c9382" containerName="barbican-keystone-listener" containerID="cri-o://541d0073c514c96dd102a8451f807366e43a748b3da042e4f4deb995f526e4db" gracePeriod=30 Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.492938 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-50ed-account-create-dhzfq"] Oct 07 08:38:15 crc kubenswrapper[5025]: E1007 08:38:15.494452 5025 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d7aa421_d9db_4ba6_882a_432d2e8b840f.slice/crio-conmon-a89cc0bbd617b24a619834b36ce895c374cd0e5df0288f882c9728d9252cd609.scope\": RecentStats: unable to find data in 
memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d7aa421_d9db_4ba6_882a_432d2e8b840f.slice/crio-8c0c209e0771ee6bccd63b38c1d1aaf2d92866662d731f3cc5eccf0667f3306a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d7aa421_d9db_4ba6_882a_432d2e8b840f.slice/crio-aff21227028210b4f08d1a87f2fe84e1d162230abc5f64cc544641bfc6120b15.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72a63b89_4f79_41a7_9e9a_e3c464d015a4.slice/crio-231456d1858e1cef2aee2ee0a6203c2580d725cb74b9ea172142e35834648263.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc6c6b9c_b8b4_443c_a1d9_d829b8f8e9c0.slice/crio-76af13f38bf57b12ea10ac6d1ec8e591e15b727a96e03aeaffeda063e26481de.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc6c6b9c_b8b4_443c_a1d9_d829b8f8e9c0.slice/crio-conmon-76af13f38bf57b12ea10ac6d1ec8e591e15b727a96e03aeaffeda063e26481de.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9fc34125_4736_4467_bf0e_ec3b211e6d13.slice/crio-conmon-1e3e97ecbee1eb5fa2d4e0c99b1a8d7a3e6632904a37d6f0cbcf4e9c9f40fc26.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d7aa421_d9db_4ba6_882a_432d2e8b840f.slice/crio-5f05c90b85b312fbeb24954f3c57716c7cd5932e4f7ceedff1daaffd861ae631.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d7aa421_d9db_4ba6_882a_432d2e8b840f.slice/crio-conmon-5f05c90b85b312fbeb24954f3c57716c7cd5932e4f7ceedff1daaffd861ae631.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9fc34125_4736_4467_bf0e_ec3b211e6d13.slice/crio-1e3e97ecbee1eb5fa2d4e0c99b1a8d7a3e6632904a37d6f0cbcf4e9c9f40fc26.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d7aa421_d9db_4ba6_882a_432d2e8b840f.slice/crio-conmon-49443cb71b40bc6abe49e9f54690d77bd06b9d3970bc17a852e8a4e30f9ba6b7.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d7aa421_d9db_4ba6_882a_432d2e8b840f.slice/crio-9d3a9df9333411c34fcd1b1b33c246c5dc54f969d8f7b94b07b7feea250b14fb.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d7aa421_d9db_4ba6_882a_432d2e8b840f.slice/crio-conmon-76ba63d82ad08b0153c89278b305db830f608f9bed7c534bb409b72c53c3a06c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d7aa421_d9db_4ba6_882a_432d2e8b840f.slice/crio-4fd99f99ec9c844da88a9f393a6696daa800bfa88f16e03ae533ceadefd9d877.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f76b8e5_9257_4a0f_8067_ac36ccbe6711.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d7aa421_d9db_4ba6_882a_432d2e8b840f.slice/crio-conmon-c93808ebcf734eeb951e224e6dd45872e982d63f46ec041bfc4d33193be2b705.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a34d85b_3d52_4c22_bb89_ca22d9f45eeb.slice/crio-e424505ab43fcd0b5722f275f78973479bdce125b1c35d3c687a07c264ff792e.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d7aa421_d9db_4ba6_882a_432d2e8b840f.slice/crio-conmon-17e6f1244bd23756c57aa4df7196524e152e5a068f7ec5464a9d47135f222dc7.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f76b8e5_9257_4a0f_8067_ac36ccbe6711.slice/crio-6d30afa32207026021934fd9f8733ffe56b970bdddee85e44462c5208c9fbeed\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a8356fc_f4bd_4853_a3f3_0d44ab20612b.slice/crio-878aa4d7a336162ac22f34d19599bf58c73742f602e586a96453e05637fc3934.scope\": RecentStats: unable to find data in memory cache]" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.499149 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-56bb8cc8-59x6x"] Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.499433 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-56bb8cc8-59x6x" podUID="f66554ef-2742-4c2b-a6e4-58b8377f03fa" containerName="barbican-api-log" containerID="cri-o://6a50d194060b952445e87adbfe5b99cb78c05999f4bf5f7b6625b7135d66eafa" gracePeriod=30 Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.500987 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-56bb8cc8-59x6x" podUID="f66554ef-2742-4c2b-a6e4-58b8377f03fa" containerName="barbican-api" containerID="cri-o://7a68b397232883f8b41408f440a7388ce6e6907c314518b807a6e0906db97ca4" gracePeriod=30 Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.509184 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.510722 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/219d3031-42b0-40df-98fc-46fe548a8f39-dns-swift-storage-0\") pod \"219d3031-42b0-40df-98fc-46fe548a8f39\" (UID: \"219d3031-42b0-40df-98fc-46fe548a8f39\") " Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.510768 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/219d3031-42b0-40df-98fc-46fe548a8f39-config\") pod \"219d3031-42b0-40df-98fc-46fe548a8f39\" (UID: \"219d3031-42b0-40df-98fc-46fe548a8f39\") " Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.510832 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/219d3031-42b0-40df-98fc-46fe548a8f39-ovsdbserver-nb\") pod \"219d3031-42b0-40df-98fc-46fe548a8f39\" (UID: \"219d3031-42b0-40df-98fc-46fe548a8f39\") " Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.510882 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d13fcf27-6664-430e-b9ac-81ff65769a0c-combined-ca-bundle\") pod \"d13fcf27-6664-430e-b9ac-81ff65769a0c\" (UID: \"d13fcf27-6664-430e-b9ac-81ff65769a0c\") " Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.510915 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/219d3031-42b0-40df-98fc-46fe548a8f39-dns-svc\") pod \"219d3031-42b0-40df-98fc-46fe548a8f39\" (UID: \"219d3031-42b0-40df-98fc-46fe548a8f39\") " Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.511126 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/219d3031-42b0-40df-98fc-46fe548a8f39-ovsdbserver-sb\") pod \"219d3031-42b0-40df-98fc-46fe548a8f39\" (UID: \"219d3031-42b0-40df-98fc-46fe548a8f39\") " Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.511156 5025 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xf5lv\" (UniqueName: \"kubernetes.io/projected/d13fcf27-6664-430e-b9ac-81ff65769a0c-kube-api-access-xf5lv\") pod \"d13fcf27-6664-430e-b9ac-81ff65769a0c\" (UID: \"d13fcf27-6664-430e-b9ac-81ff65769a0c\") " Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.511262 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d13fcf27-6664-430e-b9ac-81ff65769a0c-openstack-config\") pod \"d13fcf27-6664-430e-b9ac-81ff65769a0c\" (UID: \"d13fcf27-6664-430e-b9ac-81ff65769a0c\") " Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.511294 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d13fcf27-6664-430e-b9ac-81ff65769a0c-openstack-config-secret\") pod \"d13fcf27-6664-430e-b9ac-81ff65769a0c\" (UID: \"d13fcf27-6664-430e-b9ac-81ff65769a0c\") " Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.511629 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frr77\" (UniqueName: \"kubernetes.io/projected/219d3031-42b0-40df-98fc-46fe548a8f39-kube-api-access-frr77\") pod \"219d3031-42b0-40df-98fc-46fe548a8f39\" (UID: \"219d3031-42b0-40df-98fc-46fe548a8f39\") " Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.512262 5025 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6889a88-68ef-4bf3-9e7e-78c6d84785ae-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.533956 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="582efaa6-0e81-462a-813f-7ba1cf9c6fc2" containerName="galera" 
containerID="cri-o://681d0f55b346cbe735938996d113e2740ed0c5031feeac188e0b54605836e911" gracePeriod=30 Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.535209 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.535394 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="8f770079-7ba4-4076-bd29-15fab31fc53d" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://c224c26e70e8b206863d04bee8de16f8a66f613a49bcfb44926fbe08bbd97886" gracePeriod=30 Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.540718 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/219d3031-42b0-40df-98fc-46fe548a8f39-kube-api-access-frr77" (OuterVolumeSpecName: "kube-api-access-frr77") pod "219d3031-42b0-40df-98fc-46fe548a8f39" (UID: "219d3031-42b0-40df-98fc-46fe548a8f39"). InnerVolumeSpecName "kube-api-access-frr77". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.545672 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d13fcf27-6664-430e-b9ac-81ff65769a0c-kube-api-access-xf5lv" (OuterVolumeSpecName: "kube-api-access-xf5lv") pod "d13fcf27-6664-430e-b9ac-81ff65769a0c" (UID: "d13fcf27-6664-430e-b9ac-81ff65769a0c"). InnerVolumeSpecName "kube-api-access-xf5lv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.545851 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-pw82v"] Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.554886 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-pw82v"] Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.558375 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.558758 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="073c6d0a-3fea-4e0b-8f8f-f58613b4bf23" containerName="nova-cell0-conductor-conductor" containerID="cri-o://8fadf8cac92c695c60339b695323e8872f1a70c0fa6297fd6b45f17e85f81906" gracePeriod=30 Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.568761 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-g6spt"] Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.592805 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-g6spt"] Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.608740 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.608944 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="5db22b1d-c7bd-4bc9-b30b-3271da7741e1" containerName="nova-cell1-conductor-conductor" containerID="cri-o://fe88f33ff0949c6497d1f403bf9b543b6f9b6a8feba3d54578e33f486413a22f" gracePeriod=30 Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.613570 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xf5lv\" (UniqueName: 
\"kubernetes.io/projected/d13fcf27-6664-430e-b9ac-81ff65769a0c-kube-api-access-xf5lv\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.613593 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frr77\" (UniqueName: \"kubernetes.io/projected/219d3031-42b0-40df-98fc-46fe548a8f39-kube-api-access-frr77\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.618590 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.618811 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="8691f972-205f-470b-b300-40c32106704b" containerName="nova-scheduler-scheduler" containerID="cri-o://8b00f65fe97b0c9eded824f24e8ddfa13e3e54ab7d2347fbe976f327251fe700" gracePeriod=30 Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.639717 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-cb69l"] Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.653640 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-cb69l"] Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.668316 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="d46577dd-b38b-4b80-ad57-577629e648b8" containerName="rabbitmq" containerID="cri-o://73c121fe654fa077d5329867a73a7be6e5bdaa3741b8e1893d602e776c660431" gracePeriod=604800 Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.790475 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_8e01de5b-7e9d-4f70-91d9-d55165a8eb32/ovsdbserver-nb/0.log" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.790557 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.799435 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d13fcf27-6664-430e-b9ac-81ff65769a0c-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "d13fcf27-6664-430e-b9ac-81ff65769a0c" (UID: "d13fcf27-6664-430e-b9ac-81ff65769a0c"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.801050 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_e8ac1818-6553-400f-91be-19b032aae626/ovsdbserver-sb/0.log" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.801137 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.818525 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/219d3031-42b0-40df-98fc-46fe548a8f39-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "219d3031-42b0-40df-98fc-46fe548a8f39" (UID: "219d3031-42b0-40df-98fc-46fe548a8f39"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.818599 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d13fcf27-6664-430e-b9ac-81ff65769a0c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d13fcf27-6664-430e-b9ac-81ff65769a0c" (UID: "d13fcf27-6664-430e-b9ac-81ff65769a0c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.826599 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d13fcf27-6664-430e-b9ac-81ff65769a0c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.826671 5025 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d13fcf27-6664-430e-b9ac-81ff65769a0c-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.826687 5025 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/219d3031-42b0-40df-98fc-46fe548a8f39-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.829193 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinderb478-account-delete-p6rj5" event={"ID":"c87571c0-a1b9-4187-a60c-dd66d9fbb301","Type":"ContainerStarted","Data":"1628308fe842ebe329ac5dc4f3807a63f75b4e21c43dc2faafeb1c8fd7665e62"} Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.846573 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-lmq8k" event={"ID":"219d3031-42b0-40df-98fc-46fe548a8f39","Type":"ContainerDied","Data":"a219c71588c1e0ad27128812d0496c6d2176b5c5102a180c840290910bb56f16"} Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.846622 5025 scope.go:117] "RemoveContainer" containerID="22e9ac307d801dfcf026320bb8d6652c6f32dae3bd30c0d3a5d41209d16ed0b1" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.846761 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-lmq8k" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.855829 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_8e01de5b-7e9d-4f70-91d9-d55165a8eb32/ovsdbserver-nb/0.log" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.855881 5025 generic.go:334] "Generic (PLEG): container finished" podID="8e01de5b-7e9d-4f70-91d9-d55165a8eb32" containerID="ed4525e997650f350f16b568eb27e0ffc3ffcbd1f1e7cb94fc621cd10eb9a382" exitCode=2 Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.855900 5025 generic.go:334] "Generic (PLEG): container finished" podID="8e01de5b-7e9d-4f70-91d9-d55165a8eb32" containerID="ec3a511c8f64d316ae9313f73bf8241c98c7dfe67b2e3fe3b4a4adad87d90535" exitCode=143 Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.855997 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.856203 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"8e01de5b-7e9d-4f70-91d9-d55165a8eb32","Type":"ContainerDied","Data":"ed4525e997650f350f16b568eb27e0ffc3ffcbd1f1e7cb94fc621cd10eb9a382"} Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.856326 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"8e01de5b-7e9d-4f70-91d9-d55165a8eb32","Type":"ContainerDied","Data":"ec3a511c8f64d316ae9313f73bf8241c98c7dfe67b2e3fe3b4a4adad87d90535"} Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.856399 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"8e01de5b-7e9d-4f70-91d9-d55165a8eb32","Type":"ContainerDied","Data":"3945860612bcb71718e4cac07ea29b55c58ff38a86f4e0bb00a3d7968363e875"} Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.885459 5025 generic.go:334] "Generic (PLEG): container finished" 
podID="f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8" containerID="9226a61a229bb35a76b9b1993c0b6dacb2bab79faa138dc6bf20e3d356c1fca2" exitCode=143 Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.885533 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-684bd87b6d-w58z5" event={"ID":"f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8","Type":"ContainerDied","Data":"9226a61a229bb35a76b9b1993c0b6dacb2bab79faa138dc6bf20e3d356c1fca2"} Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.890529 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d13fcf27-6664-430e-b9ac-81ff65769a0c-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "d13fcf27-6664-430e-b9ac-81ff65769a0c" (UID: "d13fcf27-6664-430e-b9ac-81ff65769a0c"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.898329 5025 generic.go:334] "Generic (PLEG): container finished" podID="51f36c18-61cd-43d1-98a6-b569197c9382" containerID="b7e10267acefb906d0890b494854e97353644178df770bee745cfcda0052c6b4" exitCode=143 Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.898406 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5495b78bc8-wbf5t" event={"ID":"51f36c18-61cd-43d1-98a6-b569197c9382","Type":"ContainerDied","Data":"b7e10267acefb906d0890b494854e97353644178df770bee745cfcda0052c6b4"} Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.910690 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/219d3031-42b0-40df-98fc-46fe548a8f39-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "219d3031-42b0-40df-98fc-46fe548a8f39" (UID: "219d3031-42b0-40df-98fc-46fe548a8f39"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.924810 5025 generic.go:334] "Generic (PLEG): container finished" podID="9fc34125-4736-4467-bf0e-ec3b211e6d13" containerID="1e3e97ecbee1eb5fa2d4e0c99b1a8d7a3e6632904a37d6f0cbcf4e9c9f40fc26" exitCode=0 Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.929972 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8ac1818-6553-400f-91be-19b032aae626-scripts\") pod \"e8ac1818-6553-400f-91be-19b032aae626\" (UID: \"e8ac1818-6553-400f-91be-19b032aae626\") " Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.930102 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e01de5b-7e9d-4f70-91d9-d55165a8eb32-scripts\") pod \"8e01de5b-7e9d-4f70-91d9-d55165a8eb32\" (UID: \"8e01de5b-7e9d-4f70-91d9-d55165a8eb32\") " Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.930196 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"8e01de5b-7e9d-4f70-91d9-d55165a8eb32\" (UID: \"8e01de5b-7e9d-4f70-91d9-d55165a8eb32\") " Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.930229 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e01de5b-7e9d-4f70-91d9-d55165a8eb32-metrics-certs-tls-certs\") pod \"8e01de5b-7e9d-4f70-91d9-d55165a8eb32\" (UID: \"8e01de5b-7e9d-4f70-91d9-d55165a8eb32\") " Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.930307 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8ac1818-6553-400f-91be-19b032aae626-combined-ca-bundle\") pod 
\"e8ac1818-6553-400f-91be-19b032aae626\" (UID: \"e8ac1818-6553-400f-91be-19b032aae626\") " Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.930332 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"e8ac1818-6553-400f-91be-19b032aae626\" (UID: \"e8ac1818-6553-400f-91be-19b032aae626\") " Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.930384 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e01de5b-7e9d-4f70-91d9-d55165a8eb32-config\") pod \"8e01de5b-7e9d-4f70-91d9-d55165a8eb32\" (UID: \"8e01de5b-7e9d-4f70-91d9-d55165a8eb32\") " Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.930409 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8ac1818-6553-400f-91be-19b032aae626-metrics-certs-tls-certs\") pod \"e8ac1818-6553-400f-91be-19b032aae626\" (UID: \"e8ac1818-6553-400f-91be-19b032aae626\") " Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.930435 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e01de5b-7e9d-4f70-91d9-d55165a8eb32-ovsdbserver-nb-tls-certs\") pod \"8e01de5b-7e9d-4f70-91d9-d55165a8eb32\" (UID: \"8e01de5b-7e9d-4f70-91d9-d55165a8eb32\") " Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.930477 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8e01de5b-7e9d-4f70-91d9-d55165a8eb32-ovsdb-rundir\") pod \"8e01de5b-7e9d-4f70-91d9-d55165a8eb32\" (UID: \"8e01de5b-7e9d-4f70-91d9-d55165a8eb32\") " Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.930504 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-n89z9\" (UniqueName: \"kubernetes.io/projected/e8ac1818-6553-400f-91be-19b032aae626-kube-api-access-n89z9\") pod \"e8ac1818-6553-400f-91be-19b032aae626\" (UID: \"e8ac1818-6553-400f-91be-19b032aae626\") " Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.930564 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8ac1818-6553-400f-91be-19b032aae626-config\") pod \"e8ac1818-6553-400f-91be-19b032aae626\" (UID: \"e8ac1818-6553-400f-91be-19b032aae626\") " Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.930600 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d64s2\" (UniqueName: \"kubernetes.io/projected/8e01de5b-7e9d-4f70-91d9-d55165a8eb32-kube-api-access-d64s2\") pod \"8e01de5b-7e9d-4f70-91d9-d55165a8eb32\" (UID: \"8e01de5b-7e9d-4f70-91d9-d55165a8eb32\") " Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.930628 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8ac1818-6553-400f-91be-19b032aae626-ovsdbserver-sb-tls-certs\") pod \"e8ac1818-6553-400f-91be-19b032aae626\" (UID: \"e8ac1818-6553-400f-91be-19b032aae626\") " Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.930655 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e01de5b-7e9d-4f70-91d9-d55165a8eb32-combined-ca-bundle\") pod \"8e01de5b-7e9d-4f70-91d9-d55165a8eb32\" (UID: \"8e01de5b-7e9d-4f70-91d9-d55165a8eb32\") " Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.930695 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e8ac1818-6553-400f-91be-19b032aae626-ovsdb-rundir\") pod \"e8ac1818-6553-400f-91be-19b032aae626\" (UID: 
\"e8ac1818-6553-400f-91be-19b032aae626\") " Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.936378 5025 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d13fcf27-6664-430e-b9ac-81ff65769a0c-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.936418 5025 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/219d3031-42b0-40df-98fc-46fe548a8f39-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.939826 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8ac1818-6553-400f-91be-19b032aae626-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "e8ac1818-6553-400f-91be-19b032aae626" (UID: "e8ac1818-6553-400f-91be-19b032aae626"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.940158 5025 generic.go:334] "Generic (PLEG): container finished" podID="4a892055-924e-4e4f-a625-b15b8a39c4bf" containerID="793b55f252562c9ec4f26d6283e230540751d4e2f67aa0c95732eac37fff4ef0" exitCode=143 Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.940518 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8ac1818-6553-400f-91be-19b032aae626-config" (OuterVolumeSpecName: "config") pod "e8ac1818-6553-400f-91be-19b032aae626" (UID: "e8ac1818-6553-400f-91be-19b032aae626"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.940763 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e01de5b-7e9d-4f70-91d9-d55165a8eb32-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "8e01de5b-7e9d-4f70-91d9-d55165a8eb32" (UID: "8e01de5b-7e9d-4f70-91d9-d55165a8eb32"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.942168 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e01de5b-7e9d-4f70-91d9-d55165a8eb32-scripts" (OuterVolumeSpecName: "scripts") pod "8e01de5b-7e9d-4f70-91d9-d55165a8eb32" (UID: "8e01de5b-7e9d-4f70-91d9-d55165a8eb32"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.943679 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8ac1818-6553-400f-91be-19b032aae626-scripts" (OuterVolumeSpecName: "scripts") pod "e8ac1818-6553-400f-91be-19b032aae626" (UID: "e8ac1818-6553-400f-91be-19b032aae626"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.950079 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/219d3031-42b0-40df-98fc-46fe548a8f39-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "219d3031-42b0-40df-98fc-46fe548a8f39" (UID: "219d3031-42b0-40df-98fc-46fe548a8f39"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.952403 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "e8ac1818-6553-400f-91be-19b032aae626" (UID: "e8ac1818-6553-400f-91be-19b032aae626"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.953819 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e01de5b-7e9d-4f70-91d9-d55165a8eb32-kube-api-access-d64s2" (OuterVolumeSpecName: "kube-api-access-d64s2") pod "8e01de5b-7e9d-4f70-91d9-d55165a8eb32" (UID: "8e01de5b-7e9d-4f70-91d9-d55165a8eb32"). InnerVolumeSpecName "kube-api-access-d64s2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.961926 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e01de5b-7e9d-4f70-91d9-d55165a8eb32-config" (OuterVolumeSpecName: "config") pod "8e01de5b-7e9d-4f70-91d9-d55165a8eb32" (UID: "8e01de5b-7e9d-4f70-91d9-d55165a8eb32"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.962789 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="108122fc-a610-4d25-8945-5e30f770da7f" path="/var/lib/kubelet/pods/108122fc-a610-4d25-8945-5e30f770da7f/volumes" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.963826 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e6f419-f7bd-44d0-a377-488d155d7adc" path="/var/lib/kubelet/pods/25e6f419-f7bd-44d0-a377-488d155d7adc/volumes" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.964587 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40cd73d1-37d6-4334-9962-209f68f8e283" path="/var/lib/kubelet/pods/40cd73d1-37d6-4334-9962-209f68f8e283/volumes" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.965352 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4520ec77-3c49-4527-a716-90a88f6ec243" path="/var/lib/kubelet/pods/4520ec77-3c49-4527-a716-90a88f6ec243/volumes" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.966640 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62224827-cf6b-4de5-b315-92293847f02f" path="/var/lib/kubelet/pods/62224827-cf6b-4de5-b315-92293847f02f/volumes" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.967299 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d9ff352-f9b1-4458-841f-08b02c493ab1" path="/var/lib/kubelet/pods/6d9ff352-f9b1-4458-841f-08b02c493ab1/volumes" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.968605 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "8e01de5b-7e9d-4f70-91d9-d55165a8eb32" (UID: "8e01de5b-7e9d-4f70-91d9-d55165a8eb32"). InnerVolumeSpecName "local-storage04-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.972074 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8ac1818-6553-400f-91be-19b032aae626-kube-api-access-n89z9" (OuterVolumeSpecName: "kube-api-access-n89z9") pod "e8ac1818-6553-400f-91be-19b032aae626" (UID: "e8ac1818-6553-400f-91be-19b032aae626"). InnerVolumeSpecName "kube-api-access-n89z9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.973646 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="792101f8-2389-423a-b5a9-f10d2e141f0d" path="/var/lib/kubelet/pods/792101f8-2389-423a-b5a9-f10d2e141f0d/volumes" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.978877 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f76b8e5-9257-4a0f-8067-ac36ccbe6711" path="/var/lib/kubelet/pods/7f76b8e5-9257-4a0f-8067-ac36ccbe6711/volumes" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.979792 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a72ad49a-646a-40cf-a5bb-7bdecdf2153c" path="/var/lib/kubelet/pods/a72ad49a-646a-40cf-a5bb-7bdecdf2153c/volumes" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.982506 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a910c5fd-a686-4b9f-b1bd-6cdcf4641013" path="/var/lib/kubelet/pods/a910c5fd-a686-4b9f-b1bd-6cdcf4641013/volumes" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.983160 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa176e96-74ca-47df-9388-08146166a449" path="/var/lib/kubelet/pods/aa176e96-74ca-47df-9388-08146166a449/volumes" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.984020 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afea2d9f-2278-427e-a2aa-5f034f08a1bd" 
path="/var/lib/kubelet/pods/afea2d9f-2278-427e-a2aa-5f034f08a1bd/volumes" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.985443 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb4e8f2b-70ba-49fe-b21e-9343ccaa1a11" path="/var/lib/kubelet/pods/bb4e8f2b-70ba-49fe-b21e-9343ccaa1a11/volumes" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.986159 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbaaccc5-06cb-40e9-9d17-bb9f11b6f8a5" path="/var/lib/kubelet/pods/bbaaccc5-06cb-40e9-9d17-bb9f11b6f8a5/volumes" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.987197 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf250487-bcbd-4fc3-9d87-8e813ee7b39a" path="/var/lib/kubelet/pods/bf250487-bcbd-4fc3-9d87-8e813ee7b39a/volumes" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.992064 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c434c871-bc74-45ae-b5ed-810a796622d9" path="/var/lib/kubelet/pods/c434c871-bc74-45ae-b5ed-810a796622d9/volumes" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.992837 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4e34060-f0fe-4d2e-8892-4530baf597b7" path="/var/lib/kubelet/pods/c4e34060-f0fe-4d2e-8892-4530baf597b7/volumes" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.994640 5025 generic.go:334] "Generic (PLEG): container finished" podID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerID="9d3a9df9333411c34fcd1b1b33c246c5dc54f969d8f7b94b07b7feea250b14fb" exitCode=0 Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.994662 5025 generic.go:334] "Generic (PLEG): container finished" podID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerID="a89cc0bbd617b24a619834b36ce895c374cd0e5df0288f882c9728d9252cd609" exitCode=0 Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.994672 5025 generic.go:334] "Generic (PLEG): container finished" 
podID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerID="17e6f1244bd23756c57aa4df7196524e152e5a068f7ec5464a9d47135f222dc7" exitCode=0 Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.994679 5025 generic.go:334] "Generic (PLEG): container finished" podID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerID="99ce4f2eebf24b560f31691c8f1a2da1a5df6b8c8ed3d6626051ed204aea839a" exitCode=0 Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.994686 5025 generic.go:334] "Generic (PLEG): container finished" podID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerID="aff21227028210b4f08d1a87f2fe84e1d162230abc5f64cc544641bfc6120b15" exitCode=0 Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.994692 5025 generic.go:334] "Generic (PLEG): container finished" podID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerID="5f05c90b85b312fbeb24954f3c57716c7cd5932e4f7ceedff1daaffd861ae631" exitCode=0 Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.994699 5025 generic.go:334] "Generic (PLEG): container finished" podID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerID="76ba63d82ad08b0153c89278b305db830f608f9bed7c534bb409b72c53c3a06c" exitCode=0 Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.994704 5025 generic.go:334] "Generic (PLEG): container finished" podID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerID="266ae7663ff2f3c8ca4fdece1db841dc278426b4fca4509d0d0233a384d325dd" exitCode=0 Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.994712 5025 generic.go:334] "Generic (PLEG): container finished" podID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerID="49443cb71b40bc6abe49e9f54690d77bd06b9d3970bc17a852e8a4e30f9ba6b7" exitCode=0 Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.994718 5025 generic.go:334] "Generic (PLEG): container finished" podID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerID="c93808ebcf734eeb951e224e6dd45872e982d63f46ec041bfc4d33193be2b705" exitCode=0 Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.994724 5025 
generic.go:334] "Generic (PLEG): container finished" podID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerID="15d39914545a4a069a9ea2c3664be41c34cbfd52c1e69df82087b4f1a1940cdd" exitCode=0 Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.994731 5025 generic.go:334] "Generic (PLEG): container finished" podID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerID="8c0c209e0771ee6bccd63b38c1d1aaf2d92866662d731f3cc5eccf0667f3306a" exitCode=0 Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.994738 5025 generic.go:334] "Generic (PLEG): container finished" podID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerID="9640e93cbab8bc8d914777065ae812b65e462076e84f96dabb2f52c7a65f973a" exitCode=0 Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.994744 5025 generic.go:334] "Generic (PLEG): container finished" podID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerID="4fd99f99ec9c844da88a9f393a6696daa800bfa88f16e03ae533ceadefd9d877" exitCode=0 Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.995093 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d13fcf27-6664-430e-b9ac-81ff65769a0c" path="/var/lib/kubelet/pods/d13fcf27-6664-430e-b9ac-81ff65769a0c/volumes" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.996883 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d202ac0c-e468-4038-8232-a4e1812d3698" path="/var/lib/kubelet/pods/d202ac0c-e468-4038-8232-a4e1812d3698/volumes" Oct 07 08:38:15 crc kubenswrapper[5025]: I1007 08:38:15.998077 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d65b533a-25ed-4f95-a2f8-84ee04697636" path="/var/lib/kubelet/pods/d65b533a-25ed-4f95-a2f8-84ee04697636/volumes" Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.007946 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8aff795-d588-41a3-8921-44f29b96c5f1" path="/var/lib/kubelet/pods/e8aff795-d588-41a3-8921-44f29b96c5f1/volumes" Oct 07 08:38:16 crc kubenswrapper[5025]: 
I1007 08:38:16.029887 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3d10c21-f6a2-43b1-98da-e882d1320a5f" path="/var/lib/kubelet/pods/f3d10c21-f6a2-43b1-98da-e882d1320a5f/volumes"
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.030386 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fffe9def-a18d-4a98-bb4c-f47faeb2fae8" path="/var/lib/kubelet/pods/fffe9def-a18d-4a98-bb4c-f47faeb2fae8/volumes"
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.035575 5025 generic.go:334] "Generic (PLEG): container finished" podID="f66554ef-2742-4c2b-a6e4-58b8377f03fa" containerID="6a50d194060b952445e87adbfe5b99cb78c05999f4bf5f7b6625b7135d66eafa" exitCode=143
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.051507 5025 generic.go:334] "Generic (PLEG): container finished" podID="28643c48-ee4c-4880-8901-9e3231c70b70" containerID="be990c38f8204943f8049baf27de19af16319b683aba310036475c95b583ac65" exitCode=0
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.052069 5025 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e8ac1818-6553-400f-91be-19b032aae626-ovsdb-rundir\") on node \"crc\" DevicePath \"\""
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.053445 5025 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8ac1818-6553-400f-91be-19b032aae626-scripts\") on node \"crc\" DevicePath \"\""
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.053476 5025 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/219d3031-42b0-40df-98fc-46fe548a8f39-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.053487 5025 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e01de5b-7e9d-4f70-91d9-d55165a8eb32-scripts\") on node \"crc\" DevicePath \"\""
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.053518 5025 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" "
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.053533 5025 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" "
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.053556 5025 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e01de5b-7e9d-4f70-91d9-d55165a8eb32-config\") on node \"crc\" DevicePath \"\""
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.053566 5025 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8e01de5b-7e9d-4f70-91d9-d55165a8eb32-ovsdb-rundir\") on node \"crc\" DevicePath \"\""
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.053578 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n89z9\" (UniqueName: \"kubernetes.io/projected/e8ac1818-6553-400f-91be-19b032aae626-kube-api-access-n89z9\") on node \"crc\" DevicePath \"\""
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.053587 5025 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8ac1818-6553-400f-91be-19b032aae626-config\") on node \"crc\" DevicePath \"\""
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.053596 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d64s2\" (UniqueName: \"kubernetes.io/projected/8e01de5b-7e9d-4f70-91d9-d55165a8eb32-kube-api-access-d64s2\") on node \"crc\" DevicePath \"\""
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.060219 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_e8ac1818-6553-400f-91be-19b032aae626/ovsdbserver-sb/0.log"
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.060265 5025 generic.go:334] "Generic (PLEG): container finished" podID="e8ac1818-6553-400f-91be-19b032aae626" containerID="e09b94886fdcccfceea04d4af844384d8eb419aec0bda3ba596e47528bb18abd" exitCode=2
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.060287 5025 generic.go:334] "Generic (PLEG): container finished" podID="e8ac1818-6553-400f-91be-19b032aae626" containerID="bc8d2241ccc0693c6d61ae5731ed3c9ec865ccbf9da2480484d4167078e69af8" exitCode=143
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.060390 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.063176 5025 generic.go:334] "Generic (PLEG): container finished" podID="72a63b89-4f79-41a7-9e9a-e3c464d015a4" containerID="231456d1858e1cef2aee2ee0a6203c2580d725cb74b9ea172142e35834648263" exitCode=143
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.081530 5025 generic.go:334] "Generic (PLEG): container finished" podID="988f0f87-7a01-4aac-a874-8e885b2f83cb" containerID="6679ef0ed68a662d16557ba47e79ca743311dae66548a6f30fab1c7702863417" exitCode=143
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.094996 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/219d3031-42b0-40df-98fc-46fe548a8f39-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "219d3031-42b0-40df-98fc-46fe548a8f39" (UID: "219d3031-42b0-40df-98fc-46fe548a8f39"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.095399 5025 generic.go:334] "Generic (PLEG): container finished" podID="fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0" containerID="76af13f38bf57b12ea10ac6d1ec8e591e15b727a96e03aeaffeda063e26481de" exitCode=0
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.106103 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/219d3031-42b0-40df-98fc-46fe548a8f39-config" (OuterVolumeSpecName: "config") pod "219d3031-42b0-40df-98fc-46fe548a8f39" (UID: "219d3031-42b0-40df-98fc-46fe548a8f39"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.119693 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e01de5b-7e9d-4f70-91d9-d55165a8eb32-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8e01de5b-7e9d-4f70-91d9-d55165a8eb32" (UID: "8e01de5b-7e9d-4f70-91d9-d55165a8eb32"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.119828 5025 generic.go:334] "Generic (PLEG): container finished" podID="5a34d85b-3d52-4c22-bb89-ca22d9f45eeb" containerID="e424505ab43fcd0b5722f275f78973479bdce125b1c35d3c687a07c264ff792e" exitCode=0
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.134763 5025 generic.go:334] "Generic (PLEG): container finished" podID="76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b" containerID="b49c2c24aaa3c06dfaa088d160cbeffb506d46373bed58aa7224f155e0405e49" exitCode=0
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.142057 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-bxmbm"
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.143973 5025 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc"
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.157164 5025 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/219d3031-42b0-40df-98fc-46fe548a8f39-config\") on node \"crc\" DevicePath \"\""
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.157219 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e01de5b-7e9d-4f70-91d9-d55165a8eb32-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.157234 5025 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/219d3031-42b0-40df-98fc-46fe548a8f39-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.157246 5025 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\""
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.157450 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.165413 5025 generic.go:334] "Generic (PLEG): container finished" podID="39a4256e-db59-4623-aefb-bad0c1412bf1" containerID="577b784734dd3e48c70fc10ff3124e3ae64cd95fd597512882e35ecf5b2ba94f" exitCode=143
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.177835 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8ac1818-6553-400f-91be-19b032aae626-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e8ac1818-6553-400f-91be-19b032aae626" (UID: "e8ac1818-6553-400f-91be-19b032aae626"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.181997 5025 generic.go:334] "Generic (PLEG): container finished" podID="6a8356fc-f4bd-4853-a3f3-0d44ab20612b" containerID="878aa4d7a336162ac22f34d19599bf58c73742f602e586a96453e05637fc3934" exitCode=143
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.217347 5025 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc"
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.259878 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8ac1818-6553-400f-91be-19b032aae626-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.259912 5025 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\""
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.298549 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e01de5b-7e9d-4f70-91d9-d55165a8eb32-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "8e01de5b-7e9d-4f70-91d9-d55165a8eb32" (UID: "8e01de5b-7e9d-4f70-91d9-d55165a8eb32"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.314706 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8ac1818-6553-400f-91be-19b032aae626-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "e8ac1818-6553-400f-91be-19b032aae626" (UID: "e8ac1818-6553-400f-91be-19b032aae626"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.322288 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e01de5b-7e9d-4f70-91d9-d55165a8eb32-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "8e01de5b-7e9d-4f70-91d9-d55165a8eb32" (UID: "8e01de5b-7e9d-4f70-91d9-d55165a8eb32"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.340668 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8ac1818-6553-400f-91be-19b032aae626-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "e8ac1818-6553-400f-91be-19b032aae626" (UID: "e8ac1818-6553-400f-91be-19b032aae626"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.363568 5025 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e01de5b-7e9d-4f70-91d9-d55165a8eb32-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.363589 5025 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8ac1818-6553-400f-91be-19b032aae626-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.363598 5025 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e01de5b-7e9d-4f70-91d9-d55165a8eb32-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.363606 5025 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8ac1818-6553-400f-91be-19b032aae626-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 07 08:38:16 crc kubenswrapper[5025]: E1007 08:38:16.467967 5025 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found
Oct 07 08:38:16 crc kubenswrapper[5025]: E1007 08:38:16.468489 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-config-data podName:e5a4ae5a-0b64-481f-a54d-2263de0eae8e nodeName:}" failed. No retries permitted until 2025-10-07 08:38:20.468466861 +0000 UTC m=+1307.277781005 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-config-data") pod "rabbitmq-cell1-server-0" (UID: "e5a4ae5a-0b64-481f-a54d-2263de0eae8e") : configmap "rabbitmq-cell1-config-data" not found
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.486865 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron0ee4-account-delete-8cznp"]
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.486904 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement5528-account-delete-wnxjq"]
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.486917 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9fc34125-4736-4467-bf0e-ec3b211e6d13","Type":"ContainerDied","Data":"1e3e97ecbee1eb5fa2d4e0c99b1a8d7a3e6632904a37d6f0cbcf4e9c9f40fc26"}
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.486943 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell150ed-account-delete-mqxng"]
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.486963 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4a892055-924e-4e4f-a625-b15b8a39c4bf","Type":"ContainerDied","Data":"793b55f252562c9ec4f26d6283e230540751d4e2f67aa0c95732eac37fff4ef0"}
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.486978 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9d7aa421-d9db-4ba6-882a-432d2e8b840f","Type":"ContainerDied","Data":"9d3a9df9333411c34fcd1b1b33c246c5dc54f969d8f7b94b07b7feea250b14fb"}
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.486989 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9d7aa421-d9db-4ba6-882a-432d2e8b840f","Type":"ContainerDied","Data":"a89cc0bbd617b24a619834b36ce895c374cd0e5df0288f882c9728d9252cd609"}
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.486998 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9d7aa421-d9db-4ba6-882a-432d2e8b840f","Type":"ContainerDied","Data":"17e6f1244bd23756c57aa4df7196524e152e5a068f7ec5464a9d47135f222dc7"}
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.487010 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9d7aa421-d9db-4ba6-882a-432d2e8b840f","Type":"ContainerDied","Data":"99ce4f2eebf24b560f31691c8f1a2da1a5df6b8c8ed3d6626051ed204aea839a"}
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.487018 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9d7aa421-d9db-4ba6-882a-432d2e8b840f","Type":"ContainerDied","Data":"aff21227028210b4f08d1a87f2fe84e1d162230abc5f64cc544641bfc6120b15"}
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.487027 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9d7aa421-d9db-4ba6-882a-432d2e8b840f","Type":"ContainerDied","Data":"5f05c90b85b312fbeb24954f3c57716c7cd5932e4f7ceedff1daaffd861ae631"}
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.487035 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9d7aa421-d9db-4ba6-882a-432d2e8b840f","Type":"ContainerDied","Data":"76ba63d82ad08b0153c89278b305db830f608f9bed7c534bb409b72c53c3a06c"}
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.487042 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9d7aa421-d9db-4ba6-882a-432d2e8b840f","Type":"ContainerDied","Data":"266ae7663ff2f3c8ca4fdece1db841dc278426b4fca4509d0d0233a384d325dd"}
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.487052 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9d7aa421-d9db-4ba6-882a-432d2e8b840f","Type":"ContainerDied","Data":"49443cb71b40bc6abe49e9f54690d77bd06b9d3970bc17a852e8a4e30f9ba6b7"}
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.487061 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9d7aa421-d9db-4ba6-882a-432d2e8b840f","Type":"ContainerDied","Data":"c93808ebcf734eeb951e224e6dd45872e982d63f46ec041bfc4d33193be2b705"}
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.487070 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9d7aa421-d9db-4ba6-882a-432d2e8b840f","Type":"ContainerDied","Data":"15d39914545a4a069a9ea2c3664be41c34cbfd52c1e69df82087b4f1a1940cdd"}
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.487080 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9d7aa421-d9db-4ba6-882a-432d2e8b840f","Type":"ContainerDied","Data":"8c0c209e0771ee6bccd63b38c1d1aaf2d92866662d731f3cc5eccf0667f3306a"}
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.487092 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9d7aa421-d9db-4ba6-882a-432d2e8b840f","Type":"ContainerDied","Data":"9640e93cbab8bc8d914777065ae812b65e462076e84f96dabb2f52c7a65f973a"}
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.487101 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9d7aa421-d9db-4ba6-882a-432d2e8b840f","Type":"ContainerDied","Data":"4fd99f99ec9c844da88a9f393a6696daa800bfa88f16e03ae533ceadefd9d877"}
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.487111 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56bb8cc8-59x6x" event={"ID":"f66554ef-2742-4c2b-a6e4-58b8377f03fa","Type":"ContainerDied","Data":"6a50d194060b952445e87adbfe5b99cb78c05999f4bf5f7b6625b7135d66eafa"}
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.487132 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance4544-account-delete-qm7lc" event={"ID":"28643c48-ee4c-4880-8901-9e3231c70b70","Type":"ContainerDied","Data":"be990c38f8204943f8049baf27de19af16319b683aba310036475c95b583ac65"}
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.491488 5025 scope.go:117] "RemoveContainer" containerID="abb39eedae6ddb7273b901a428a0238e8dd161207695fb279307852070147dbe"
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.487535 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e8ac1818-6553-400f-91be-19b032aae626","Type":"ContainerDied","Data":"e09b94886fdcccfceea04d4af844384d8eb419aec0bda3ba596e47528bb18abd"}
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.497708 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e8ac1818-6553-400f-91be-19b032aae626","Type":"ContainerDied","Data":"bc8d2241ccc0693c6d61ae5731ed3c9ec865ccbf9da2480484d4167078e69af8"}
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.497723 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e8ac1818-6553-400f-91be-19b032aae626","Type":"ContainerDied","Data":"86561299917d5ef8ffba02ecb55fb4e8d3c4885b4082318aebefbe817c0e689e"}
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.497734 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"72a63b89-4f79-41a7-9e9a-e3c464d015a4","Type":"ContainerDied","Data":"231456d1858e1cef2aee2ee0a6203c2580d725cb74b9ea172142e35834648263"}
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.497747 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"988f0f87-7a01-4aac-a874-8e885b2f83cb","Type":"ContainerDied","Data":"6679ef0ed68a662d16557ba47e79ca743311dae66548a6f30fab1c7702863417"}
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.497761 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5899768569-tz2vj" event={"ID":"fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0","Type":"ContainerDied","Data":"76af13f38bf57b12ea10ac6d1ec8e591e15b727a96e03aeaffeda063e26481de"}
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.497774 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6c4bb756cf-6b9fc" event={"ID":"5a34d85b-3d52-4c22-bb89-ca22d9f45eeb","Type":"ContainerDied","Data":"e424505ab43fcd0b5722f275f78973479bdce125b1c35d3c687a07c264ff792e"}
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.497786 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-lms8w" event={"ID":"76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b","Type":"ContainerDied","Data":"b49c2c24aaa3c06dfaa088d160cbeffb506d46373bed58aa7224f155e0405e49"}
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.497798 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-bxmbm" event={"ID":"d6889a88-68ef-4bf3-9e7e-78c6d84785ae","Type":"ContainerDied","Data":"8792842dcf47fbb891bdbacc6049db8d02de8a61256b9cb490a00a4e294f90ad"}
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.497814 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"39a4256e-db59-4623-aefb-bad0c1412bf1","Type":"ContainerDied","Data":"577b784734dd3e48c70fc10ff3124e3ae64cd95fd597512882e35ecf5b2ba94f"}
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.497825 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7c8774d5b7-qj8g5" event={"ID":"6a8356fc-f4bd-4853-a3f3-0d44ab20612b","Type":"ContainerDied","Data":"878aa4d7a336162ac22f34d19599bf58c73742f602e586a96453e05637fc3934"}
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.560588 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6c4bb756cf-6b9fc"
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.603262 5025 scope.go:117] "RemoveContainer" containerID="ed4525e997650f350f16b568eb27e0ffc3ffcbd1f1e7cb94fc621cd10eb9a382"
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.643957 5025 scope.go:117] "RemoveContainer" containerID="ec3a511c8f64d316ae9313f73bf8241c98c7dfe67b2e3fe3b4a4adad87d90535"
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.645361 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-lmq8k"]
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.661599 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-lmq8k"]
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.673676 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5a34d85b-3d52-4c22-bb89-ca22d9f45eeb-etc-swift\") pod \"5a34d85b-3d52-4c22-bb89-ca22d9f45eeb\" (UID: \"5a34d85b-3d52-4c22-bb89-ca22d9f45eeb\") "
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.673731 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a34d85b-3d52-4c22-bb89-ca22d9f45eeb-public-tls-certs\") pod \"5a34d85b-3d52-4c22-bb89-ca22d9f45eeb\" (UID: \"5a34d85b-3d52-4c22-bb89-ca22d9f45eeb\") "
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.673856 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a34d85b-3d52-4c22-bb89-ca22d9f45eeb-log-httpd\") pod \"5a34d85b-3d52-4c22-bb89-ca22d9f45eeb\" (UID: \"5a34d85b-3d52-4c22-bb89-ca22d9f45eeb\") "
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.673963 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a34d85b-3d52-4c22-bb89-ca22d9f45eeb-internal-tls-certs\") pod \"5a34d85b-3d52-4c22-bb89-ca22d9f45eeb\" (UID: \"5a34d85b-3d52-4c22-bb89-ca22d9f45eeb\") "
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.673987 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a34d85b-3d52-4c22-bb89-ca22d9f45eeb-run-httpd\") pod \"5a34d85b-3d52-4c22-bb89-ca22d9f45eeb\" (UID: \"5a34d85b-3d52-4c22-bb89-ca22d9f45eeb\") "
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.674035 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2jcv\" (UniqueName: \"kubernetes.io/projected/5a34d85b-3d52-4c22-bb89-ca22d9f45eeb-kube-api-access-r2jcv\") pod \"5a34d85b-3d52-4c22-bb89-ca22d9f45eeb\" (UID: \"5a34d85b-3d52-4c22-bb89-ca22d9f45eeb\") "
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.674084 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a34d85b-3d52-4c22-bb89-ca22d9f45eeb-combined-ca-bundle\") pod \"5a34d85b-3d52-4c22-bb89-ca22d9f45eeb\" (UID: \"5a34d85b-3d52-4c22-bb89-ca22d9f45eeb\") "
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.674130 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a34d85b-3d52-4c22-bb89-ca22d9f45eeb-config-data\") pod \"5a34d85b-3d52-4c22-bb89-ca22d9f45eeb\" (UID: \"5a34d85b-3d52-4c22-bb89-ca22d9f45eeb\") "
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.677245 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a34d85b-3d52-4c22-bb89-ca22d9f45eeb-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5a34d85b-3d52-4c22-bb89-ca22d9f45eeb" (UID: "5a34d85b-3d52-4c22-bb89-ca22d9f45eeb"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.683043 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a34d85b-3d52-4c22-bb89-ca22d9f45eeb-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5a34d85b-3d52-4c22-bb89-ca22d9f45eeb" (UID: "5a34d85b-3d52-4c22-bb89-ca22d9f45eeb"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.691602 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.697633 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.736033 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.737856 5025 scope.go:117] "RemoveContainer" containerID="ed4525e997650f350f16b568eb27e0ffc3ffcbd1f1e7cb94fc621cd10eb9a382"
Oct 07 08:38:16 crc kubenswrapper[5025]: E1007 08:38:16.738714 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed4525e997650f350f16b568eb27e0ffc3ffcbd1f1e7cb94fc621cd10eb9a382\": container with ID starting with ed4525e997650f350f16b568eb27e0ffc3ffcbd1f1e7cb94fc621cd10eb9a382 not found: ID does not exist" containerID="ed4525e997650f350f16b568eb27e0ffc3ffcbd1f1e7cb94fc621cd10eb9a382"
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.738747 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed4525e997650f350f16b568eb27e0ffc3ffcbd1f1e7cb94fc621cd10eb9a382"} err="failed to get container status \"ed4525e997650f350f16b568eb27e0ffc3ffcbd1f1e7cb94fc621cd10eb9a382\": rpc error: code = NotFound desc = could not find container \"ed4525e997650f350f16b568eb27e0ffc3ffcbd1f1e7cb94fc621cd10eb9a382\": container with ID starting with ed4525e997650f350f16b568eb27e0ffc3ffcbd1f1e7cb94fc621cd10eb9a382 not found: ID does not exist"
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.738779 5025 scope.go:117] "RemoveContainer" containerID="ec3a511c8f64d316ae9313f73bf8241c98c7dfe67b2e3fe3b4a4adad87d90535"
Oct 07 08:38:16 crc kubenswrapper[5025]: E1007 08:38:16.739387 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec3a511c8f64d316ae9313f73bf8241c98c7dfe67b2e3fe3b4a4adad87d90535\": container with ID starting with ec3a511c8f64d316ae9313f73bf8241c98c7dfe67b2e3fe3b4a4adad87d90535 not found: ID does not exist" containerID="ec3a511c8f64d316ae9313f73bf8241c98c7dfe67b2e3fe3b4a4adad87d90535"
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.739425 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec3a511c8f64d316ae9313f73bf8241c98c7dfe67b2e3fe3b4a4adad87d90535"} err="failed to get container status \"ec3a511c8f64d316ae9313f73bf8241c98c7dfe67b2e3fe3b4a4adad87d90535\": rpc error: code = NotFound desc = could not find container \"ec3a511c8f64d316ae9313f73bf8241c98c7dfe67b2e3fe3b4a4adad87d90535\": container with ID starting with ec3a511c8f64d316ae9313f73bf8241c98c7dfe67b2e3fe3b4a4adad87d90535 not found: ID does not exist"
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.739449 5025 scope.go:117] "RemoveContainer" containerID="ed4525e997650f350f16b568eb27e0ffc3ffcbd1f1e7cb94fc621cd10eb9a382"
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.739682 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed4525e997650f350f16b568eb27e0ffc3ffcbd1f1e7cb94fc621cd10eb9a382"} err="failed to get container status \"ed4525e997650f350f16b568eb27e0ffc3ffcbd1f1e7cb94fc621cd10eb9a382\": rpc error: code = NotFound desc = could not find container \"ed4525e997650f350f16b568eb27e0ffc3ffcbd1f1e7cb94fc621cd10eb9a382\": container with ID starting with ed4525e997650f350f16b568eb27e0ffc3ffcbd1f1e7cb94fc621cd10eb9a382 not found: ID does not exist"
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.739702 5025 scope.go:117] "RemoveContainer" containerID="ec3a511c8f64d316ae9313f73bf8241c98c7dfe67b2e3fe3b4a4adad87d90535"
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.739863 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec3a511c8f64d316ae9313f73bf8241c98c7dfe67b2e3fe3b4a4adad87d90535"} err="failed to get container status \"ec3a511c8f64d316ae9313f73bf8241c98c7dfe67b2e3fe3b4a4adad87d90535\": rpc error: code = NotFound desc = could not find container \"ec3a511c8f64d316ae9313f73bf8241c98c7dfe67b2e3fe3b4a4adad87d90535\": container with ID starting with ec3a511c8f64d316ae9313f73bf8241c98c7dfe67b2e3fe3b4a4adad87d90535 not found: ID does not exist"
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.739878 5025 scope.go:117] "RemoveContainer" containerID="e09b94886fdcccfceea04d4af844384d8eb419aec0bda3ba596e47528bb18abd"
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.750446 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a34d85b-3d52-4c22-bb89-ca22d9f45eeb-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "5a34d85b-3d52-4c22-bb89-ca22d9f45eeb" (UID: "5a34d85b-3d52-4c22-bb89-ca22d9f45eeb"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.758051 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.772294 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a34d85b-3d52-4c22-bb89-ca22d9f45eeb-kube-api-access-r2jcv" (OuterVolumeSpecName: "kube-api-access-r2jcv") pod "5a34d85b-3d52-4c22-bb89-ca22d9f45eeb" (UID: "5a34d85b-3d52-4c22-bb89-ca22d9f45eeb"). InnerVolumeSpecName "kube-api-access-r2jcv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.777355 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-bxmbm"]
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.777824 5025 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a34d85b-3d52-4c22-bb89-ca22d9f45eeb-log-httpd\") on node \"crc\" DevicePath \"\""
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.777862 5025 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a34d85b-3d52-4c22-bb89-ca22d9f45eeb-run-httpd\") on node \"crc\" DevicePath \"\""
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.777872 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2jcv\" (UniqueName: \"kubernetes.io/projected/5a34d85b-3d52-4c22-bb89-ca22d9f45eeb-kube-api-access-r2jcv\") on node \"crc\" DevicePath \"\""
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.777881 5025 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5a34d85b-3d52-4c22-bb89-ca22d9f45eeb-etc-swift\") on node \"crc\" DevicePath \"\""
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.783975 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-bxmbm"]
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.791685 5025 scope.go:117] "RemoveContainer" containerID="bc8d2241ccc0693c6d61ae5731ed3c9ec865ccbf9da2480484d4167078e69af8"
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.801687 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.881965 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f770079-7ba4-4076-bd29-15fab31fc53d-combined-ca-bundle\") pod \"8f770079-7ba4-4076-bd29-15fab31fc53d\" (UID: \"8f770079-7ba4-4076-bd29-15fab31fc53d\") "
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.882064 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xg6s\" (UniqueName: \"kubernetes.io/projected/8f770079-7ba4-4076-bd29-15fab31fc53d-kube-api-access-7xg6s\") pod \"8f770079-7ba4-4076-bd29-15fab31fc53d\" (UID: \"8f770079-7ba4-4076-bd29-15fab31fc53d\") "
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.882092 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f770079-7ba4-4076-bd29-15fab31fc53d-vencrypt-tls-certs\") pod \"8f770079-7ba4-4076-bd29-15fab31fc53d\" (UID: \"8f770079-7ba4-4076-bd29-15fab31fc53d\") "
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.882170 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f770079-7ba4-4076-bd29-15fab31fc53d-config-data\") pod \"8f770079-7ba4-4076-bd29-15fab31fc53d\" (UID: \"8f770079-7ba4-4076-bd29-15fab31fc53d\") "
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.882283 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f770079-7ba4-4076-bd29-15fab31fc53d-nova-novncproxy-tls-certs\") pod \"8f770079-7ba4-4076-bd29-15fab31fc53d\" (UID: \"8f770079-7ba4-4076-bd29-15fab31fc53d\") "
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.885718 5025 scope.go:117] "RemoveContainer" containerID="e09b94886fdcccfceea04d4af844384d8eb419aec0bda3ba596e47528bb18abd"
Oct 07 08:38:16 crc kubenswrapper[5025]: E1007 08:38:16.889874 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e09b94886fdcccfceea04d4af844384d8eb419aec0bda3ba596e47528bb18abd\": container with ID starting with e09b94886fdcccfceea04d4af844384d8eb419aec0bda3ba596e47528bb18abd not found: ID does not exist" containerID="e09b94886fdcccfceea04d4af844384d8eb419aec0bda3ba596e47528bb18abd"
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.889998 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e09b94886fdcccfceea04d4af844384d8eb419aec0bda3ba596e47528bb18abd"} err="failed to get container status \"e09b94886fdcccfceea04d4af844384d8eb419aec0bda3ba596e47528bb18abd\": rpc error: code = NotFound desc = could not find container \"e09b94886fdcccfceea04d4af844384d8eb419aec0bda3ba596e47528bb18abd\": container with ID starting with e09b94886fdcccfceea04d4af844384d8eb419aec0bda3ba596e47528bb18abd not found: ID does not exist"
Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.890502 5025 scope.go:117] "RemoveContainer" containerID="bc8d2241ccc0693c6d61ae5731ed3c9ec865ccbf9da2480484d4167078e69af8"
Oct 07 08:38:16 crc kubenswrapper[5025]: E1007 08:38:16.894768 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc8d2241ccc0693c6d61ae5731ed3c9ec865ccbf9da2480484d4167078e69af8\": container with ID starting with bc8d2241ccc0693c6d61ae5731ed3c9ec865ccbf9da2480484d4167078e69af8
not found: ID does not exist" containerID="bc8d2241ccc0693c6d61ae5731ed3c9ec865ccbf9da2480484d4167078e69af8" Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.894808 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc8d2241ccc0693c6d61ae5731ed3c9ec865ccbf9da2480484d4167078e69af8"} err="failed to get container status \"bc8d2241ccc0693c6d61ae5731ed3c9ec865ccbf9da2480484d4167078e69af8\": rpc error: code = NotFound desc = could not find container \"bc8d2241ccc0693c6d61ae5731ed3c9ec865ccbf9da2480484d4167078e69af8\": container with ID starting with bc8d2241ccc0693c6d61ae5731ed3c9ec865ccbf9da2480484d4167078e69af8 not found: ID does not exist" Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.894836 5025 scope.go:117] "RemoveContainer" containerID="e09b94886fdcccfceea04d4af844384d8eb419aec0bda3ba596e47528bb18abd" Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.895384 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e09b94886fdcccfceea04d4af844384d8eb419aec0bda3ba596e47528bb18abd"} err="failed to get container status \"e09b94886fdcccfceea04d4af844384d8eb419aec0bda3ba596e47528bb18abd\": rpc error: code = NotFound desc = could not find container \"e09b94886fdcccfceea04d4af844384d8eb419aec0bda3ba596e47528bb18abd\": container with ID starting with e09b94886fdcccfceea04d4af844384d8eb419aec0bda3ba596e47528bb18abd not found: ID does not exist" Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.895430 5025 scope.go:117] "RemoveContainer" containerID="bc8d2241ccc0693c6d61ae5731ed3c9ec865ccbf9da2480484d4167078e69af8" Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.895858 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc8d2241ccc0693c6d61ae5731ed3c9ec865ccbf9da2480484d4167078e69af8"} err="failed to get container status \"bc8d2241ccc0693c6d61ae5731ed3c9ec865ccbf9da2480484d4167078e69af8\": rpc 
error: code = NotFound desc = could not find container \"bc8d2241ccc0693c6d61ae5731ed3c9ec865ccbf9da2480484d4167078e69af8\": container with ID starting with bc8d2241ccc0693c6d61ae5731ed3c9ec865ccbf9da2480484d4167078e69af8 not found: ID does not exist" Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.895890 5025 scope.go:117] "RemoveContainer" containerID="628399cf9d75d695aa72a1837a9b4aa8d0c453aef85214b36b537198afb78e37" Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.898720 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f770079-7ba4-4076-bd29-15fab31fc53d-kube-api-access-7xg6s" (OuterVolumeSpecName: "kube-api-access-7xg6s") pod "8f770079-7ba4-4076-bd29-15fab31fc53d" (UID: "8f770079-7ba4-4076-bd29-15fab31fc53d"). InnerVolumeSpecName "kube-api-access-7xg6s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.949839 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a34d85b-3d52-4c22-bb89-ca22d9f45eeb-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5a34d85b-3d52-4c22-bb89-ca22d9f45eeb" (UID: "5a34d85b-3d52-4c22-bb89-ca22d9f45eeb"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.952722 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f770079-7ba4-4076-bd29-15fab31fc53d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8f770079-7ba4-4076-bd29-15fab31fc53d" (UID: "8f770079-7ba4-4076-bd29-15fab31fc53d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.973260 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f770079-7ba4-4076-bd29-15fab31fc53d-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "8f770079-7ba4-4076-bd29-15fab31fc53d" (UID: "8f770079-7ba4-4076-bd29-15fab31fc53d"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.979781 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f770079-7ba4-4076-bd29-15fab31fc53d-config-data" (OuterVolumeSpecName: "config-data") pod "8f770079-7ba4-4076-bd29-15fab31fc53d" (UID: "8f770079-7ba4-4076-bd29-15fab31fc53d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.981726 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f770079-7ba4-4076-bd29-15fab31fc53d-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "8f770079-7ba4-4076-bd29-15fab31fc53d" (UID: "8f770079-7ba4-4076-bd29-15fab31fc53d"). InnerVolumeSpecName "vencrypt-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.985196 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xg6s\" (UniqueName: \"kubernetes.io/projected/8f770079-7ba4-4076-bd29-15fab31fc53d-kube-api-access-7xg6s\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.985229 5025 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f770079-7ba4-4076-bd29-15fab31fc53d-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.985242 5025 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f770079-7ba4-4076-bd29-15fab31fc53d-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.985253 5025 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a34d85b-3d52-4c22-bb89-ca22d9f45eeb-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.985265 5025 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f770079-7ba4-4076-bd29-15fab31fc53d-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.985275 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f770079-7ba4-4076-bd29-15fab31fc53d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:16 crc kubenswrapper[5025]: E1007 08:38:16.985343 5025 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 07 08:38:16 crc kubenswrapper[5025]: E1007 08:38:16.985392 5025 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/d46577dd-b38b-4b80-ad57-577629e648b8-config-data podName:d46577dd-b38b-4b80-ad57-577629e648b8 nodeName:}" failed. No retries permitted until 2025-10-07 08:38:20.985375952 +0000 UTC m=+1307.794690096 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/d46577dd-b38b-4b80-ad57-577629e648b8-config-data") pod "rabbitmq-server-0" (UID: "d46577dd-b38b-4b80-ad57-577629e648b8") : configmap "rabbitmq-config-data" not found Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.990910 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a34d85b-3d52-4c22-bb89-ca22d9f45eeb-config-data" (OuterVolumeSpecName: "config-data") pod "5a34d85b-3d52-4c22-bb89-ca22d9f45eeb" (UID: "5a34d85b-3d52-4c22-bb89-ca22d9f45eeb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:16 crc kubenswrapper[5025]: I1007 08:38:16.991987 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a34d85b-3d52-4c22-bb89-ca22d9f45eeb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5a34d85b-3d52-4c22-bb89-ca22d9f45eeb" (UID: "5a34d85b-3d52-4c22-bb89-ca22d9f45eeb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.006697 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a34d85b-3d52-4c22-bb89-ca22d9f45eeb-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5a34d85b-3d52-4c22-bb89-ca22d9f45eeb" (UID: "5a34d85b-3d52-4c22-bb89-ca22d9f45eeb"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:17 crc kubenswrapper[5025]: E1007 08:38:17.024328 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ec2f854063009f0fa5535dfd83f2d60df934fb8c26cc1fa50754cfaf068846d7" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 07 08:38:17 crc kubenswrapper[5025]: E1007 08:38:17.025787 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ec2f854063009f0fa5535dfd83f2d60df934fb8c26cc1fa50754cfaf068846d7" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 07 08:38:17 crc kubenswrapper[5025]: E1007 08:38:17.027012 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ec2f854063009f0fa5535dfd83f2d60df934fb8c26cc1fa50754cfaf068846d7" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 07 08:38:17 crc kubenswrapper[5025]: E1007 08:38:17.027042 5025 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="dd481497-9604-41b6-918a-a8ec9fd6ad92" containerName="ovn-northd" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.087644 5025 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a34d85b-3d52-4c22-bb89-ca22d9f45eeb-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.087673 5025 reconciler_common.go:293] "Volume detached for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a34d85b-3d52-4c22-bb89-ca22d9f45eeb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.087743 5025 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a34d85b-3d52-4c22-bb89-ca22d9f45eeb-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.241020 5025 generic.go:334] "Generic (PLEG): container finished" podID="5db22b1d-c7bd-4bc9-b30b-3271da7741e1" containerID="fe88f33ff0949c6497d1f403bf9b543b6f9b6a8feba3d54578e33f486413a22f" exitCode=0 Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.241518 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"5db22b1d-c7bd-4bc9-b30b-3271da7741e1","Type":"ContainerDied","Data":"fe88f33ff0949c6497d1f403bf9b543b6f9b6a8feba3d54578e33f486413a22f"} Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.244256 5025 generic.go:334] "Generic (PLEG): container finished" podID="9740b484-8952-4253-9158-17164236ffcc" containerID="3dea8fbfe81c9050731ad9beb8566abea1bfd95a8e42f4666c9d09f77e5b122b" exitCode=0 Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.244795 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement5528-account-delete-wnxjq" event={"ID":"9740b484-8952-4253-9158-17164236ffcc","Type":"ContainerDied","Data":"3dea8fbfe81c9050731ad9beb8566abea1bfd95a8e42f4666c9d09f77e5b122b"} Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.244852 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement5528-account-delete-wnxjq" event={"ID":"9740b484-8952-4253-9158-17164236ffcc","Type":"ContainerStarted","Data":"480767b0f2dd3058d0ede817291740e21cec62ecb058c5992a5fcb16740f255e"} Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.252753 5025 generic.go:334] "Generic (PLEG): container finished" 
podID="b956a819-1c8f-471c-a738-fe4d78fbbb98" containerID="24e26a1c9452261b7222faac9bd3c8759890a7e885ed4f0a26471b46a6648ae1" exitCode=0 Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.252823 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron0ee4-account-delete-8cznp" event={"ID":"b956a819-1c8f-471c-a738-fe4d78fbbb98","Type":"ContainerDied","Data":"24e26a1c9452261b7222faac9bd3c8759890a7e885ed4f0a26471b46a6648ae1"} Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.252847 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron0ee4-account-delete-8cznp" event={"ID":"b956a819-1c8f-471c-a738-fe4d78fbbb98","Type":"ContainerStarted","Data":"bb37185751b3f42fb7a211d088019b1a6334cba73383756d1ddfae1ff2f6b20d"} Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.259669 5025 generic.go:334] "Generic (PLEG): container finished" podID="8f770079-7ba4-4076-bd29-15fab31fc53d" containerID="c224c26e70e8b206863d04bee8de16f8a66f613a49bcfb44926fbe08bbd97886" exitCode=0 Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.259732 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8f770079-7ba4-4076-bd29-15fab31fc53d","Type":"ContainerDied","Data":"c224c26e70e8b206863d04bee8de16f8a66f613a49bcfb44926fbe08bbd97886"} Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.259760 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8f770079-7ba4-4076-bd29-15fab31fc53d","Type":"ContainerDied","Data":"c3c68c6543e6491e76f0bbb26f233e8307132662027970ec301e4a78036bcbab"} Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.259830 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.267057 5025 generic.go:334] "Generic (PLEG): container finished" podID="041675fd-b2c1-4e9c-9b05-4a2aef6d329f" containerID="d89c8dd7b87c42dab76a0b14969e3fb390ccaaeddad4b9fda9391555044db71a" exitCode=1 Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.267121 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell150ed-account-delete-mqxng" event={"ID":"041675fd-b2c1-4e9c-9b05-4a2aef6d329f","Type":"ContainerDied","Data":"d89c8dd7b87c42dab76a0b14969e3fb390ccaaeddad4b9fda9391555044db71a"} Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.267148 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell150ed-account-delete-mqxng" event={"ID":"041675fd-b2c1-4e9c-9b05-4a2aef6d329f","Type":"ContainerStarted","Data":"ce5882e38e3f83b4ffc5ad5ca3adfe9a87c6388a9986a1b8a615968c0b17f4e2"} Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.272608 5025 generic.go:334] "Generic (PLEG): container finished" podID="5a34d85b-3d52-4c22-bb89-ca22d9f45eeb" containerID="fea511675388812860a58e922460e16718d212bb001ceaba429f1ddc5d7e5559" exitCode=0 Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.272760 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6c4bb756cf-6b9fc" event={"ID":"5a34d85b-3d52-4c22-bb89-ca22d9f45eeb","Type":"ContainerDied","Data":"fea511675388812860a58e922460e16718d212bb001ceaba429f1ddc5d7e5559"} Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.272793 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6c4bb756cf-6b9fc" event={"ID":"5a34d85b-3d52-4c22-bb89-ca22d9f45eeb","Type":"ContainerDied","Data":"4e7e371efe7968cef3dd9d8ac0627d937b6998390fd725596e1416431a9c21d9"} Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.272947 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-6c4bb756cf-6b9fc" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.275641 5025 generic.go:334] "Generic (PLEG): container finished" podID="c87571c0-a1b9-4187-a60c-dd66d9fbb301" containerID="399bca0216396656c867a3f5730b1d3a96e7ba8bd1d46dd81ca0fc9e3d104c68" exitCode=0 Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.275704 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinderb478-account-delete-p6rj5" event={"ID":"c87571c0-a1b9-4187-a60c-dd66d9fbb301","Type":"ContainerDied","Data":"399bca0216396656c867a3f5730b1d3a96e7ba8bd1d46dd81ca0fc9e3d104c68"} Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.279264 5025 generic.go:334] "Generic (PLEG): container finished" podID="582efaa6-0e81-462a-813f-7ba1cf9c6fc2" containerID="681d0f55b346cbe735938996d113e2740ed0c5031feeac188e0b54605836e911" exitCode=0 Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.279315 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"582efaa6-0e81-462a-813f-7ba1cf9c6fc2","Type":"ContainerDied","Data":"681d0f55b346cbe735938996d113e2740ed0c5031feeac188e0b54605836e911"} Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.293645 5025 generic.go:334] "Generic (PLEG): container finished" podID="9fc34125-4736-4467-bf0e-ec3b211e6d13" containerID="37a99151a2993186ee570500792634aa31edd00f179d56fe2993e992303f092f" exitCode=0 Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.294038 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9fc34125-4736-4467-bf0e-ec3b211e6d13","Type":"ContainerDied","Data":"37a99151a2993186ee570500792634aa31edd00f179d56fe2993e992303f092f"} Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.294091 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"9fc34125-4736-4467-bf0e-ec3b211e6d13","Type":"ContainerDied","Data":"1a42cfeb52c0ed9962d03e8cd19d8def09a7c1dd9afe1814618bde979fabfcef"} Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.294108 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a42cfeb52c0ed9962d03e8cd19d8def09a7c1dd9afe1814618bde979fabfcef" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.353913 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.357600 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ba0f744b-7d32-4337-be25-f4f6326aaec2" containerName="ceilometer-central-agent" containerID="cri-o://0d4721c409ed563195fffda2f9874618fd3f3cc7746c05dc5decaf962347f14b" gracePeriod=30 Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.357752 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ba0f744b-7d32-4337-be25-f4f6326aaec2" containerName="proxy-httpd" containerID="cri-o://a1a35e09f930c853a6483e0e5e192b2b3f584932be09beeb5bd496d456b573a1" gracePeriod=30 Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.357788 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ba0f744b-7d32-4337-be25-f4f6326aaec2" containerName="sg-core" containerID="cri-o://7b52de27036b56d8d85f64dd500bd9547ec47d9718e92cc57aa2c682ce41fe24" gracePeriod=30 Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.357821 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ba0f744b-7d32-4337-be25-f4f6326aaec2" containerName="ceilometer-notification-agent" containerID="cri-o://a4937f814a2ec01bb17a146ab80071941530367ba22e6ebb3a6adb3bf515dd0a" gracePeriod=30 Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.384754 5025 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.384974 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="732b0bbc-f02e-4ced-9230-af95c9654b93" containerName="kube-state-metrics" containerID="cri-o://48e5ab1102a8da55abb1bed760339663a999e777637527039fd43637c874b375" gracePeriod=30 Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.477908 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.506206 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.506208 5025 scope.go:117] "RemoveContainer" containerID="39701f3461ec57f87ded44047ecffc35edf679b1b33f29a9dedea8b73829426d" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.533636 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-6c4bb756cf-6b9fc"] Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.559610 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-6c4bb756cf-6b9fc"] Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.592793 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.593228 5025 scope.go:117] "RemoveContainer" containerID="c224c26e70e8b206863d04bee8de16f8a66f613a49bcfb44926fbe08bbd97886" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.598957 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27dfr\" (UniqueName: \"kubernetes.io/projected/5db22b1d-c7bd-4bc9-b30b-3271da7741e1-kube-api-access-27dfr\") pod \"5db22b1d-c7bd-4bc9-b30b-3271da7741e1\" (UID: 
\"5db22b1d-c7bd-4bc9-b30b-3271da7741e1\") " Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.599010 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fc34125-4736-4467-bf0e-ec3b211e6d13-config-data\") pod \"9fc34125-4736-4467-bf0e-ec3b211e6d13\" (UID: \"9fc34125-4736-4467-bf0e-ec3b211e6d13\") " Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.599092 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9fc34125-4736-4467-bf0e-ec3b211e6d13-config-data-custom\") pod \"9fc34125-4736-4467-bf0e-ec3b211e6d13\" (UID: \"9fc34125-4736-4467-bf0e-ec3b211e6d13\") " Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.599160 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fc34125-4736-4467-bf0e-ec3b211e6d13-combined-ca-bundle\") pod \"9fc34125-4736-4467-bf0e-ec3b211e6d13\" (UID: \"9fc34125-4736-4467-bf0e-ec3b211e6d13\") " Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.599191 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmxnh\" (UniqueName: \"kubernetes.io/projected/9fc34125-4736-4467-bf0e-ec3b211e6d13-kube-api-access-cmxnh\") pod \"9fc34125-4736-4467-bf0e-ec3b211e6d13\" (UID: \"9fc34125-4736-4467-bf0e-ec3b211e6d13\") " Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.599292 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9fc34125-4736-4467-bf0e-ec3b211e6d13-etc-machine-id\") pod \"9fc34125-4736-4467-bf0e-ec3b211e6d13\" (UID: \"9fc34125-4736-4467-bf0e-ec3b211e6d13\") " Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.599329 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/5db22b1d-c7bd-4bc9-b30b-3271da7741e1-config-data\") pod \"5db22b1d-c7bd-4bc9-b30b-3271da7741e1\" (UID: \"5db22b1d-c7bd-4bc9-b30b-3271da7741e1\") " Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.599355 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5db22b1d-c7bd-4bc9-b30b-3271da7741e1-combined-ca-bundle\") pod \"5db22b1d-c7bd-4bc9-b30b-3271da7741e1\" (UID: \"5db22b1d-c7bd-4bc9-b30b-3271da7741e1\") " Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.599374 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fc34125-4736-4467-bf0e-ec3b211e6d13-scripts\") pod \"9fc34125-4736-4467-bf0e-ec3b211e6d13\" (UID: \"9fc34125-4736-4467-bf0e-ec3b211e6d13\") " Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.605678 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9fc34125-4736-4467-bf0e-ec3b211e6d13-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "9fc34125-4736-4467-bf0e-ec3b211e6d13" (UID: "9fc34125-4736-4467-bf0e-ec3b211e6d13"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.606956 5025 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9fc34125-4736-4467-bf0e-ec3b211e6d13-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.611620 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fc34125-4736-4467-bf0e-ec3b211e6d13-kube-api-access-cmxnh" (OuterVolumeSpecName: "kube-api-access-cmxnh") pod "9fc34125-4736-4467-bf0e-ec3b211e6d13" (UID: "9fc34125-4736-4467-bf0e-ec3b211e6d13"). InnerVolumeSpecName "kube-api-access-cmxnh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.630435 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.630973 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fc34125-4736-4467-bf0e-ec3b211e6d13-scripts" (OuterVolumeSpecName: "scripts") pod "9fc34125-4736-4467-bf0e-ec3b211e6d13" (UID: "9fc34125-4736-4467-bf0e-ec3b211e6d13"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.669807 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5db22b1d-c7bd-4bc9-b30b-3271da7741e1-kube-api-access-27dfr" (OuterVolumeSpecName: "kube-api-access-27dfr") pod "5db22b1d-c7bd-4bc9-b30b-3271da7741e1" (UID: "5db22b1d-c7bd-4bc9-b30b-3271da7741e1"). InnerVolumeSpecName "kube-api-access-27dfr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.669925 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fc34125-4736-4467-bf0e-ec3b211e6d13-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9fc34125-4736-4467-bf0e-ec3b211e6d13" (UID: "9fc34125-4736-4467-bf0e-ec3b211e6d13"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.685526 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.685785 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="19d1598f-6725-40cd-99bd-5ee1bf699225" containerName="memcached" containerID="cri-o://7148fbdd1c0e248299c7313ecafd5bd390337303adc23a3d97fe357b1062f6e1" gracePeriod=30 Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.711624 5025 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9fc34125-4736-4467-bf0e-ec3b211e6d13-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.711651 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmxnh\" (UniqueName: \"kubernetes.io/projected/9fc34125-4736-4467-bf0e-ec3b211e6d13-kube-api-access-cmxnh\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.711660 5025 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fc34125-4736-4467-bf0e-ec3b211e6d13-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.711668 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27dfr\" (UniqueName: \"kubernetes.io/projected/5db22b1d-c7bd-4bc9-b30b-3271da7741e1-kube-api-access-27dfr\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.723803 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5db22b1d-c7bd-4bc9-b30b-3271da7741e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5db22b1d-c7bd-4bc9-b30b-3271da7741e1" (UID: "5db22b1d-c7bd-4bc9-b30b-3271da7741e1"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.738383 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-wxjnm"] Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.742774 5025 scope.go:117] "RemoveContainer" containerID="c224c26e70e8b206863d04bee8de16f8a66f613a49bcfb44926fbe08bbd97886" Oct 07 08:38:17 crc kubenswrapper[5025]: E1007 08:38:17.751974 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c224c26e70e8b206863d04bee8de16f8a66f613a49bcfb44926fbe08bbd97886\": container with ID starting with c224c26e70e8b206863d04bee8de16f8a66f613a49bcfb44926fbe08bbd97886 not found: ID does not exist" containerID="c224c26e70e8b206863d04bee8de16f8a66f613a49bcfb44926fbe08bbd97886" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.752028 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c224c26e70e8b206863d04bee8de16f8a66f613a49bcfb44926fbe08bbd97886"} err="failed to get container status \"c224c26e70e8b206863d04bee8de16f8a66f613a49bcfb44926fbe08bbd97886\": rpc error: code = NotFound desc = could not find container \"c224c26e70e8b206863d04bee8de16f8a66f613a49bcfb44926fbe08bbd97886\": container with ID starting with c224c26e70e8b206863d04bee8de16f8a66f613a49bcfb44926fbe08bbd97886 not found: ID does not exist" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.752062 5025 scope.go:117] "RemoveContainer" containerID="fea511675388812860a58e922460e16718d212bb001ceaba429f1ddc5d7e5559" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.753767 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-wxjnm"] Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.767581 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/5db22b1d-c7bd-4bc9-b30b-3271da7741e1-config-data" (OuterVolumeSpecName: "config-data") pod "5db22b1d-c7bd-4bc9-b30b-3271da7741e1" (UID: "5db22b1d-c7bd-4bc9-b30b-3271da7741e1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.769842 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fc34125-4736-4467-bf0e-ec3b211e6d13-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9fc34125-4736-4467-bf0e-ec3b211e6d13" (UID: "9fc34125-4736-4467-bf0e-ec3b211e6d13"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.771611 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-vz2wg"] Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.771617 5025 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.166:8776/healthcheck\": read tcp 10.217.0.2:40098->10.217.0.166:8776: read: connection reset by peer" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.783945 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-vz2wg"] Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.786346 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-6f574885d6-269v8"] Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.786596 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-6f574885d6-269v8" podUID="a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef" containerName="keystone-api" containerID="cri-o://765cf6b91f229f95c44f7ed7fc5d4c222450bb09dc9b1216f217141cfe04545c" gracePeriod=30 Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 
08:38:17.802777 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone1b46-account-delete-sszts"] Oct 07 08:38:17 crc kubenswrapper[5025]: E1007 08:38:17.803204 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f76b8e5-9257-4a0f-8067-ac36ccbe6711" containerName="openstack-network-exporter" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.803222 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f76b8e5-9257-4a0f-8067-ac36ccbe6711" containerName="openstack-network-exporter" Oct 07 08:38:17 crc kubenswrapper[5025]: E1007 08:38:17.803242 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="219d3031-42b0-40df-98fc-46fe548a8f39" containerName="init" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.803248 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="219d3031-42b0-40df-98fc-46fe548a8f39" containerName="init" Oct 07 08:38:17 crc kubenswrapper[5025]: E1007 08:38:17.803255 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a34d85b-3d52-4c22-bb89-ca22d9f45eeb" containerName="proxy-server" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.803261 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a34d85b-3d52-4c22-bb89-ca22d9f45eeb" containerName="proxy-server" Oct 07 08:38:17 crc kubenswrapper[5025]: E1007 08:38:17.803270 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e01de5b-7e9d-4f70-91d9-d55165a8eb32" containerName="ovsdbserver-nb" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.803277 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e01de5b-7e9d-4f70-91d9-d55165a8eb32" containerName="ovsdbserver-nb" Oct 07 08:38:17 crc kubenswrapper[5025]: E1007 08:38:17.803292 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6889a88-68ef-4bf3-9e7e-78c6d84785ae" containerName="ovn-controller" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.803299 5025 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="d6889a88-68ef-4bf3-9e7e-78c6d84785ae" containerName="ovn-controller" Oct 07 08:38:17 crc kubenswrapper[5025]: E1007 08:38:17.803310 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8ac1818-6553-400f-91be-19b032aae626" containerName="openstack-network-exporter" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.803317 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8ac1818-6553-400f-91be-19b032aae626" containerName="openstack-network-exporter" Oct 07 08:38:17 crc kubenswrapper[5025]: E1007 08:38:17.803326 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="219d3031-42b0-40df-98fc-46fe548a8f39" containerName="dnsmasq-dns" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.803332 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="219d3031-42b0-40df-98fc-46fe548a8f39" containerName="dnsmasq-dns" Oct 07 08:38:17 crc kubenswrapper[5025]: E1007 08:38:17.803339 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fc34125-4736-4467-bf0e-ec3b211e6d13" containerName="probe" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.803345 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fc34125-4736-4467-bf0e-ec3b211e6d13" containerName="probe" Oct 07 08:38:17 crc kubenswrapper[5025]: E1007 08:38:17.803357 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e01de5b-7e9d-4f70-91d9-d55165a8eb32" containerName="openstack-network-exporter" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.803364 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e01de5b-7e9d-4f70-91d9-d55165a8eb32" containerName="openstack-network-exporter" Oct 07 08:38:17 crc kubenswrapper[5025]: E1007 08:38:17.803373 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a34d85b-3d52-4c22-bb89-ca22d9f45eeb" containerName="proxy-httpd" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.803378 5025 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="5a34d85b-3d52-4c22-bb89-ca22d9f45eeb" containerName="proxy-httpd" Oct 07 08:38:17 crc kubenswrapper[5025]: E1007 08:38:17.803388 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8ac1818-6553-400f-91be-19b032aae626" containerName="ovsdbserver-sb" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.803395 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8ac1818-6553-400f-91be-19b032aae626" containerName="ovsdbserver-sb" Oct 07 08:38:17 crc kubenswrapper[5025]: E1007 08:38:17.803410 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fc34125-4736-4467-bf0e-ec3b211e6d13" containerName="cinder-scheduler" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.803416 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fc34125-4736-4467-bf0e-ec3b211e6d13" containerName="cinder-scheduler" Oct 07 08:38:17 crc kubenswrapper[5025]: E1007 08:38:17.803423 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5db22b1d-c7bd-4bc9-b30b-3271da7741e1" containerName="nova-cell1-conductor-conductor" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.803429 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="5db22b1d-c7bd-4bc9-b30b-3271da7741e1" containerName="nova-cell1-conductor-conductor" Oct 07 08:38:17 crc kubenswrapper[5025]: E1007 08:38:17.803439 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f770079-7ba4-4076-bd29-15fab31fc53d" containerName="nova-cell1-novncproxy-novncproxy" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.803447 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f770079-7ba4-4076-bd29-15fab31fc53d" containerName="nova-cell1-novncproxy-novncproxy" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.803646 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f770079-7ba4-4076-bd29-15fab31fc53d" containerName="nova-cell1-novncproxy-novncproxy" Oct 07 08:38:17 crc 
kubenswrapper[5025]: I1007 08:38:17.803662 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8ac1818-6553-400f-91be-19b032aae626" containerName="ovsdbserver-sb" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.803670 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e01de5b-7e9d-4f70-91d9-d55165a8eb32" containerName="openstack-network-exporter" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.803681 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="219d3031-42b0-40df-98fc-46fe548a8f39" containerName="dnsmasq-dns" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.803695 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fc34125-4736-4467-bf0e-ec3b211e6d13" containerName="cinder-scheduler" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.803704 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8ac1818-6553-400f-91be-19b032aae626" containerName="openstack-network-exporter" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.803713 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fc34125-4736-4467-bf0e-ec3b211e6d13" containerName="probe" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.803727 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f76b8e5-9257-4a0f-8067-ac36ccbe6711" containerName="openstack-network-exporter" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.803738 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a34d85b-3d52-4c22-bb89-ca22d9f45eeb" containerName="proxy-httpd" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.803744 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="5db22b1d-c7bd-4bc9-b30b-3271da7741e1" containerName="nova-cell1-conductor-conductor" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.803753 5025 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="8e01de5b-7e9d-4f70-91d9-d55165a8eb32" containerName="ovsdbserver-nb" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.803763 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a34d85b-3d52-4c22-bb89-ca22d9f45eeb" containerName="proxy-server" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.803774 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6889a88-68ef-4bf3-9e7e-78c6d84785ae" containerName="ovn-controller" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.804302 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.804646 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone1b46-account-delete-sszts" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.811974 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone1b46-account-delete-sszts"] Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.813467 5025 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5db22b1d-c7bd-4bc9-b30b-3271da7741e1-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.813484 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5db22b1d-c7bd-4bc9-b30b-3271da7741e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.813494 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fc34125-4736-4467-bf0e-ec3b211e6d13-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.819855 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-lcsnf"] Oct 07 08:38:17 crc kubenswrapper[5025]: 
I1007 08:38:17.833891 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-lcsnf"] Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.838308 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-b478-account-create-pg87g"] Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.842996 5025 scope.go:117] "RemoveContainer" containerID="e424505ab43fcd0b5722f275f78973479bdce125b1c35d3c687a07c264ff792e" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.852529 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinderb478-account-delete-p6rj5"] Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.853338 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-b478-account-create-pg87g"] Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.859010 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-gzznk"] Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.864315 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fc34125-4736-4467-bf0e-ec3b211e6d13-config-data" (OuterVolumeSpecName: "config-data") pod "9fc34125-4736-4467-bf0e-ec3b211e6d13" (UID: "9fc34125-4736-4467-bf0e-ec3b211e6d13"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.865323 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-gzznk"] Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.880992 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone1b46-account-delete-sszts"] Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.894353 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-1b46-account-create-lgjrd"] Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.901136 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-1b46-account-create-lgjrd"] Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.920708 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2k27\" (UniqueName: \"kubernetes.io/projected/4211d4e2-eae9-4754-9bd1-42197c704bc6-kube-api-access-f2k27\") pod \"keystone1b46-account-delete-sszts\" (UID: \"4211d4e2-eae9-4754-9bd1-42197c704bc6\") " pod="openstack/keystone1b46-account-delete-sszts" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.940316 5025 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fc34125-4736-4467-bf0e-ec3b211e6d13-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.961807 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13598166-30eb-43b2-8a13-2e2ca72f58e9" path="/var/lib/kubelet/pods/13598166-30eb-43b2-8a13-2e2ca72f58e9/volumes" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.965339 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="219d3031-42b0-40df-98fc-46fe548a8f39" path="/var/lib/kubelet/pods/219d3031-42b0-40df-98fc-46fe548a8f39/volumes" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.967297 5025 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40c6fe82-d552-450a-845b-8c5da8bf5423" path="/var/lib/kubelet/pods/40c6fe82-d552-450a-845b-8c5da8bf5423/volumes" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.968248 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a34d85b-3d52-4c22-bb89-ca22d9f45eeb" path="/var/lib/kubelet/pods/5a34d85b-3d52-4c22-bb89-ca22d9f45eeb/volumes" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.976993 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e01de5b-7e9d-4f70-91d9-d55165a8eb32" path="/var/lib/kubelet/pods/8e01de5b-7e9d-4f70-91d9-d55165a8eb32/volumes" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.985475 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f770079-7ba4-4076-bd29-15fab31fc53d" path="/var/lib/kubelet/pods/8f770079-7ba4-4076-bd29-15fab31fc53d/volumes" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.986220 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a69558c-23d2-4755-95a7-0497a3baec0c" path="/var/lib/kubelet/pods/9a69558c-23d2-4755-95a7-0497a3baec0c/volumes" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.986765 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c065f6f6-5301-4933-8794-606f5b61464e" path="/var/lib/kubelet/pods/c065f6f6-5301-4933-8794-606f5b61464e/volumes" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.987595 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9" path="/var/lib/kubelet/pods/d4b8747a-af4d-42a8-8caf-5ea8b7ff06d9/volumes" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.988663 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6889a88-68ef-4bf3-9e7e-78c6d84785ae" path="/var/lib/kubelet/pods/d6889a88-68ef-4bf3-9e7e-78c6d84785ae/volumes" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.992496 5025 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e38deef2-87ec-483e-84dd-e1ef8822202b" path="/var/lib/kubelet/pods/e38deef2-87ec-483e-84dd-e1ef8822202b/volumes" Oct 07 08:38:17 crc kubenswrapper[5025]: I1007 08:38:17.994427 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8ac1818-6553-400f-91be-19b032aae626" path="/var/lib/kubelet/pods/e8ac1818-6553-400f-91be-19b032aae626/volumes" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.042694 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2k27\" (UniqueName: \"kubernetes.io/projected/4211d4e2-eae9-4754-9bd1-42197c704bc6-kube-api-access-f2k27\") pod \"keystone1b46-account-delete-sszts\" (UID: \"4211d4e2-eae9-4754-9bd1-42197c704bc6\") " pod="openstack/keystone1b46-account-delete-sszts" Oct 07 08:38:18 crc kubenswrapper[5025]: E1007 08:38:18.051258 5025 projected.go:194] Error preparing data for projected volume kube-api-access-f2k27 for pod openstack/keystone1b46-account-delete-sszts: failed to fetch token: serviceaccounts "galera-openstack" not found Oct 07 08:38:18 crc kubenswrapper[5025]: E1007 08:38:18.051329 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4211d4e2-eae9-4754-9bd1-42197c704bc6-kube-api-access-f2k27 podName:4211d4e2-eae9-4754-9bd1-42197c704bc6 nodeName:}" failed. No retries permitted until 2025-10-07 08:38:18.551309696 +0000 UTC m=+1305.360623840 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-f2k27" (UniqueName: "kubernetes.io/projected/4211d4e2-eae9-4754-9bd1-42197c704bc6-kube-api-access-f2k27") pod "keystone1b46-account-delete-sszts" (UID: "4211d4e2-eae9-4754-9bd1-42197c704bc6") : failed to fetch token: serviceaccounts "galera-openstack" not found Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.091793 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="007c50ab-22f3-4b11-a7e9-3890acbe7e03" containerName="galera" containerID="cri-o://7aac254398470b37b4db38fadefb4db89196365fd7352f5a4e2ee90ac0aa37b8" gracePeriod=30 Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.321756 5025 scope.go:117] "RemoveContainer" containerID="fea511675388812860a58e922460e16718d212bb001ceaba429f1ddc5d7e5559" Oct 07 08:38:18 crc kubenswrapper[5025]: E1007 08:38:18.322378 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fea511675388812860a58e922460e16718d212bb001ceaba429f1ddc5d7e5559\": container with ID starting with fea511675388812860a58e922460e16718d212bb001ceaba429f1ddc5d7e5559 not found: ID does not exist" containerID="fea511675388812860a58e922460e16718d212bb001ceaba429f1ddc5d7e5559" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.322431 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fea511675388812860a58e922460e16718d212bb001ceaba429f1ddc5d7e5559"} err="failed to get container status \"fea511675388812860a58e922460e16718d212bb001ceaba429f1ddc5d7e5559\": rpc error: code = NotFound desc = could not find container \"fea511675388812860a58e922460e16718d212bb001ceaba429f1ddc5d7e5559\": container with ID starting with fea511675388812860a58e922460e16718d212bb001ceaba429f1ddc5d7e5559 not found: ID does not exist" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.322464 5025 scope.go:117] 
"RemoveContainer" containerID="e424505ab43fcd0b5722f275f78973479bdce125b1c35d3c687a07c264ff792e" Oct 07 08:38:18 crc kubenswrapper[5025]: E1007 08:38:18.323690 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e424505ab43fcd0b5722f275f78973479bdce125b1c35d3c687a07c264ff792e\": container with ID starting with e424505ab43fcd0b5722f275f78973479bdce125b1c35d3c687a07c264ff792e not found: ID does not exist" containerID="e424505ab43fcd0b5722f275f78973479bdce125b1c35d3c687a07c264ff792e" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.323721 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e424505ab43fcd0b5722f275f78973479bdce125b1c35d3c687a07c264ff792e"} err="failed to get container status \"e424505ab43fcd0b5722f275f78973479bdce125b1c35d3c687a07c264ff792e\": rpc error: code = NotFound desc = could not find container \"e424505ab43fcd0b5722f275f78973479bdce125b1c35d3c687a07c264ff792e\": container with ID starting with e424505ab43fcd0b5722f275f78973479bdce125b1c35d3c687a07c264ff792e not found: ID does not exist" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.324159 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell150ed-account-delete-mqxng" event={"ID":"041675fd-b2c1-4e9c-9b05-4a2aef6d329f","Type":"ContainerDied","Data":"ce5882e38e3f83b4ffc5ad5ca3adfe9a87c6388a9986a1b8a615968c0b17f4e2"} Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.324182 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce5882e38e3f83b4ffc5ad5ca3adfe9a87c6388a9986a1b8a615968c0b17f4e2" Oct 07 08:38:18 crc kubenswrapper[5025]: E1007 08:38:18.328822 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-f2k27], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/keystone1b46-account-delete-sszts" 
podUID="4211d4e2-eae9-4754-9bd1-42197c704bc6" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.330500 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 07 08:38:18 crc kubenswrapper[5025]: E1007 08:38:18.330997 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7aac254398470b37b4db38fadefb4db89196365fd7352f5a4e2ee90ac0aa37b8" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.333366 5025 generic.go:334] "Generic (PLEG): container finished" podID="732b0bbc-f02e-4ced-9230-af95c9654b93" containerID="48e5ab1102a8da55abb1bed760339663a999e777637527039fd43637c874b375" exitCode=2 Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.333450 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"732b0bbc-f02e-4ced-9230-af95c9654b93","Type":"ContainerDied","Data":"48e5ab1102a8da55abb1bed760339663a999e777637527039fd43637c874b375"} Oct 07 08:38:18 crc kubenswrapper[5025]: E1007 08:38:18.333596 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7aac254398470b37b4db38fadefb4db89196365fd7352f5a4e2ee90ac0aa37b8" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.338137 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell150ed-account-delete-mqxng" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.338281 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement5528-account-delete-wnxjq" event={"ID":"9740b484-8952-4253-9158-17164236ffcc","Type":"ContainerDied","Data":"480767b0f2dd3058d0ede817291740e21cec62ecb058c5992a5fcb16740f255e"} Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.338302 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="480767b0f2dd3058d0ede817291740e21cec62ecb058c5992a5fcb16740f255e" Oct 07 08:38:18 crc kubenswrapper[5025]: E1007 08:38:18.341830 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7aac254398470b37b4db38fadefb4db89196365fd7352f5a4e2ee90ac0aa37b8" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Oct 07 08:38:18 crc kubenswrapper[5025]: E1007 08:38:18.341905 5025 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="007c50ab-22f3-4b11-a7e9-3890acbe7e03" containerName="galera" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.346314 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance4544-account-delete-qm7lc" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.347426 5025 generic.go:334] "Generic (PLEG): container finished" podID="988f0f87-7a01-4aac-a874-8e885b2f83cb" containerID="4055606214f6b158a47c0d68c52aea2b1ad07643ccaa8ad47dd496cc1fca1a1e" exitCode=0 Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.347568 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"988f0f87-7a01-4aac-a874-8e885b2f83cb","Type":"ContainerDied","Data":"4055606214f6b158a47c0d68c52aea2b1ad07643ccaa8ad47dd496cc1fca1a1e"} Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.363049 5025 generic.go:334] "Generic (PLEG): container finished" podID="39a4256e-db59-4623-aefb-bad0c1412bf1" containerID="40bc944b02028fd2ab780bbc8f7785ef5771a301ce586feafccde56e9109c707" exitCode=0 Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.363098 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"39a4256e-db59-4623-aefb-bad0c1412bf1","Type":"ContainerDied","Data":"40bc944b02028fd2ab780bbc8f7785ef5771a301ce586feafccde56e9109c707"} Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.369938 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron0ee4-account-delete-8cznp" event={"ID":"b956a819-1c8f-471c-a738-fe4d78fbbb98","Type":"ContainerDied","Data":"bb37185751b3f42fb7a211d088019b1a6334cba73383756d1ddfae1ff2f6b20d"} Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.371372 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb37185751b3f42fb7a211d088019b1a6334cba73383756d1ddfae1ff2f6b20d" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.427714 5025 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="4a892055-924e-4e4f-a625-b15b8a39c4bf" containerName="nova-metadata-metadata" 
probeResult="failure" output="Get \"https://10.217.0.205:8775/\": read tcp 10.217.0.2:32942->10.217.0.205:8775: read: connection reset by peer" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.428210 5025 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="4a892055-924e-4e4f-a625-b15b8a39c4bf" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.205:8775/\": read tcp 10.217.0.2:32958->10.217.0.205:8775: read: connection reset by peer" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.429516 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"5db22b1d-c7bd-4bc9-b30b-3271da7741e1","Type":"ContainerDied","Data":"c0282852894896a47589fb5aca382e189ee07ceaa5e091b0655bf7af1b48f04c"} Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.429585 5025 scope.go:117] "RemoveContainer" containerID="fe88f33ff0949c6497d1f403bf9b543b6f9b6a8feba3d54578e33f486413a22f" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.429811 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.435270 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement5528-account-delete-wnxjq" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.436586 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba0f744b-7d32-4337-be25-f4f6326aaec2","Type":"ContainerDied","Data":"a1a35e09f930c853a6483e0e5e192b2b3f584932be09beeb5bd496d456b573a1"} Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.436594 5025 generic.go:334] "Generic (PLEG): container finished" podID="ba0f744b-7d32-4337-be25-f4f6326aaec2" containerID="a1a35e09f930c853a6483e0e5e192b2b3f584932be09beeb5bd496d456b573a1" exitCode=0 Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.436633 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba0f744b-7d32-4337-be25-f4f6326aaec2","Type":"ContainerDied","Data":"7b52de27036b56d8d85f64dd500bd9547ec47d9718e92cc57aa2c682ce41fe24"} Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.436645 5025 generic.go:334] "Generic (PLEG): container finished" podID="ba0f744b-7d32-4337-be25-f4f6326aaec2" containerID="7b52de27036b56d8d85f64dd500bd9547ec47d9718e92cc57aa2c682ce41fe24" exitCode=2 Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.436679 5025 generic.go:334] "Generic (PLEG): container finished" podID="ba0f744b-7d32-4337-be25-f4f6326aaec2" containerID="0d4721c409ed563195fffda2f9874618fd3f3cc7746c05dc5decaf962347f14b" exitCode=0 Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.436737 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba0f744b-7d32-4337-be25-f4f6326aaec2","Type":"ContainerDied","Data":"0d4721c409ed563195fffda2f9874618fd3f3cc7746c05dc5decaf962347f14b"} Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.443479 5025 generic.go:334] "Generic (PLEG): container finished" podID="fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78" containerID="50b0aff959782d5056be05dce7539fc1d1b6491f125fcc23557b5dc2b17bcb0a" exitCode=0 Oct 07 
08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.443566 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78","Type":"ContainerDied","Data":"50b0aff959782d5056be05dce7539fc1d1b6491f125fcc23557b5dc2b17bcb0a"} Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.446886 5025 generic.go:334] "Generic (PLEG): container finished" podID="f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8" containerID="91ba013dc33b4d2f92059c4d4c66f039c2cb58683016a2882a32f59cc78bb607" exitCode=0 Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.446935 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-684bd87b6d-w58z5" event={"ID":"f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8","Type":"ContainerDied","Data":"91ba013dc33b4d2f92059c4d4c66f039c2cb58683016a2882a32f59cc78bb607"} Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.453160 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzndz\" (UniqueName: \"kubernetes.io/projected/041675fd-b2c1-4e9c-9b05-4a2aef6d329f-kube-api-access-pzndz\") pod \"041675fd-b2c1-4e9c-9b05-4a2aef6d329f\" (UID: \"041675fd-b2c1-4e9c-9b05-4a2aef6d329f\") " Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.453250 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/582efaa6-0e81-462a-813f-7ba1cf9c6fc2-config-data-default\") pod \"582efaa6-0e81-462a-813f-7ba1cf9c6fc2\" (UID: \"582efaa6-0e81-462a-813f-7ba1cf9c6fc2\") " Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.453292 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scnsr\" (UniqueName: \"kubernetes.io/projected/582efaa6-0e81-462a-813f-7ba1cf9c6fc2-kube-api-access-scnsr\") pod \"582efaa6-0e81-462a-813f-7ba1cf9c6fc2\" (UID: \"582efaa6-0e81-462a-813f-7ba1cf9c6fc2\") " Oct 07 08:38:18 crc kubenswrapper[5025]: 
I1007 08:38:18.453333 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance4544-account-delete-qm7lc" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.453334 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/582efaa6-0e81-462a-813f-7ba1cf9c6fc2-secrets\") pod \"582efaa6-0e81-462a-813f-7ba1cf9c6fc2\" (UID: \"582efaa6-0e81-462a-813f-7ba1cf9c6fc2\") " Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.454185 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance4544-account-delete-qm7lc" event={"ID":"28643c48-ee4c-4880-8901-9e3231c70b70","Type":"ContainerDied","Data":"2ec633ad29ddbd1247f9ed3f3272e3565e6ac3ce0a7cc1b0057d1aa17070054b"} Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.454253 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/582efaa6-0e81-462a-813f-7ba1cf9c6fc2-config-data-generated\") pod \"582efaa6-0e81-462a-813f-7ba1cf9c6fc2\" (UID: \"582efaa6-0e81-462a-813f-7ba1cf9c6fc2\") " Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.454358 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/582efaa6-0e81-462a-813f-7ba1cf9c6fc2-kolla-config\") pod \"582efaa6-0e81-462a-813f-7ba1cf9c6fc2\" (UID: \"582efaa6-0e81-462a-813f-7ba1cf9c6fc2\") " Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.454457 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"582efaa6-0e81-462a-813f-7ba1cf9c6fc2\" (UID: \"582efaa6-0e81-462a-813f-7ba1cf9c6fc2\") " Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.454490 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-zsl9p\" (UniqueName: \"kubernetes.io/projected/28643c48-ee4c-4880-8901-9e3231c70b70-kube-api-access-zsl9p\") pod \"28643c48-ee4c-4880-8901-9e3231c70b70\" (UID: \"28643c48-ee4c-4880-8901-9e3231c70b70\") " Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.454561 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bjhn\" (UniqueName: \"kubernetes.io/projected/9740b484-8952-4253-9158-17164236ffcc-kube-api-access-6bjhn\") pod \"9740b484-8952-4253-9158-17164236ffcc\" (UID: \"9740b484-8952-4253-9158-17164236ffcc\") " Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.454641 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/582efaa6-0e81-462a-813f-7ba1cf9c6fc2-galera-tls-certs\") pod \"582efaa6-0e81-462a-813f-7ba1cf9c6fc2\" (UID: \"582efaa6-0e81-462a-813f-7ba1cf9c6fc2\") " Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.454684 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/582efaa6-0e81-462a-813f-7ba1cf9c6fc2-operator-scripts\") pod \"582efaa6-0e81-462a-813f-7ba1cf9c6fc2\" (UID: \"582efaa6-0e81-462a-813f-7ba1cf9c6fc2\") " Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.454733 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/582efaa6-0e81-462a-813f-7ba1cf9c6fc2-combined-ca-bundle\") pod \"582efaa6-0e81-462a-813f-7ba1cf9c6fc2\" (UID: \"582efaa6-0e81-462a-813f-7ba1cf9c6fc2\") " Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.454835 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/582efaa6-0e81-462a-813f-7ba1cf9c6fc2-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "582efaa6-0e81-462a-813f-7ba1cf9c6fc2" (UID: 
"582efaa6-0e81-462a-813f-7ba1cf9c6fc2"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.455293 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/582efaa6-0e81-462a-813f-7ba1cf9c6fc2-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "582efaa6-0e81-462a-813f-7ba1cf9c6fc2" (UID: "582efaa6-0e81-462a-813f-7ba1cf9c6fc2"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.456041 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/582efaa6-0e81-462a-813f-7ba1cf9c6fc2-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "582efaa6-0e81-462a-813f-7ba1cf9c6fc2" (UID: "582efaa6-0e81-462a-813f-7ba1cf9c6fc2"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.459606 5025 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/582efaa6-0e81-462a-813f-7ba1cf9c6fc2-kolla-config\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.459628 5025 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/582efaa6-0e81-462a-813f-7ba1cf9c6fc2-config-data-default\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.459638 5025 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/582efaa6-0e81-462a-813f-7ba1cf9c6fc2-config-data-generated\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.459815 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/582efaa6-0e81-462a-813f-7ba1cf9c6fc2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "582efaa6-0e81-462a-813f-7ba1cf9c6fc2" (UID: "582efaa6-0e81-462a-813f-7ba1cf9c6fc2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.460037 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/582efaa6-0e81-462a-813f-7ba1cf9c6fc2-kube-api-access-scnsr" (OuterVolumeSpecName: "kube-api-access-scnsr") pod "582efaa6-0e81-462a-813f-7ba1cf9c6fc2" (UID: "582efaa6-0e81-462a-813f-7ba1cf9c6fc2"). InnerVolumeSpecName "kube-api-access-scnsr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.461108 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron0ee4-account-delete-8cznp" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.463126 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28643c48-ee4c-4880-8901-9e3231c70b70-kube-api-access-zsl9p" (OuterVolumeSpecName: "kube-api-access-zsl9p") pod "28643c48-ee4c-4880-8901-9e3231c70b70" (UID: "28643c48-ee4c-4880-8901-9e3231c70b70"). InnerVolumeSpecName "kube-api-access-zsl9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.468394 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/041675fd-b2c1-4e9c-9b05-4a2aef6d329f-kube-api-access-pzndz" (OuterVolumeSpecName: "kube-api-access-pzndz") pod "041675fd-b2c1-4e9c-9b05-4a2aef6d329f" (UID: "041675fd-b2c1-4e9c-9b05-4a2aef6d329f"). InnerVolumeSpecName "kube-api-access-pzndz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.477852 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/582efaa6-0e81-462a-813f-7ba1cf9c6fc2-secrets" (OuterVolumeSpecName: "secrets") pod "582efaa6-0e81-462a-813f-7ba1cf9c6fc2" (UID: "582efaa6-0e81-462a-813f-7ba1cf9c6fc2"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.494742 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "mysql-db") pod "582efaa6-0e81-462a-813f-7ba1cf9c6fc2" (UID: "582efaa6-0e81-462a-813f-7ba1cf9c6fc2"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.498003 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9740b484-8952-4253-9158-17164236ffcc-kube-api-access-6bjhn" (OuterVolumeSpecName: "kube-api-access-6bjhn") pod "9740b484-8952-4253-9158-17164236ffcc" (UID: "9740b484-8952-4253-9158-17164236ffcc"). InnerVolumeSpecName "kube-api-access-6bjhn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.498025 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.498086 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"582efaa6-0e81-462a-813f-7ba1cf9c6fc2","Type":"ContainerDied","Data":"c1f90c6746a1f69ff4d80247adf14f633f0bcf5b531a2c02f27a420ab2cd3136"} Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.498709 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.509059 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/582efaa6-0e81-462a-813f-7ba1cf9c6fc2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "582efaa6-0e81-462a-813f-7ba1cf9c6fc2" (UID: "582efaa6-0e81-462a-813f-7ba1cf9c6fc2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.539769 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/582efaa6-0e81-462a-813f-7ba1cf9c6fc2-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "582efaa6-0e81-462a-813f-7ba1cf9c6fc2" (UID: "582efaa6-0e81-462a-813f-7ba1cf9c6fc2"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.560965 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjh7m\" (UniqueName: \"kubernetes.io/projected/b956a819-1c8f-471c-a738-fe4d78fbbb98-kube-api-access-gjh7m\") pod \"b956a819-1c8f-471c-a738-fe4d78fbbb98\" (UID: \"b956a819-1c8f-471c-a738-fe4d78fbbb98\") " Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.561238 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2k27\" (UniqueName: \"kubernetes.io/projected/4211d4e2-eae9-4754-9bd1-42197c704bc6-kube-api-access-f2k27\") pod \"keystone1b46-account-delete-sszts\" (UID: \"4211d4e2-eae9-4754-9bd1-42197c704bc6\") " pod="openstack/keystone1b46-account-delete-sszts" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.561374 5025 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/582efaa6-0e81-462a-813f-7ba1cf9c6fc2-galera-tls-certs\") on node \"crc\" DevicePath 
\"\"" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.561387 5025 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/582efaa6-0e81-462a-813f-7ba1cf9c6fc2-operator-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.561396 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/582efaa6-0e81-462a-813f-7ba1cf9c6fc2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.561405 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzndz\" (UniqueName: \"kubernetes.io/projected/041675fd-b2c1-4e9c-9b05-4a2aef6d329f-kube-api-access-pzndz\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.561414 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scnsr\" (UniqueName: \"kubernetes.io/projected/582efaa6-0e81-462a-813f-7ba1cf9c6fc2-kube-api-access-scnsr\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.561422 5025 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/582efaa6-0e81-462a-813f-7ba1cf9c6fc2-secrets\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.561440 5025 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.561449 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsl9p\" (UniqueName: \"kubernetes.io/projected/28643c48-ee4c-4880-8901-9e3231c70b70-kube-api-access-zsl9p\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.561458 5025 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-6bjhn\" (UniqueName: \"kubernetes.io/projected/9740b484-8952-4253-9158-17164236ffcc-kube-api-access-6bjhn\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:18 crc kubenswrapper[5025]: E1007 08:38:18.566283 5025 projected.go:194] Error preparing data for projected volume kube-api-access-f2k27 for pod openstack/keystone1b46-account-delete-sszts: failed to fetch token: serviceaccounts "galera-openstack" not found Oct 07 08:38:18 crc kubenswrapper[5025]: E1007 08:38:18.566357 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4211d4e2-eae9-4754-9bd1-42197c704bc6-kube-api-access-f2k27 podName:4211d4e2-eae9-4754-9bd1-42197c704bc6 nodeName:}" failed. No retries permitted until 2025-10-07 08:38:19.566336048 +0000 UTC m=+1306.375650192 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-f2k27" (UniqueName: "kubernetes.io/projected/4211d4e2-eae9-4754-9bd1-42197c704bc6-kube-api-access-f2k27") pod "keystone1b46-account-delete-sszts" (UID: "4211d4e2-eae9-4754-9bd1-42197c704bc6") : failed to fetch token: serviceaccounts "galera-openstack" not found Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.566977 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b956a819-1c8f-471c-a738-fe4d78fbbb98-kube-api-access-gjh7m" (OuterVolumeSpecName: "kube-api-access-gjh7m") pod "b956a819-1c8f-471c-a738-fe4d78fbbb98" (UID: "b956a819-1c8f-471c-a738-fe4d78fbbb98"). InnerVolumeSpecName "kube-api-access-gjh7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.567058 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.584876 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.591206 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.597285 5025 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.599001 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.621394 5025 scope.go:117] "RemoveContainer" containerID="be990c38f8204943f8049baf27de19af16319b683aba310036475c95b583ac65" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.624985 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.664067 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.664797 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78-combined-ca-bundle\") pod \"fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78\" (UID: \"fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78\") " Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.664843 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/732b0bbc-f02e-4ced-9230-af95c9654b93-kube-state-metrics-tls-certs\") pod \"732b0bbc-f02e-4ced-9230-af95c9654b93\" (UID: \"732b0bbc-f02e-4ced-9230-af95c9654b93\") " Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.664884 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78-config-data-custom\") pod \"fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78\" (UID: \"fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78\") " Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.664915 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skp6n\" (UniqueName: \"kubernetes.io/projected/732b0bbc-f02e-4ced-9230-af95c9654b93-kube-api-access-skp6n\") pod \"732b0bbc-f02e-4ced-9230-af95c9654b93\" (UID: \"732b0bbc-f02e-4ced-9230-af95c9654b93\") " Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.664941 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78-internal-tls-certs\") pod \"fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78\" (UID: \"fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78\") " Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.664969 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78-config-data\") pod \"fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78\" (UID: \"fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78\") " Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.665137 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78-public-tls-certs\") pod \"fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78\" (UID: \"fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78\") " Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.665167 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cc9mg\" (UniqueName: \"kubernetes.io/projected/fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78-kube-api-access-cc9mg\") pod \"fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78\" (UID: \"fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78\") " Oct 07 
08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.665194 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/732b0bbc-f02e-4ced-9230-af95c9654b93-kube-state-metrics-tls-config\") pod \"732b0bbc-f02e-4ced-9230-af95c9654b93\" (UID: \"732b0bbc-f02e-4ced-9230-af95c9654b93\") " Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.665241 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78-etc-machine-id\") pod \"fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78\" (UID: \"fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78\") " Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.665263 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78-logs\") pod \"fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78\" (UID: \"fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78\") " Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.665286 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/732b0bbc-f02e-4ced-9230-af95c9654b93-combined-ca-bundle\") pod \"732b0bbc-f02e-4ced-9230-af95c9654b93\" (UID: \"732b0bbc-f02e-4ced-9230-af95c9654b93\") " Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.665322 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78-scripts\") pod \"fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78\" (UID: \"fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78\") " Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.665753 5025 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath 
\"\"" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.665775 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjh7m\" (UniqueName: \"kubernetes.io/projected/b956a819-1c8f-471c-a738-fe4d78fbbb98-kube-api-access-gjh7m\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.674769 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78-scripts" (OuterVolumeSpecName: "scripts") pod "fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78" (UID: "fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.674844 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78" (UID: "fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.679653 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78-logs" (OuterVolumeSpecName: "logs") pod "fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78" (UID: "fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.681129 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.685795 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78-kube-api-access-cc9mg" (OuterVolumeSpecName: "kube-api-access-cc9mg") pod "fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78" (UID: "fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78"). InnerVolumeSpecName "kube-api-access-cc9mg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.710268 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/732b0bbc-f02e-4ced-9230-af95c9654b93-kube-api-access-skp6n" (OuterVolumeSpecName: "kube-api-access-skp6n") pod "732b0bbc-f02e-4ced-9230-af95c9654b93" (UID: "732b0bbc-f02e-4ced-9230-af95c9654b93"). InnerVolumeSpecName "kube-api-access-skp6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.726737 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78" (UID: "fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.738281 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/732b0bbc-f02e-4ced-9230-af95c9654b93-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "732b0bbc-f02e-4ced-9230-af95c9654b93" (UID: "732b0bbc-f02e-4ced-9230-af95c9654b93"). InnerVolumeSpecName "kube-state-metrics-tls-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.758614 5025 scope.go:117] "RemoveContainer" containerID="681d0f55b346cbe735938996d113e2740ed0c5031feeac188e0b54605836e911" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.766513 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/988f0f87-7a01-4aac-a874-8e885b2f83cb-httpd-run\") pod \"988f0f87-7a01-4aac-a874-8e885b2f83cb\" (UID: \"988f0f87-7a01-4aac-a874-8e885b2f83cb\") " Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.766580 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/988f0f87-7a01-4aac-a874-8e885b2f83cb-logs\") pod \"988f0f87-7a01-4aac-a874-8e885b2f83cb\" (UID: \"988f0f87-7a01-4aac-a874-8e885b2f83cb\") " Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.766606 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prpb9\" (UniqueName: \"kubernetes.io/projected/988f0f87-7a01-4aac-a874-8e885b2f83cb-kube-api-access-prpb9\") pod \"988f0f87-7a01-4aac-a874-8e885b2f83cb\" (UID: \"988f0f87-7a01-4aac-a874-8e885b2f83cb\") " Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.766644 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/988f0f87-7a01-4aac-a874-8e885b2f83cb-config-data\") pod \"988f0f87-7a01-4aac-a874-8e885b2f83cb\" (UID: \"988f0f87-7a01-4aac-a874-8e885b2f83cb\") " Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.766689 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"988f0f87-7a01-4aac-a874-8e885b2f83cb\" (UID: \"988f0f87-7a01-4aac-a874-8e885b2f83cb\") " Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 
08:38:18.766725 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/988f0f87-7a01-4aac-a874-8e885b2f83cb-scripts\") pod \"988f0f87-7a01-4aac-a874-8e885b2f83cb\" (UID: \"988f0f87-7a01-4aac-a874-8e885b2f83cb\") " Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.766807 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/988f0f87-7a01-4aac-a874-8e885b2f83cb-internal-tls-certs\") pod \"988f0f87-7a01-4aac-a874-8e885b2f83cb\" (UID: \"988f0f87-7a01-4aac-a874-8e885b2f83cb\") " Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.766911 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/988f0f87-7a01-4aac-a874-8e885b2f83cb-combined-ca-bundle\") pod \"988f0f87-7a01-4aac-a874-8e885b2f83cb\" (UID: \"988f0f87-7a01-4aac-a874-8e885b2f83cb\") " Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.767325 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/988f0f87-7a01-4aac-a874-8e885b2f83cb-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "988f0f87-7a01-4aac-a874-8e885b2f83cb" (UID: "988f0f87-7a01-4aac-a874-8e885b2f83cb"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.768250 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/988f0f87-7a01-4aac-a874-8e885b2f83cb-logs" (OuterVolumeSpecName: "logs") pod "988f0f87-7a01-4aac-a874-8e885b2f83cb" (UID: "988f0f87-7a01-4aac-a874-8e885b2f83cb"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.769745 5025 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/memcached-0" podUID="19d1598f-6725-40cd-99bd-5ee1bf699225" containerName="memcached" probeResult="failure" output="dial tcp 10.217.0.107:11211: connect: connection refused" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.772465 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "988f0f87-7a01-4aac-a874-8e885b2f83cb" (UID: "988f0f87-7a01-4aac-a874-8e885b2f83cb"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.792299 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/988f0f87-7a01-4aac-a874-8e885b2f83cb-scripts" (OuterVolumeSpecName: "scripts") pod "988f0f87-7a01-4aac-a874-8e885b2f83cb" (UID: "988f0f87-7a01-4aac-a874-8e885b2f83cb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.793396 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78" (UID: "fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.793684 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78-config-data" (OuterVolumeSpecName: "config-data") pod "fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78" (UID: "fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.793952 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cc9mg\" (UniqueName: \"kubernetes.io/projected/fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78-kube-api-access-cc9mg\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.793972 5025 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/732b0bbc-f02e-4ced-9230-af95c9654b93-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.793986 5025 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.793998 5025 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78-logs\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.794008 5025 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.794017 5025 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/988f0f87-7a01-4aac-a874-8e885b2f83cb-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.794025 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:18 crc 
kubenswrapper[5025]: I1007 08:38:18.794033 5025 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/988f0f87-7a01-4aac-a874-8e885b2f83cb-logs\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.794042 5025 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.794065 5025 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.794074 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skp6n\" (UniqueName: \"kubernetes.io/projected/732b0bbc-f02e-4ced-9230-af95c9654b93-kube-api-access-skp6n\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.794086 5025 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.794095 5025 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/988f0f87-7a01-4aac-a874-8e885b2f83cb-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.806494 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/732b0bbc-f02e-4ced-9230-af95c9654b93-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "732b0bbc-f02e-4ced-9230-af95c9654b93" (UID: "732b0bbc-f02e-4ced-9230-af95c9654b93"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.826471 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance4544-account-delete-qm7lc"] Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.832346 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/988f0f87-7a01-4aac-a874-8e885b2f83cb-kube-api-access-prpb9" (OuterVolumeSpecName: "kube-api-access-prpb9") pod "988f0f87-7a01-4aac-a874-8e885b2f83cb" (UID: "988f0f87-7a01-4aac-a874-8e885b2f83cb"). InnerVolumeSpecName "kube-api-access-prpb9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.843186 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance4544-account-delete-qm7lc"] Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.896229 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/732b0bbc-f02e-4ced-9230-af95c9654b93-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.896599 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prpb9\" (UniqueName: \"kubernetes.io/projected/988f0f87-7a01-4aac-a874-8e885b2f83cb-kube-api-access-prpb9\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.927861 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.971272 5025 scope.go:117] "RemoveContainer" containerID="186337599c994f3ad1b9bcddc6d6042bacd8e770be7e70f367b17c08ed07fea6" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.980138 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.985695 5025 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/732b0bbc-f02e-4ced-9230-af95c9654b93-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "732b0bbc-f02e-4ced-9230-af95c9654b93" (UID: "732b0bbc-f02e-4ced-9230-af95c9654b93"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:18 crc kubenswrapper[5025]: I1007 08:38:18.998629 5025 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/732b0bbc-f02e-4ced-9230-af95c9654b93-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.022465 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/988f0f87-7a01-4aac-a874-8e885b2f83cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "988f0f87-7a01-4aac-a874-8e885b2f83cb" (UID: "988f0f87-7a01-4aac-a874-8e885b2f83cb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.100426 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/988f0f87-7a01-4aac-a874-8e885b2f83cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.133940 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78" (UID: "fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.142783 5025 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.164688 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/988f0f87-7a01-4aac-a874-8e885b2f83cb-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "988f0f87-7a01-4aac-a874-8e885b2f83cb" (UID: "988f0f87-7a01-4aac-a874-8e885b2f83cb"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.171706 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/988f0f87-7a01-4aac-a874-8e885b2f83cb-config-data" (OuterVolumeSpecName: "config-data") pod "988f0f87-7a01-4aac-a874-8e885b2f83cb" (UID: "988f0f87-7a01-4aac-a874-8e885b2f83cb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.190969 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78" (UID: "fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.202917 5025 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/988f0f87-7a01-4aac-a874-8e885b2f83cb-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.202968 5025 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.202980 5025 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.202989 5025 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/988f0f87-7a01-4aac-a874-8e885b2f83cb-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.202998 5025 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:19 crc kubenswrapper[5025]: E1007 08:38:19.314332 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8b00f65fe97b0c9eded824f24e8ddfa13e3e54ab7d2347fbe976f327251fe700" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 07 08:38:19 crc kubenswrapper[5025]: E1007 08:38:19.318420 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , 
stderr: , exit code -1" containerID="8b00f65fe97b0c9eded824f24e8ddfa13e3e54ab7d2347fbe976f327251fe700" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 07 08:38:19 crc kubenswrapper[5025]: E1007 08:38:19.320220 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8b00f65fe97b0c9eded824f24e8ddfa13e3e54ab7d2347fbe976f327251fe700" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 07 08:38:19 crc kubenswrapper[5025]: E1007 08:38:19.320256 5025 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="8691f972-205f-470b-b300-40c32106704b" containerName="nova-scheduler-scheduler" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.375914 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 08:38:19 crc kubenswrapper[5025]: E1007 08:38:19.432451 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b49c2c24aaa3c06dfaa088d160cbeffb506d46373bed58aa7224f155e0405e49 is running failed: container process not found" containerID="b49c2c24aaa3c06dfaa088d160cbeffb506d46373bed58aa7224f155e0405e49" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 07 08:38:19 crc kubenswrapper[5025]: E1007 08:38:19.432988 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b49c2c24aaa3c06dfaa088d160cbeffb506d46373bed58aa7224f155e0405e49 is running failed: container process not found" containerID="b49c2c24aaa3c06dfaa088d160cbeffb506d46373bed58aa7224f155e0405e49" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 07 08:38:19 crc kubenswrapper[5025]: E1007 08:38:19.433270 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b49c2c24aaa3c06dfaa088d160cbeffb506d46373bed58aa7224f155e0405e49 is running failed: container process not found" containerID="b49c2c24aaa3c06dfaa088d160cbeffb506d46373bed58aa7224f155e0405e49" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 07 08:38:19 crc kubenswrapper[5025]: E1007 08:38:19.433304 5025 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b49c2c24aaa3c06dfaa088d160cbeffb506d46373bed58aa7224f155e0405e49 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-lms8w" podUID="76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b" containerName="ovsdb-server" Oct 07 08:38:19 crc kubenswrapper[5025]: E1007 
08:38:19.433835 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0929ddc4d9ba4610db985a4406a4161fc27bb318c68977125005e9b777d1e0ac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 07 08:38:19 crc kubenswrapper[5025]: E1007 08:38:19.435377 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0929ddc4d9ba4610db985a4406a4161fc27bb318c68977125005e9b777d1e0ac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.453661 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-684bd87b6d-w58z5" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.456766 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinderb478-account-delete-p6rj5" Oct 07 08:38:19 crc kubenswrapper[5025]: E1007 08:38:19.460967 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0929ddc4d9ba4610db985a4406a4161fc27bb318c68977125005e9b777d1e0ac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 07 08:38:19 crc kubenswrapper[5025]: E1007 08:38:19.461027 5025 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-lms8w" podUID="76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b" containerName="ovs-vswitchd" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.485050 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-56bb8cc8-59x6x" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.485205 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.485266 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.500932 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.512828 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8-internal-tls-certs\") pod \"f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8\" (UID: \"f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8\") " Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.512867 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8-logs\") pod \"f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8\" (UID: \"f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8\") " Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.512896 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnpd7\" (UniqueName: \"kubernetes.io/projected/c87571c0-a1b9-4187-a60c-dd66d9fbb301-kube-api-access-tnpd7\") pod \"c87571c0-a1b9-4187-a60c-dd66d9fbb301\" (UID: \"c87571c0-a1b9-4187-a60c-dd66d9fbb301\") " Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.512916 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8-public-tls-certs\") pod \"f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8\" (UID: \"f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8\") " Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.512941 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8-combined-ca-bundle\") pod \"f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8\" (UID: \"f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8\") " Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.513038 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/39a4256e-db59-4623-aefb-bad0c1412bf1-public-tls-certs\") pod \"39a4256e-db59-4623-aefb-bad0c1412bf1\" (UID: \"39a4256e-db59-4623-aefb-bad0c1412bf1\") " Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.513054 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39a4256e-db59-4623-aefb-bad0c1412bf1-logs\") pod \"39a4256e-db59-4623-aefb-bad0c1412bf1\" (UID: \"39a4256e-db59-4623-aefb-bad0c1412bf1\") " Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.513077 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39a4256e-db59-4623-aefb-bad0c1412bf1-combined-ca-bundle\") pod \"39a4256e-db59-4623-aefb-bad0c1412bf1\" (UID: \"39a4256e-db59-4623-aefb-bad0c1412bf1\") " Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.513107 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"39a4256e-db59-4623-aefb-bad0c1412bf1\" (UID: \"39a4256e-db59-4623-aefb-bad0c1412bf1\") " Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.513163 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8-config-data\") pod \"f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8\" (UID: \"f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8\") " Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.513204 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/39a4256e-db59-4623-aefb-bad0c1412bf1-httpd-run\") pod \"39a4256e-db59-4623-aefb-bad0c1412bf1\" (UID: \"39a4256e-db59-4623-aefb-bad0c1412bf1\") " Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.513232 5025 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-9dfpm\" (UniqueName: \"kubernetes.io/projected/39a4256e-db59-4623-aefb-bad0c1412bf1-kube-api-access-9dfpm\") pod \"39a4256e-db59-4623-aefb-bad0c1412bf1\" (UID: \"39a4256e-db59-4623-aefb-bad0c1412bf1\") " Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.513264 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xgrg\" (UniqueName: \"kubernetes.io/projected/f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8-kube-api-access-4xgrg\") pod \"f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8\" (UID: \"f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8\") " Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.513292 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39a4256e-db59-4623-aefb-bad0c1412bf1-scripts\") pod \"39a4256e-db59-4623-aefb-bad0c1412bf1\" (UID: \"39a4256e-db59-4623-aefb-bad0c1412bf1\") " Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.513325 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8-scripts\") pod \"f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8\" (UID: \"f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8\") " Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.513348 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39a4256e-db59-4623-aefb-bad0c1412bf1-config-data\") pod \"39a4256e-db59-4623-aefb-bad0c1412bf1\" (UID: \"39a4256e-db59-4623-aefb-bad0c1412bf1\") " Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.513770 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39a4256e-db59-4623-aefb-bad0c1412bf1-logs" (OuterVolumeSpecName: "logs") pod "39a4256e-db59-4623-aefb-bad0c1412bf1" (UID: "39a4256e-db59-4623-aefb-bad0c1412bf1"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.516211 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39a4256e-db59-4623-aefb-bad0c1412bf1-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "39a4256e-db59-4623-aefb-bad0c1412bf1" (UID: "39a4256e-db59-4623-aefb-bad0c1412bf1"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.522900 5025 generic.go:334] "Generic (PLEG): container finished" podID="ba0f744b-7d32-4337-be25-f4f6326aaec2" containerID="a4937f814a2ec01bb17a146ab80071941530367ba22e6ebb3a6adb3bf515dd0a" exitCode=0 Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.523007 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba0f744b-7d32-4337-be25-f4f6326aaec2","Type":"ContainerDied","Data":"a4937f814a2ec01bb17a146ab80071941530367ba22e6ebb3a6adb3bf515dd0a"} Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.529183 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8-kube-api-access-4xgrg" (OuterVolumeSpecName: "kube-api-access-4xgrg") pod "f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8" (UID: "f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8"). InnerVolumeSpecName "kube-api-access-4xgrg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.529555 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c87571c0-a1b9-4187-a60c-dd66d9fbb301-kube-api-access-tnpd7" (OuterVolumeSpecName: "kube-api-access-tnpd7") pod "c87571c0-a1b9-4187-a60c-dd66d9fbb301" (UID: "c87571c0-a1b9-4187-a60c-dd66d9fbb301"). InnerVolumeSpecName "kube-api-access-tnpd7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.530092 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xgrg\" (UniqueName: \"kubernetes.io/projected/f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8-kube-api-access-4xgrg\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.530112 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnpd7\" (UniqueName: \"kubernetes.io/projected/c87571c0-a1b9-4187-a60c-dd66d9fbb301-kube-api-access-tnpd7\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.530146 5025 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39a4256e-db59-4623-aefb-bad0c1412bf1-logs\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.530158 5025 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/39a4256e-db59-4623-aefb-bad0c1412bf1-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.531389 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "39a4256e-db59-4623-aefb-bad0c1412bf1" (UID: "39a4256e-db59-4623-aefb-bad0c1412bf1"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.532848 5025 generic.go:334] "Generic (PLEG): container finished" podID="4a892055-924e-4e4f-a625-b15b8a39c4bf" containerID="c9071377ba26d8b28dd8da25bf7997e2b2662f1fc5537d0f47a5f29c16759ff5" exitCode=0 Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.533037 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.532976 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4a892055-924e-4e4f-a625-b15b8a39c4bf","Type":"ContainerDied","Data":"c9071377ba26d8b28dd8da25bf7997e2b2662f1fc5537d0f47a5f29c16759ff5"} Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.533134 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4a892055-924e-4e4f-a625-b15b8a39c4bf","Type":"ContainerDied","Data":"31a1770d51ee4ed552c5bf90847456f2b0e0c8dd3bcd6e05a7d9136c8e1ef3ff"} Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.533157 5025 scope.go:117] "RemoveContainer" containerID="c9071377ba26d8b28dd8da25bf7997e2b2662f1fc5537d0f47a5f29c16759ff5" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.534927 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39a4256e-db59-4623-aefb-bad0c1412bf1-scripts" (OuterVolumeSpecName: "scripts") pod "39a4256e-db59-4623-aefb-bad0c1412bf1" (UID: "39a4256e-db59-4623-aefb-bad0c1412bf1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.543436 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"988f0f87-7a01-4aac-a874-8e885b2f83cb","Type":"ContainerDied","Data":"8c45ebf433ae81889e822f3639080aabf9b0325d083969c30181a29c8c99a405"} Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.543591 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.543776 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39a4256e-db59-4623-aefb-bad0c1412bf1-kube-api-access-9dfpm" (OuterVolumeSpecName: "kube-api-access-9dfpm") pod "39a4256e-db59-4623-aefb-bad0c1412bf1" (UID: "39a4256e-db59-4623-aefb-bad0c1412bf1"). InnerVolumeSpecName "kube-api-access-9dfpm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.553621 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8-logs" (OuterVolumeSpecName: "logs") pod "f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8" (UID: "f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.555862 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8-scripts" (OuterVolumeSpecName: "scripts") pod "f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8" (UID: "f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.556827 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"39a4256e-db59-4623-aefb-bad0c1412bf1","Type":"ContainerDied","Data":"9278e0622273f09878e7b42f19da50e6c31c60d114fabc474824e31ce4834620"} Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.556936 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.563287 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"732b0bbc-f02e-4ced-9230-af95c9654b93","Type":"ContainerDied","Data":"0cf91794b54f425de7d8916e93b107a79f49429b97a38621de3fc575a86d9fb9"} Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.563376 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.569517 5025 scope.go:117] "RemoveContainer" containerID="793b55f252562c9ec4f26d6283e230540751d4e2f67aa0c95732eac37fff4ef0" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.576108 5025 generic.go:334] "Generic (PLEG): container finished" podID="6a8356fc-f4bd-4853-a3f3-0d44ab20612b" containerID="1d466d120650f5aa29db6aa35c48e75103701a48199f9d6ad44171efa028e65a" exitCode=0 Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.576185 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7c8774d5b7-qj8g5" event={"ID":"6a8356fc-f4bd-4853-a3f3-0d44ab20612b","Type":"ContainerDied","Data":"1d466d120650f5aa29db6aa35c48e75103701a48199f9d6ad44171efa028e65a"} Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.584051 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78","Type":"ContainerDied","Data":"6fac0f52246aa5d51796f306c60f390b41f93a5103e192574855e1fc4e5665ce"} Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.584158 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.611178 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-684bd87b6d-w58z5" event={"ID":"f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8","Type":"ContainerDied","Data":"eefef5616739121371aad595359fd3442cec2ea8a836a569004f8026d54f8d08"} Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.611245 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-684bd87b6d-w58z5" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.621681 5025 generic.go:334] "Generic (PLEG): container finished" podID="51f36c18-61cd-43d1-98a6-b569197c9382" containerID="541d0073c514c96dd102a8451f807366e43a748b3da042e4f4deb995f526e4db" exitCode=0 Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.621763 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5495b78bc8-wbf5t" event={"ID":"51f36c18-61cd-43d1-98a6-b569197c9382","Type":"ContainerDied","Data":"541d0073c514c96dd102a8451f807366e43a748b3da042e4f4deb995f526e4db"} Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.623982 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39a4256e-db59-4623-aefb-bad0c1412bf1-config-data" (OuterVolumeSpecName: "config-data") pod "39a4256e-db59-4623-aefb-bad0c1412bf1" (UID: "39a4256e-db59-4623-aefb-bad0c1412bf1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.629436 5025 generic.go:334] "Generic (PLEG): container finished" podID="72a63b89-4f79-41a7-9e9a-e3c464d015a4" containerID="049eca553f3f87ee8bd4b31b64602c569fc7bd0383ab4e766e48144b4bd7fa38" exitCode=0 Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.629519 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.629574 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"72a63b89-4f79-41a7-9e9a-e3c464d015a4","Type":"ContainerDied","Data":"049eca553f3f87ee8bd4b31b64602c569fc7bd0383ab4e766e48144b4bd7fa38"} Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.640745 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.631147 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/72a63b89-4f79-41a7-9e9a-e3c464d015a4-public-tls-certs\") pod \"72a63b89-4f79-41a7-9e9a-e3c464d015a4\" (UID: \"72a63b89-4f79-41a7-9e9a-e3c464d015a4\") " Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.642475 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72a63b89-4f79-41a7-9e9a-e3c464d015a4-config-data\") pod \"72a63b89-4f79-41a7-9e9a-e3c464d015a4\" (UID: \"72a63b89-4f79-41a7-9e9a-e3c464d015a4\") " Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.642508 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28nn8\" (UniqueName: \"kubernetes.io/projected/19d1598f-6725-40cd-99bd-5ee1bf699225-kube-api-access-28nn8\") pod \"19d1598f-6725-40cd-99bd-5ee1bf699225\" (UID: \"19d1598f-6725-40cd-99bd-5ee1bf699225\") " Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.642530 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a892055-924e-4e4f-a625-b15b8a39c4bf-nova-metadata-tls-certs\") pod \"4a892055-924e-4e4f-a625-b15b8a39c4bf\" (UID: \"4a892055-924e-4e4f-a625-b15b8a39c4bf\") " Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.642577 5025 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f66554ef-2742-4c2b-a6e4-58b8377f03fa-logs\") pod \"f66554ef-2742-4c2b-a6e4-58b8377f03fa\" (UID: \"f66554ef-2742-4c2b-a6e4-58b8377f03fa\") " Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.642604 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f66554ef-2742-4c2b-a6e4-58b8377f03fa-config-data-custom\") pod \"f66554ef-2742-4c2b-a6e4-58b8377f03fa\" (UID: \"f66554ef-2742-4c2b-a6e4-58b8377f03fa\") " Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.642988 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f66554ef-2742-4c2b-a6e4-58b8377f03fa-public-tls-certs\") pod \"f66554ef-2742-4c2b-a6e4-58b8377f03fa\" (UID: \"f66554ef-2742-4c2b-a6e4-58b8377f03fa\") " Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.643018 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a892055-924e-4e4f-a625-b15b8a39c4bf-combined-ca-bundle\") pod \"4a892055-924e-4e4f-a625-b15b8a39c4bf\" (UID: \"4a892055-924e-4e4f-a625-b15b8a39c4bf\") " Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.643049 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99rzk\" (UniqueName: \"kubernetes.io/projected/4a892055-924e-4e4f-a625-b15b8a39c4bf-kube-api-access-99rzk\") pod \"4a892055-924e-4e4f-a625-b15b8a39c4bf\" (UID: \"4a892055-924e-4e4f-a625-b15b8a39c4bf\") " Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.643095 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19d1598f-6725-40cd-99bd-5ee1bf699225-combined-ca-bundle\") pod 
\"19d1598f-6725-40cd-99bd-5ee1bf699225\" (UID: \"19d1598f-6725-40cd-99bd-5ee1bf699225\") " Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.643168 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a892055-924e-4e4f-a625-b15b8a39c4bf-config-data\") pod \"4a892055-924e-4e4f-a625-b15b8a39c4bf\" (UID: \"4a892055-924e-4e4f-a625-b15b8a39c4bf\") " Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.643192 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/19d1598f-6725-40cd-99bd-5ee1bf699225-kolla-config\") pod \"19d1598f-6725-40cd-99bd-5ee1bf699225\" (UID: \"19d1598f-6725-40cd-99bd-5ee1bf699225\") " Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.643247 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmbkg\" (UniqueName: \"kubernetes.io/projected/f66554ef-2742-4c2b-a6e4-58b8377f03fa-kube-api-access-wmbkg\") pod \"f66554ef-2742-4c2b-a6e4-58b8377f03fa\" (UID: \"f66554ef-2742-4c2b-a6e4-58b8377f03fa\") " Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.643276 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/19d1598f-6725-40cd-99bd-5ee1bf699225-memcached-tls-certs\") pod \"19d1598f-6725-40cd-99bd-5ee1bf699225\" (UID: \"19d1598f-6725-40cd-99bd-5ee1bf699225\") " Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.643318 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19d1598f-6725-40cd-99bd-5ee1bf699225-config-data\") pod \"19d1598f-6725-40cd-99bd-5ee1bf699225\" (UID: \"19d1598f-6725-40cd-99bd-5ee1bf699225\") " Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.643385 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f66554ef-2742-4c2b-a6e4-58b8377f03fa-config-data\") pod \"f66554ef-2742-4c2b-a6e4-58b8377f03fa\" (UID: \"f66554ef-2742-4c2b-a6e4-58b8377f03fa\") " Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.643409 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72a63b89-4f79-41a7-9e9a-e3c464d015a4-combined-ca-bundle\") pod \"72a63b89-4f79-41a7-9e9a-e3c464d015a4\" (UID: \"72a63b89-4f79-41a7-9e9a-e3c464d015a4\") " Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.643427 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qn27\" (UniqueName: \"kubernetes.io/projected/72a63b89-4f79-41a7-9e9a-e3c464d015a4-kube-api-access-9qn27\") pod \"72a63b89-4f79-41a7-9e9a-e3c464d015a4\" (UID: \"72a63b89-4f79-41a7-9e9a-e3c464d015a4\") " Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.643461 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a892055-924e-4e4f-a625-b15b8a39c4bf-logs\") pod \"4a892055-924e-4e4f-a625-b15b8a39c4bf\" (UID: \"4a892055-924e-4e4f-a625-b15b8a39c4bf\") " Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.643867 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f66554ef-2742-4c2b-a6e4-58b8377f03fa-combined-ca-bundle\") pod \"f66554ef-2742-4c2b-a6e4-58b8377f03fa\" (UID: \"f66554ef-2742-4c2b-a6e4-58b8377f03fa\") " Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.643893 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72a63b89-4f79-41a7-9e9a-e3c464d015a4-logs\") pod \"72a63b89-4f79-41a7-9e9a-e3c464d015a4\" (UID: \"72a63b89-4f79-41a7-9e9a-e3c464d015a4\") " Oct 07 08:38:19 crc 
kubenswrapper[5025]: I1007 08:38:19.643920 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/72a63b89-4f79-41a7-9e9a-e3c464d015a4-internal-tls-certs\") pod \"72a63b89-4f79-41a7-9e9a-e3c464d015a4\" (UID: \"72a63b89-4f79-41a7-9e9a-e3c464d015a4\") " Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.643938 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f66554ef-2742-4c2b-a6e4-58b8377f03fa-internal-tls-certs\") pod \"f66554ef-2742-4c2b-a6e4-58b8377f03fa\" (UID: \"f66554ef-2742-4c2b-a6e4-58b8377f03fa\") " Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.644324 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"72a63b89-4f79-41a7-9e9a-e3c464d015a4","Type":"ContainerDied","Data":"8c50e11b253e4d9a2871b39f640867158387c82511f814263844b17932593ce1"} Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.652363 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinderb478-account-delete-p6rj5" event={"ID":"c87571c0-a1b9-4187-a60c-dd66d9fbb301","Type":"ContainerDied","Data":"1628308fe842ebe329ac5dc4f3807a63f75b4e21c43dc2faafeb1c8fd7665e62"} Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.652398 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1628308fe842ebe329ac5dc4f3807a63f75b4e21c43dc2faafeb1c8fd7665e62" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.645615 5025 scope.go:117] "RemoveContainer" containerID="c9071377ba26d8b28dd8da25bf7997e2b2662f1fc5537d0f47a5f29c16759ff5" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.646105 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f66554ef-2742-4c2b-a6e4-58b8377f03fa-logs" (OuterVolumeSpecName: "logs") pod "f66554ef-2742-4c2b-a6e4-58b8377f03fa" (UID: 
"f66554ef-2742-4c2b-a6e4-58b8377f03fa"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:38:19 crc kubenswrapper[5025]: E1007 08:38:19.648666 5025 projected.go:194] Error preparing data for projected volume kube-api-access-f2k27 for pod openstack/keystone1b46-account-delete-sszts: failed to fetch token: serviceaccounts "galera-openstack" not found Oct 07 08:38:19 crc kubenswrapper[5025]: E1007 08:38:19.652581 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4211d4e2-eae9-4754-9bd1-42197c704bc6-kube-api-access-f2k27 podName:4211d4e2-eae9-4754-9bd1-42197c704bc6 nodeName:}" failed. No retries permitted until 2025-10-07 08:38:21.652562782 +0000 UTC m=+1308.461876926 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-f2k27" (UniqueName: "kubernetes.io/projected/4211d4e2-eae9-4754-9bd1-42197c704bc6-kube-api-access-f2k27") pod "keystone1b46-account-delete-sszts" (UID: "4211d4e2-eae9-4754-9bd1-42197c704bc6") : failed to fetch token: serviceaccounts "galera-openstack" not found Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.644659 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2k27\" (UniqueName: \"kubernetes.io/projected/4211d4e2-eae9-4754-9bd1-42197c704bc6-kube-api-access-f2k27\") pod \"keystone1b46-account-delete-sszts\" (UID: \"4211d4e2-eae9-4754-9bd1-42197c704bc6\") " pod="openstack/keystone1b46-account-delete-sszts" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.652772 5025 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f66554ef-2742-4c2b-a6e4-58b8377f03fa-logs\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.652785 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dfpm\" (UniqueName: 
\"kubernetes.io/projected/39a4256e-db59-4623-aefb-bad0c1412bf1-kube-api-access-9dfpm\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.652795 5025 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39a4256e-db59-4623-aefb-bad0c1412bf1-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.652806 5025 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.652814 5025 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39a4256e-db59-4623-aefb-bad0c1412bf1-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.652824 5025 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8-logs\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.652844 5025 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Oct 07 08:38:19 crc kubenswrapper[5025]: E1007 08:38:19.655656 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9071377ba26d8b28dd8da25bf7997e2b2662f1fc5537d0f47a5f29c16759ff5\": container with ID starting with c9071377ba26d8b28dd8da25bf7997e2b2662f1fc5537d0f47a5f29c16759ff5 not found: ID does not exist" containerID="c9071377ba26d8b28dd8da25bf7997e2b2662f1fc5537d0f47a5f29c16759ff5" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.655693 5025 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c9071377ba26d8b28dd8da25bf7997e2b2662f1fc5537d0f47a5f29c16759ff5"} err="failed to get container status \"c9071377ba26d8b28dd8da25bf7997e2b2662f1fc5537d0f47a5f29c16759ff5\": rpc error: code = NotFound desc = could not find container \"c9071377ba26d8b28dd8da25bf7997e2b2662f1fc5537d0f47a5f29c16759ff5\": container with ID starting with c9071377ba26d8b28dd8da25bf7997e2b2662f1fc5537d0f47a5f29c16759ff5 not found: ID does not exist" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.655717 5025 scope.go:117] "RemoveContainer" containerID="793b55f252562c9ec4f26d6283e230540751d4e2f67aa0c95732eac37fff4ef0" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.655899 5025 generic.go:334] "Generic (PLEG): container finished" podID="073c6d0a-3fea-4e0b-8f8f-f58613b4bf23" containerID="8fadf8cac92c695c60339b695323e8872f1a70c0fa6297fd6b45f17e85f81906" exitCode=0 Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.655946 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"073c6d0a-3fea-4e0b-8f8f-f58613b4bf23","Type":"ContainerDied","Data":"8fadf8cac92c695c60339b695323e8872f1a70c0fa6297fd6b45f17e85f81906"} Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.651032 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a892055-924e-4e4f-a625-b15b8a39c4bf-logs" (OuterVolumeSpecName: "logs") pod "4a892055-924e-4e4f-a625-b15b8a39c4bf" (UID: "4a892055-924e-4e4f-a625-b15b8a39c4bf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.651433 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72a63b89-4f79-41a7-9e9a-e3c464d015a4-logs" (OuterVolumeSpecName: "logs") pod "72a63b89-4f79-41a7-9e9a-e3c464d015a4" (UID: "72a63b89-4f79-41a7-9e9a-e3c464d015a4"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.652296 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19d1598f-6725-40cd-99bd-5ee1bf699225-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "19d1598f-6725-40cd-99bd-5ee1bf699225" (UID: "19d1598f-6725-40cd-99bd-5ee1bf699225"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.641759 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinderb478-account-delete-p6rj5" Oct 07 08:38:19 crc kubenswrapper[5025]: E1007 08:38:19.656276 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"793b55f252562c9ec4f26d6283e230540751d4e2f67aa0c95732eac37fff4ef0\": container with ID starting with 793b55f252562c9ec4f26d6283e230540751d4e2f67aa0c95732eac37fff4ef0 not found: ID does not exist" containerID="793b55f252562c9ec4f26d6283e230540751d4e2f67aa0c95732eac37fff4ef0" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.656428 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"793b55f252562c9ec4f26d6283e230540751d4e2f67aa0c95732eac37fff4ef0"} err="failed to get container status \"793b55f252562c9ec4f26d6283e230540751d4e2f67aa0c95732eac37fff4ef0\": rpc error: code = NotFound desc = could not find container \"793b55f252562c9ec4f26d6283e230540751d4e2f67aa0c95732eac37fff4ef0\": container with ID starting with 793b55f252562c9ec4f26d6283e230540751d4e2f67aa0c95732eac37fff4ef0 not found: ID does not exist" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.656458 5025 scope.go:117] "RemoveContainer" containerID="4055606214f6b158a47c0d68c52aea2b1ad07643ccaa8ad47dd496cc1fca1a1e" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.660503 5025 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19d1598f-6725-40cd-99bd-5ee1bf699225-config-data" (OuterVolumeSpecName: "config-data") pod "19d1598f-6725-40cd-99bd-5ee1bf699225" (UID: "19d1598f-6725-40cd-99bd-5ee1bf699225"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.664331 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f66554ef-2742-4c2b-a6e4-58b8377f03fa-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f66554ef-2742-4c2b-a6e4-58b8377f03fa" (UID: "f66554ef-2742-4c2b-a6e4-58b8377f03fa"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.664365 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"19d1598f-6725-40cd-99bd-5ee1bf699225","Type":"ContainerDied","Data":"7148fbdd1c0e248299c7313ecafd5bd390337303adc23a3d97fe357b1062f6e1"} Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.664308 5025 generic.go:334] "Generic (PLEG): container finished" podID="19d1598f-6725-40cd-99bd-5ee1bf699225" containerID="7148fbdd1c0e248299c7313ecafd5bd390337303adc23a3d97fe357b1062f6e1" exitCode=0 Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.664476 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.664606 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"19d1598f-6725-40cd-99bd-5ee1bf699225","Type":"ContainerDied","Data":"306521de0744c0f811b42c62aad2c297f03a0e0caa58bd0db54e2287bd51da46"} Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.691271 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39a4256e-db59-4623-aefb-bad0c1412bf1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "39a4256e-db59-4623-aefb-bad0c1412bf1" (UID: "39a4256e-db59-4623-aefb-bad0c1412bf1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.691396 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f66554ef-2742-4c2b-a6e4-58b8377f03fa-kube-api-access-wmbkg" (OuterVolumeSpecName: "kube-api-access-wmbkg") pod "f66554ef-2742-4c2b-a6e4-58b8377f03fa" (UID: "f66554ef-2742-4c2b-a6e4-58b8377f03fa"). InnerVolumeSpecName "kube-api-access-wmbkg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.691473 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72a63b89-4f79-41a7-9e9a-e3c464d015a4-kube-api-access-9qn27" (OuterVolumeSpecName: "kube-api-access-9qn27") pod "72a63b89-4f79-41a7-9e9a-e3c464d015a4" (UID: "72a63b89-4f79-41a7-9e9a-e3c464d015a4"). InnerVolumeSpecName "kube-api-access-9qn27". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.691570 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a892055-924e-4e4f-a625-b15b8a39c4bf-kube-api-access-99rzk" (OuterVolumeSpecName: "kube-api-access-99rzk") pod "4a892055-924e-4e4f-a625-b15b8a39c4bf" (UID: "4a892055-924e-4e4f-a625-b15b8a39c4bf"). InnerVolumeSpecName "kube-api-access-99rzk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.698285 5025 generic.go:334] "Generic (PLEG): container finished" podID="f66554ef-2742-4c2b-a6e4-58b8377f03fa" containerID="7a68b397232883f8b41408f440a7388ce6e6907c314518b807a6e0906db97ca4" exitCode=0 Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.698469 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56bb8cc8-59x6x" event={"ID":"f66554ef-2742-4c2b-a6e4-58b8377f03fa","Type":"ContainerDied","Data":"7a68b397232883f8b41408f440a7388ce6e6907c314518b807a6e0906db97ca4"} Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.698497 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56bb8cc8-59x6x" event={"ID":"f66554ef-2742-4c2b-a6e4-58b8377f03fa","Type":"ContainerDied","Data":"cbb201e2bee3dd86717d8af4d515e9a36656d5e464720ad4bcf1cea8dec2987e"} Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.698750 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-56bb8cc8-59x6x" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.709818 5025 scope.go:117] "RemoveContainer" containerID="6679ef0ed68a662d16557ba47e79ca743311dae66548a6f30fab1c7702863417" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.713191 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.717973 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7c8774d5b7-qj8g5" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.718967 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell150ed-account-delete-mqxng" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.718920 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone1b46-account-delete-sszts" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.719029 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement5528-account-delete-wnxjq" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.719153 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron0ee4-account-delete-8cznp" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.730355 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19d1598f-6725-40cd-99bd-5ee1bf699225-kube-api-access-28nn8" (OuterVolumeSpecName: "kube-api-access-28nn8") pod "19d1598f-6725-40cd-99bd-5ee1bf699225" (UID: "19d1598f-6725-40cd-99bd-5ee1bf699225"). InnerVolumeSpecName "kube-api-access-28nn8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.751526 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone1b46-account-delete-sszts" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.754974 5025 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f66554ef-2742-4c2b-a6e4-58b8377f03fa-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.755016 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99rzk\" (UniqueName: \"kubernetes.io/projected/4a892055-924e-4e4f-a625-b15b8a39c4bf-kube-api-access-99rzk\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.755037 5025 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/19d1598f-6725-40cd-99bd-5ee1bf699225-kolla-config\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.755048 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmbkg\" (UniqueName: \"kubernetes.io/projected/f66554ef-2742-4c2b-a6e4-58b8377f03fa-kube-api-access-wmbkg\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.755061 5025 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19d1598f-6725-40cd-99bd-5ee1bf699225-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.755074 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qn27\" (UniqueName: \"kubernetes.io/projected/72a63b89-4f79-41a7-9e9a-e3c464d015a4-kube-api-access-9qn27\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.755090 5025 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a892055-924e-4e4f-a625-b15b8a39c4bf-logs\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:19 crc 
kubenswrapper[5025]: I1007 08:38:19.755103 5025 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72a63b89-4f79-41a7-9e9a-e3c464d015a4-logs\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.755116 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39a4256e-db59-4623-aefb-bad0c1412bf1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.755129 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28nn8\" (UniqueName: \"kubernetes.io/projected/19d1598f-6725-40cd-99bd-5ee1bf699225-kube-api-access-28nn8\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.769130 5025 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.769811 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.775894 5025 scope.go:117] "RemoveContainer" containerID="40bc944b02028fd2ab780bbc8f7785ef5771a301ce586feafccde56e9109c707" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.780920 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.824096 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.824154 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.831331 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/72a63b89-4f79-41a7-9e9a-e3c464d015a4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "72a63b89-4f79-41a7-9e9a-e3c464d015a4" (UID: "72a63b89-4f79-41a7-9e9a-e3c464d015a4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.839268 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell150ed-account-delete-mqxng"] Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.845738 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f66554ef-2742-4c2b-a6e4-58b8377f03fa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f66554ef-2742-4c2b-a6e4-58b8377f03fa" (UID: "f66554ef-2742-4c2b-a6e4-58b8377f03fa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.846285 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39a4256e-db59-4623-aefb-bad0c1412bf1-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "39a4256e-db59-4623-aefb-bad0c1412bf1" (UID: "39a4256e-db59-4623-aefb-bad0c1412bf1"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.848519 5025 scope.go:117] "RemoveContainer" containerID="577b784734dd3e48c70fc10ff3124e3ae64cd95fd597512882e35ecf5b2ba94f" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.848642 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novacell150ed-account-delete-mqxng"] Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.853263 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8-config-data" (OuterVolumeSpecName: "config-data") pod "f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8" (UID: "f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.856636 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a8356fc-f4bd-4853-a3f3-0d44ab20612b-config-data\") pod \"6a8356fc-f4bd-4853-a3f3-0d44ab20612b\" (UID: \"6a8356fc-f4bd-4853-a3f3-0d44ab20612b\") " Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.856699 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a8356fc-f4bd-4853-a3f3-0d44ab20612b-combined-ca-bundle\") pod \"6a8356fc-f4bd-4853-a3f3-0d44ab20612b\" (UID: \"6a8356fc-f4bd-4853-a3f3-0d44ab20612b\") " Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.856737 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a8356fc-f4bd-4853-a3f3-0d44ab20612b-logs\") pod \"6a8356fc-f4bd-4853-a3f3-0d44ab20612b\" (UID: \"6a8356fc-f4bd-4853-a3f3-0d44ab20612b\") " Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.856754 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6a8356fc-f4bd-4853-a3f3-0d44ab20612b-config-data-custom\") pod \"6a8356fc-f4bd-4853-a3f3-0d44ab20612b\" (UID: \"6a8356fc-f4bd-4853-a3f3-0d44ab20612b\") " Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.856860 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcrdg\" (UniqueName: \"kubernetes.io/projected/6a8356fc-f4bd-4853-a3f3-0d44ab20612b-kube-api-access-dcrdg\") pod \"6a8356fc-f4bd-4853-a3f3-0d44ab20612b\" (UID: \"6a8356fc-f4bd-4853-a3f3-0d44ab20612b\") " Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.858052 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f66554ef-2742-4c2b-a6e4-58b8377f03fa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.858073 5025 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/39a4256e-db59-4623-aefb-bad0c1412bf1-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.858082 5025 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.858092 5025 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.858101 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72a63b89-4f79-41a7-9e9a-e3c464d015a4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.859699 5025 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement5528-account-delete-wnxjq"] Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.860081 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a8356fc-f4bd-4853-a3f3-0d44ab20612b-logs" (OuterVolumeSpecName: "logs") pod "6a8356fc-f4bd-4853-a3f3-0d44ab20612b" (UID: "6a8356fc-f4bd-4853-a3f3-0d44ab20612b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.868756 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement5528-account-delete-wnxjq"] Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.870703 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a8356fc-f4bd-4853-a3f3-0d44ab20612b-kube-api-access-dcrdg" (OuterVolumeSpecName: "kube-api-access-dcrdg") pod "6a8356fc-f4bd-4853-a3f3-0d44ab20612b" (UID: "6a8356fc-f4bd-4853-a3f3-0d44ab20612b"). InnerVolumeSpecName "kube-api-access-dcrdg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.874370 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a8356fc-f4bd-4853-a3f3-0d44ab20612b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6a8356fc-f4bd-4853-a3f3-0d44ab20612b" (UID: "6a8356fc-f4bd-4853-a3f3-0d44ab20612b"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.877930 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron0ee4-account-delete-8cznp"] Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.884218 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron0ee4-account-delete-8cznp"] Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.890369 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinderb478-account-delete-p6rj5"] Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.896916 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a892055-924e-4e4f-a625-b15b8a39c4bf-config-data" (OuterVolumeSpecName: "config-data") pod "4a892055-924e-4e4f-a625-b15b8a39c4bf" (UID: "4a892055-924e-4e4f-a625-b15b8a39c4bf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.899448 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8" (UID: "f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.902143 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19d1598f-6725-40cd-99bd-5ee1bf699225-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "19d1598f-6725-40cd-99bd-5ee1bf699225" (UID: "19d1598f-6725-40cd-99bd-5ee1bf699225"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.926595 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="041675fd-b2c1-4e9c-9b05-4a2aef6d329f" path="/var/lib/kubelet/pods/041675fd-b2c1-4e9c-9b05-4a2aef6d329f/volumes" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.927106 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28643c48-ee4c-4880-8901-9e3231c70b70" path="/var/lib/kubelet/pods/28643c48-ee4c-4880-8901-9e3231c70b70/volumes" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.927821 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="582efaa6-0e81-462a-813f-7ba1cf9c6fc2" path="/var/lib/kubelet/pods/582efaa6-0e81-462a-813f-7ba1cf9c6fc2/volumes" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.928882 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5db22b1d-c7bd-4bc9-b30b-3271da7741e1" path="/var/lib/kubelet/pods/5db22b1d-c7bd-4bc9-b30b-3271da7741e1/volumes" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.929458 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="732b0bbc-f02e-4ced-9230-af95c9654b93" path="/var/lib/kubelet/pods/732b0bbc-f02e-4ced-9230-af95c9654b93/volumes" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.929952 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9740b484-8952-4253-9158-17164236ffcc" path="/var/lib/kubelet/pods/9740b484-8952-4253-9158-17164236ffcc/volumes" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.930588 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="988f0f87-7a01-4aac-a874-8e885b2f83cb" path="/var/lib/kubelet/pods/988f0f87-7a01-4aac-a874-8e885b2f83cb/volumes" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.931867 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fc34125-4736-4467-bf0e-ec3b211e6d13" 
path="/var/lib/kubelet/pods/9fc34125-4736-4467-bf0e-ec3b211e6d13/volumes" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.932441 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b956a819-1c8f-471c-a738-fe4d78fbbb98" path="/var/lib/kubelet/pods/b956a819-1c8f-471c-a738-fe4d78fbbb98/volumes" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.933375 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78" path="/var/lib/kubelet/pods/fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78/volumes" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.942912 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8" (UID: "f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.947794 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72a63b89-4f79-41a7-9e9a-e3c464d015a4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "72a63b89-4f79-41a7-9e9a-e3c464d015a4" (UID: "72a63b89-4f79-41a7-9e9a-e3c464d015a4"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.962006 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.962064 5025 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/72a63b89-4f79-41a7-9e9a-e3c464d015a4-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.962079 5025 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a8356fc-f4bd-4853-a3f3-0d44ab20612b-logs\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.962093 5025 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6a8356fc-f4bd-4853-a3f3-0d44ab20612b-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.962105 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcrdg\" (UniqueName: \"kubernetes.io/projected/6a8356fc-f4bd-4853-a3f3-0d44ab20612b-kube-api-access-dcrdg\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.962116 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19d1598f-6725-40cd-99bd-5ee1bf699225-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.962150 5025 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a892055-924e-4e4f-a625-b15b8a39c4bf-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.962164 
5025 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.974640 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a892055-924e-4e4f-a625-b15b8a39c4bf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a892055-924e-4e4f-a625-b15b8a39c4bf" (UID: "4a892055-924e-4e4f-a625-b15b8a39c4bf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.974892 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72a63b89-4f79-41a7-9e9a-e3c464d015a4-config-data" (OuterVolumeSpecName: "config-data") pod "72a63b89-4f79-41a7-9e9a-e3c464d015a4" (UID: "72a63b89-4f79-41a7-9e9a-e3c464d015a4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.981372 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f66554ef-2742-4c2b-a6e4-58b8377f03fa-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f66554ef-2742-4c2b-a6e4-58b8377f03fa" (UID: "f66554ef-2742-4c2b-a6e4-58b8377f03fa"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.989722 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19d1598f-6725-40cd-99bd-5ee1bf699225-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "19d1598f-6725-40cd-99bd-5ee1bf699225" (UID: "19d1598f-6725-40cd-99bd-5ee1bf699225"). InnerVolumeSpecName "memcached-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:19 crc kubenswrapper[5025]: I1007 08:38:19.993784 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f66554ef-2742-4c2b-a6e4-58b8377f03fa-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f66554ef-2742-4c2b-a6e4-58b8377f03fa" (UID: "f66554ef-2742-4c2b-a6e4-58b8377f03fa"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.011609 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a892055-924e-4e4f-a625-b15b8a39c4bf-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "4a892055-924e-4e4f-a625-b15b8a39c4bf" (UID: "4a892055-924e-4e4f-a625-b15b8a39c4bf"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.025756 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a8356fc-f4bd-4853-a3f3-0d44ab20612b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a8356fc-f4bd-4853-a3f3-0d44ab20612b" (UID: "6a8356fc-f4bd-4853-a3f3-0d44ab20612b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.034646 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f66554ef-2742-4c2b-a6e4-58b8377f03fa-config-data" (OuterVolumeSpecName: "config-data") pod "f66554ef-2742-4c2b-a6e4-58b8377f03fa" (UID: "f66554ef-2742-4c2b-a6e4-58b8377f03fa"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.059155 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a8356fc-f4bd-4853-a3f3-0d44ab20612b-config-data" (OuterVolumeSpecName: "config-data") pod "6a8356fc-f4bd-4853-a3f3-0d44ab20612b" (UID: "6a8356fc-f4bd-4853-a3f3-0d44ab20612b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.064029 5025 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a8356fc-f4bd-4853-a3f3-0d44ab20612b-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.064051 5025 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f66554ef-2742-4c2b-a6e4-58b8377f03fa-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.064061 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a8356fc-f4bd-4853-a3f3-0d44ab20612b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.064069 5025 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72a63b89-4f79-41a7-9e9a-e3c464d015a4-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.064077 5025 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a892055-924e-4e4f-a625-b15b8a39c4bf-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.064086 5025 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f66554ef-2742-4c2b-a6e4-58b8377f03fa-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.064094 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a892055-924e-4e4f-a625-b15b8a39c4bf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.064102 5025 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/19d1598f-6725-40cd-99bd-5ee1bf699225-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.064110 5025 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f66554ef-2742-4c2b-a6e4-58b8377f03fa-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.082882 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72a63b89-4f79-41a7-9e9a-e3c464d015a4-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "72a63b89-4f79-41a7-9e9a-e3c464d015a4" (UID: "72a63b89-4f79-41a7-9e9a-e3c464d015a4"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.085488 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8" (UID: "f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.165265 5025 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/72a63b89-4f79-41a7-9e9a-e3c464d015a4-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.165307 5025 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.178256 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinderb478-account-delete-p6rj5"] Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.187337 5025 scope.go:117] "RemoveContainer" containerID="48e5ab1102a8da55abb1bed760339663a999e777637527039fd43637c874b375" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.236698 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.246886 5025 scope.go:117] "RemoveContainer" containerID="50b0aff959782d5056be05dce7539fc1d1b6491f125fcc23557b5dc2b17bcb0a" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.255111 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.257898 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.266209 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/073c6d0a-3fea-4e0b-8f8f-f58613b4bf23-config-data\") pod \"073c6d0a-3fea-4e0b-8f8f-f58613b4bf23\" (UID: \"073c6d0a-3fea-4e0b-8f8f-f58613b4bf23\") " Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.266271 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxzgb\" (UniqueName: \"kubernetes.io/projected/073c6d0a-3fea-4e0b-8f8f-f58613b4bf23-kube-api-access-fxzgb\") pod \"073c6d0a-3fea-4e0b-8f8f-f58613b4bf23\" (UID: \"073c6d0a-3fea-4e0b-8f8f-f58613b4bf23\") " Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.266429 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/073c6d0a-3fea-4e0b-8f8f-f58613b4bf23-combined-ca-bundle\") pod \"073c6d0a-3fea-4e0b-8f8f-f58613b4bf23\" (UID: \"073c6d0a-3fea-4e0b-8f8f-f58613b4bf23\") " Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.269407 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.294458 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5495b78bc8-wbf5t" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.296753 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.310975 5025 scope.go:117] "RemoveContainer" containerID="f3589738c71b80a2b66f3651e465e35c090e223c8027cb49f147e9e055629164" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.312758 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/073c6d0a-3fea-4e0b-8f8f-f58613b4bf23-kube-api-access-fxzgb" (OuterVolumeSpecName: "kube-api-access-fxzgb") pod "073c6d0a-3fea-4e0b-8f8f-f58613b4bf23" (UID: "073c6d0a-3fea-4e0b-8f8f-f58613b4bf23"). InnerVolumeSpecName "kube-api-access-fxzgb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.317446 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/073c6d0a-3fea-4e0b-8f8f-f58613b4bf23-config-data" (OuterVolumeSpecName: "config-data") pod "073c6d0a-3fea-4e0b-8f8f-f58613b4bf23" (UID: "073c6d0a-3fea-4e0b-8f8f-f58613b4bf23"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.322584 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/073c6d0a-3fea-4e0b-8f8f-f58613b4bf23-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "073c6d0a-3fea-4e0b-8f8f-f58613b4bf23" (UID: "073c6d0a-3fea-4e0b-8f8f-f58613b4bf23"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.334922 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.353703 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.358396 5025 scope.go:117] "RemoveContainer" containerID="91ba013dc33b4d2f92059c4d4c66f039c2cb58683016a2882a32f59cc78bb607" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.368035 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba0f744b-7d32-4337-be25-f4f6326aaec2-log-httpd\") pod \"ba0f744b-7d32-4337-be25-f4f6326aaec2\" (UID: \"ba0f744b-7d32-4337-be25-f4f6326aaec2\") " Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.368076 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51f36c18-61cd-43d1-98a6-b569197c9382-combined-ca-bundle\") pod \"51f36c18-61cd-43d1-98a6-b569197c9382\" (UID: \"51f36c18-61cd-43d1-98a6-b569197c9382\") " Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.368105 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba0f744b-7d32-4337-be25-f4f6326aaec2-combined-ca-bundle\") pod \"ba0f744b-7d32-4337-be25-f4f6326aaec2\" (UID: \"ba0f744b-7d32-4337-be25-f4f6326aaec2\") " Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.368122 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/51f36c18-61cd-43d1-98a6-b569197c9382-config-data-custom\") pod \"51f36c18-61cd-43d1-98a6-b569197c9382\" (UID: \"51f36c18-61cd-43d1-98a6-b569197c9382\") " Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.368144 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51f36c18-61cd-43d1-98a6-b569197c9382-config-data\") pod \"51f36c18-61cd-43d1-98a6-b569197c9382\" (UID: 
\"51f36c18-61cd-43d1-98a6-b569197c9382\") " Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.368175 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2p2m\" (UniqueName: \"kubernetes.io/projected/8691f972-205f-470b-b300-40c32106704b-kube-api-access-v2p2m\") pod \"8691f972-205f-470b-b300-40c32106704b\" (UID: \"8691f972-205f-470b-b300-40c32106704b\") " Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.368218 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8691f972-205f-470b-b300-40c32106704b-config-data\") pod \"8691f972-205f-470b-b300-40c32106704b\" (UID: \"8691f972-205f-470b-b300-40c32106704b\") " Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.368250 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba0f744b-7d32-4337-be25-f4f6326aaec2-ceilometer-tls-certs\") pod \"ba0f744b-7d32-4337-be25-f4f6326aaec2\" (UID: \"ba0f744b-7d32-4337-be25-f4f6326aaec2\") " Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.368293 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba0f744b-7d32-4337-be25-f4f6326aaec2-config-data\") pod \"ba0f744b-7d32-4337-be25-f4f6326aaec2\" (UID: \"ba0f744b-7d32-4337-be25-f4f6326aaec2\") " Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.368310 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba0f744b-7d32-4337-be25-f4f6326aaec2-scripts\") pod \"ba0f744b-7d32-4337-be25-f4f6326aaec2\" (UID: \"ba0f744b-7d32-4337-be25-f4f6326aaec2\") " Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.368330 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8691f972-205f-470b-b300-40c32106704b-combined-ca-bundle\") pod \"8691f972-205f-470b-b300-40c32106704b\" (UID: \"8691f972-205f-470b-b300-40c32106704b\") " Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.368399 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51f36c18-61cd-43d1-98a6-b569197c9382-logs\") pod \"51f36c18-61cd-43d1-98a6-b569197c9382\" (UID: \"51f36c18-61cd-43d1-98a6-b569197c9382\") " Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.368417 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba0f744b-7d32-4337-be25-f4f6326aaec2-run-httpd\") pod \"ba0f744b-7d32-4337-be25-f4f6326aaec2\" (UID: \"ba0f744b-7d32-4337-be25-f4f6326aaec2\") " Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.368431 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bd7br\" (UniqueName: \"kubernetes.io/projected/ba0f744b-7d32-4337-be25-f4f6326aaec2-kube-api-access-bd7br\") pod \"ba0f744b-7d32-4337-be25-f4f6326aaec2\" (UID: \"ba0f744b-7d32-4337-be25-f4f6326aaec2\") " Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.368453 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ba0f744b-7d32-4337-be25-f4f6326aaec2-sg-core-conf-yaml\") pod \"ba0f744b-7d32-4337-be25-f4f6326aaec2\" (UID: \"ba0f744b-7d32-4337-be25-f4f6326aaec2\") " Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.368470 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxs84\" (UniqueName: \"kubernetes.io/projected/51f36c18-61cd-43d1-98a6-b569197c9382-kube-api-access-xxs84\") pod \"51f36c18-61cd-43d1-98a6-b569197c9382\" (UID: \"51f36c18-61cd-43d1-98a6-b569197c9382\") " Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 
08:38:20.368770 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/073c6d0a-3fea-4e0b-8f8f-f58613b4bf23-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.368783 5025 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/073c6d0a-3fea-4e0b-8f8f-f58613b4bf23-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.368792 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxzgb\" (UniqueName: \"kubernetes.io/projected/073c6d0a-3fea-4e0b-8f8f-f58613b4bf23-kube-api-access-fxzgb\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.370812 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51f36c18-61cd-43d1-98a6-b569197c9382-logs" (OuterVolumeSpecName: "logs") pod "51f36c18-61cd-43d1-98a6-b569197c9382" (UID: "51f36c18-61cd-43d1-98a6-b569197c9382"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.373410 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51f36c18-61cd-43d1-98a6-b569197c9382-kube-api-access-xxs84" (OuterVolumeSpecName: "kube-api-access-xxs84") pod "51f36c18-61cd-43d1-98a6-b569197c9382" (UID: "51f36c18-61cd-43d1-98a6-b569197c9382"). InnerVolumeSpecName "kube-api-access-xxs84". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.373513 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba0f744b-7d32-4337-be25-f4f6326aaec2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ba0f744b-7d32-4337-be25-f4f6326aaec2" (UID: "ba0f744b-7d32-4337-be25-f4f6326aaec2"). 
InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.373970 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51f36c18-61cd-43d1-98a6-b569197c9382-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "51f36c18-61cd-43d1-98a6-b569197c9382" (UID: "51f36c18-61cd-43d1-98a6-b569197c9382"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.373987 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba0f744b-7d32-4337-be25-f4f6326aaec2-kube-api-access-bd7br" (OuterVolumeSpecName: "kube-api-access-bd7br") pod "ba0f744b-7d32-4337-be25-f4f6326aaec2" (UID: "ba0f744b-7d32-4337-be25-f4f6326aaec2"). InnerVolumeSpecName "kube-api-access-bd7br". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.374767 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba0f744b-7d32-4337-be25-f4f6326aaec2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ba0f744b-7d32-4337-be25-f4f6326aaec2" (UID: "ba0f744b-7d32-4337-be25-f4f6326aaec2"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.381990 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.383403 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-684bd87b6d-w58z5"] Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.390272 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-684bd87b6d-w58z5"] Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.395159 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba0f744b-7d32-4337-be25-f4f6326aaec2-scripts" (OuterVolumeSpecName: "scripts") pod "ba0f744b-7d32-4337-be25-f4f6326aaec2" (UID: "ba0f744b-7d32-4337-be25-f4f6326aaec2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.397258 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.406119 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8691f972-205f-470b-b300-40c32106704b-kube-api-access-v2p2m" (OuterVolumeSpecName: "kube-api-access-v2p2m") pod "8691f972-205f-470b-b300-40c32106704b" (UID: "8691f972-205f-470b-b300-40c32106704b"). InnerVolumeSpecName "kube-api-access-v2p2m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.406163 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.421828 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.423211 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba0f744b-7d32-4337-be25-f4f6326aaec2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ba0f744b-7d32-4337-be25-f4f6326aaec2" (UID: "ba0f744b-7d32-4337-be25-f4f6326aaec2"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.434944 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.443395 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51f36c18-61cd-43d1-98a6-b569197c9382-config-data" (OuterVolumeSpecName: "config-data") pod "51f36c18-61cd-43d1-98a6-b569197c9382" (UID: "51f36c18-61cd-43d1-98a6-b569197c9382"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.444286 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51f36c18-61cd-43d1-98a6-b569197c9382-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "51f36c18-61cd-43d1-98a6-b569197c9382" (UID: "51f36c18-61cd-43d1-98a6-b569197c9382"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.452874 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8691f972-205f-470b-b300-40c32106704b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8691f972-205f-470b-b300-40c32106704b" (UID: "8691f972-205f-470b-b300-40c32106704b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.469556 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/007c50ab-22f3-4b11-a7e9-3890acbe7e03-kolla-config\") pod \"007c50ab-22f3-4b11-a7e9-3890acbe7e03\" (UID: \"007c50ab-22f3-4b11-a7e9-3890acbe7e03\") " Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.469598 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/007c50ab-22f3-4b11-a7e9-3890acbe7e03-operator-scripts\") pod \"007c50ab-22f3-4b11-a7e9-3890acbe7e03\" (UID: \"007c50ab-22f3-4b11-a7e9-3890acbe7e03\") " Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.469626 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/007c50ab-22f3-4b11-a7e9-3890acbe7e03-secrets\") pod \"007c50ab-22f3-4b11-a7e9-3890acbe7e03\" (UID: \"007c50ab-22f3-4b11-a7e9-3890acbe7e03\") " Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.469707 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mz5b\" (UniqueName: \"kubernetes.io/projected/007c50ab-22f3-4b11-a7e9-3890acbe7e03-kube-api-access-5mz5b\") pod \"007c50ab-22f3-4b11-a7e9-3890acbe7e03\" (UID: \"007c50ab-22f3-4b11-a7e9-3890acbe7e03\") " Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.469752 5025 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/007c50ab-22f3-4b11-a7e9-3890acbe7e03-config-data-default\") pod \"007c50ab-22f3-4b11-a7e9-3890acbe7e03\" (UID: \"007c50ab-22f3-4b11-a7e9-3890acbe7e03\") " Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.469777 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/007c50ab-22f3-4b11-a7e9-3890acbe7e03-galera-tls-certs\") pod \"007c50ab-22f3-4b11-a7e9-3890acbe7e03\" (UID: \"007c50ab-22f3-4b11-a7e9-3890acbe7e03\") " Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.469850 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/007c50ab-22f3-4b11-a7e9-3890acbe7e03-combined-ca-bundle\") pod \"007c50ab-22f3-4b11-a7e9-3890acbe7e03\" (UID: \"007c50ab-22f3-4b11-a7e9-3890acbe7e03\") " Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.469877 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"007c50ab-22f3-4b11-a7e9-3890acbe7e03\" (UID: \"007c50ab-22f3-4b11-a7e9-3890acbe7e03\") " Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.469921 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/007c50ab-22f3-4b11-a7e9-3890acbe7e03-config-data-generated\") pod \"007c50ab-22f3-4b11-a7e9-3890acbe7e03\" (UID: \"007c50ab-22f3-4b11-a7e9-3890acbe7e03\") " Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.470182 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/007c50ab-22f3-4b11-a7e9-3890acbe7e03-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"007c50ab-22f3-4b11-a7e9-3890acbe7e03" (UID: "007c50ab-22f3-4b11-a7e9-3890acbe7e03"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.470209 5025 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51f36c18-61cd-43d1-98a6-b569197c9382-logs\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.470222 5025 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba0f744b-7d32-4337-be25-f4f6326aaec2-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.470232 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bd7br\" (UniqueName: \"kubernetes.io/projected/ba0f744b-7d32-4337-be25-f4f6326aaec2-kube-api-access-bd7br\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.470245 5025 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ba0f744b-7d32-4337-be25-f4f6326aaec2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.470254 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxs84\" (UniqueName: \"kubernetes.io/projected/51f36c18-61cd-43d1-98a6-b569197c9382-kube-api-access-xxs84\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.470263 5025 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba0f744b-7d32-4337-be25-f4f6326aaec2-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.470271 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/51f36c18-61cd-43d1-98a6-b569197c9382-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.470279 5025 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/51f36c18-61cd-43d1-98a6-b569197c9382-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.470287 5025 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51f36c18-61cd-43d1-98a6-b569197c9382-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.470296 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2p2m\" (UniqueName: \"kubernetes.io/projected/8691f972-205f-470b-b300-40c32106704b-kube-api-access-v2p2m\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.470304 5025 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba0f744b-7d32-4337-be25-f4f6326aaec2-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.470312 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8691f972-205f-470b-b300-40c32106704b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:20 crc kubenswrapper[5025]: E1007 08:38:20.470366 5025 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 07 08:38:20 crc kubenswrapper[5025]: E1007 08:38:20.470409 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-config-data podName:e5a4ae5a-0b64-481f-a54d-2263de0eae8e nodeName:}" failed. 
No retries permitted until 2025-10-07 08:38:28.470395553 +0000 UTC m=+1315.279709697 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-config-data") pod "rabbitmq-cell1-server-0" (UID: "e5a4ae5a-0b64-481f-a54d-2263de0eae8e") : configmap "rabbitmq-cell1-config-data" not found Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.470880 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba0f744b-7d32-4337-be25-f4f6326aaec2-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "ba0f744b-7d32-4337-be25-f4f6326aaec2" (UID: "ba0f744b-7d32-4337-be25-f4f6326aaec2"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.470880 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/007c50ab-22f3-4b11-a7e9-3890acbe7e03-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "007c50ab-22f3-4b11-a7e9-3890acbe7e03" (UID: "007c50ab-22f3-4b11-a7e9-3890acbe7e03"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.472110 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/007c50ab-22f3-4b11-a7e9-3890acbe7e03-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "007c50ab-22f3-4b11-a7e9-3890acbe7e03" (UID: "007c50ab-22f3-4b11-a7e9-3890acbe7e03"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.472365 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/007c50ab-22f3-4b11-a7e9-3890acbe7e03-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "007c50ab-22f3-4b11-a7e9-3890acbe7e03" (UID: "007c50ab-22f3-4b11-a7e9-3890acbe7e03"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.474813 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/007c50ab-22f3-4b11-a7e9-3890acbe7e03-kube-api-access-5mz5b" (OuterVolumeSpecName: "kube-api-access-5mz5b") pod "007c50ab-22f3-4b11-a7e9-3890acbe7e03" (UID: "007c50ab-22f3-4b11-a7e9-3890acbe7e03"). InnerVolumeSpecName "kube-api-access-5mz5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.482109 5025 scope.go:117] "RemoveContainer" containerID="9226a61a229bb35a76b9b1993c0b6dacb2bab79faa138dc6bf20e3d356c1fca2" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.482499 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/007c50ab-22f3-4b11-a7e9-3890acbe7e03-secrets" (OuterVolumeSpecName: "secrets") pod "007c50ab-22f3-4b11-a7e9-3890acbe7e03" (UID: "007c50ab-22f3-4b11-a7e9-3890acbe7e03"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.496675 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba0f744b-7d32-4337-be25-f4f6326aaec2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ba0f744b-7d32-4337-be25-f4f6326aaec2" (UID: "ba0f744b-7d32-4337-be25-f4f6326aaec2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.502827 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "mysql-db") pod "007c50ab-22f3-4b11-a7e9-3890acbe7e03" (UID: "007c50ab-22f3-4b11-a7e9-3890acbe7e03"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.503866 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/007c50ab-22f3-4b11-a7e9-3890acbe7e03-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "007c50ab-22f3-4b11-a7e9-3890acbe7e03" (UID: "007c50ab-22f3-4b11-a7e9-3890acbe7e03"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.510601 5025 scope.go:117] "RemoveContainer" containerID="049eca553f3f87ee8bd4b31b64602c569fc7bd0383ab4e766e48144b4bd7fa38" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.516945 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8691f972-205f-470b-b300-40c32106704b-config-data" (OuterVolumeSpecName: "config-data") pod "8691f972-205f-470b-b300-40c32106704b" (UID: "8691f972-205f-470b-b300-40c32106704b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.517260 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-56bb8cc8-59x6x"] Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.528709 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-56bb8cc8-59x6x"] Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.536246 5025 scope.go:117] "RemoveContainer" containerID="231456d1858e1cef2aee2ee0a6203c2580d725cb74b9ea172142e35834648263" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.546788 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/007c50ab-22f3-4b11-a7e9-3890acbe7e03-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "007c50ab-22f3-4b11-a7e9-3890acbe7e03" (UID: "007c50ab-22f3-4b11-a7e9-3890acbe7e03"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.561416 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba0f744b-7d32-4337-be25-f4f6326aaec2-config-data" (OuterVolumeSpecName: "config-data") pod "ba0f744b-7d32-4337-be25-f4f6326aaec2" (UID: "ba0f744b-7d32-4337-be25-f4f6326aaec2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.562919 5025 scope.go:117] "RemoveContainer" containerID="049eca553f3f87ee8bd4b31b64602c569fc7bd0383ab4e766e48144b4bd7fa38" Oct 07 08:38:20 crc kubenswrapper[5025]: E1007 08:38:20.566612 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"049eca553f3f87ee8bd4b31b64602c569fc7bd0383ab4e766e48144b4bd7fa38\": container with ID starting with 049eca553f3f87ee8bd4b31b64602c569fc7bd0383ab4e766e48144b4bd7fa38 not found: ID does not exist" containerID="049eca553f3f87ee8bd4b31b64602c569fc7bd0383ab4e766e48144b4bd7fa38" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.566661 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"049eca553f3f87ee8bd4b31b64602c569fc7bd0383ab4e766e48144b4bd7fa38"} err="failed to get container status \"049eca553f3f87ee8bd4b31b64602c569fc7bd0383ab4e766e48144b4bd7fa38\": rpc error: code = NotFound desc = could not find container \"049eca553f3f87ee8bd4b31b64602c569fc7bd0383ab4e766e48144b4bd7fa38\": container with ID starting with 049eca553f3f87ee8bd4b31b64602c569fc7bd0383ab4e766e48144b4bd7fa38 not found: ID does not exist" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.566694 5025 scope.go:117] "RemoveContainer" containerID="231456d1858e1cef2aee2ee0a6203c2580d725cb74b9ea172142e35834648263" Oct 07 08:38:20 crc kubenswrapper[5025]: E1007 08:38:20.567025 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"231456d1858e1cef2aee2ee0a6203c2580d725cb74b9ea172142e35834648263\": container with ID starting with 231456d1858e1cef2aee2ee0a6203c2580d725cb74b9ea172142e35834648263 not found: ID does not exist" containerID="231456d1858e1cef2aee2ee0a6203c2580d725cb74b9ea172142e35834648263" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.567057 
5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"231456d1858e1cef2aee2ee0a6203c2580d725cb74b9ea172142e35834648263"} err="failed to get container status \"231456d1858e1cef2aee2ee0a6203c2580d725cb74b9ea172142e35834648263\": rpc error: code = NotFound desc = could not find container \"231456d1858e1cef2aee2ee0a6203c2580d725cb74b9ea172142e35834648263\": container with ID starting with 231456d1858e1cef2aee2ee0a6203c2580d725cb74b9ea172142e35834648263 not found: ID does not exist" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.567075 5025 scope.go:117] "RemoveContainer" containerID="7148fbdd1c0e248299c7313ecafd5bd390337303adc23a3d97fe357b1062f6e1" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.572091 5025 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/007c50ab-22f3-4b11-a7e9-3890acbe7e03-config-data-generated\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.572118 5025 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/007c50ab-22f3-4b11-a7e9-3890acbe7e03-kolla-config\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.572134 5025 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/007c50ab-22f3-4b11-a7e9-3890acbe7e03-operator-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.572147 5025 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/007c50ab-22f3-4b11-a7e9-3890acbe7e03-secrets\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.572158 5025 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8691f972-205f-470b-b300-40c32106704b-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.572169 5025 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba0f744b-7d32-4337-be25-f4f6326aaec2-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.572180 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mz5b\" (UniqueName: \"kubernetes.io/projected/007c50ab-22f3-4b11-a7e9-3890acbe7e03-kube-api-access-5mz5b\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.572191 5025 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba0f744b-7d32-4337-be25-f4f6326aaec2-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.572199 5025 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/007c50ab-22f3-4b11-a7e9-3890acbe7e03-config-data-default\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.572207 5025 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/007c50ab-22f3-4b11-a7e9-3890acbe7e03-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.572216 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/007c50ab-22f3-4b11-a7e9-3890acbe7e03-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.572237 5025 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " 
Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.572681 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba0f744b-7d32-4337-be25-f4f6326aaec2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.591234 5025 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.594503 5025 scope.go:117] "RemoveContainer" containerID="7148fbdd1c0e248299c7313ecafd5bd390337303adc23a3d97fe357b1062f6e1" Oct 07 08:38:20 crc kubenswrapper[5025]: E1007 08:38:20.594885 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7148fbdd1c0e248299c7313ecafd5bd390337303adc23a3d97fe357b1062f6e1\": container with ID starting with 7148fbdd1c0e248299c7313ecafd5bd390337303adc23a3d97fe357b1062f6e1 not found: ID does not exist" containerID="7148fbdd1c0e248299c7313ecafd5bd390337303adc23a3d97fe357b1062f6e1" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.594922 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7148fbdd1c0e248299c7313ecafd5bd390337303adc23a3d97fe357b1062f6e1"} err="failed to get container status \"7148fbdd1c0e248299c7313ecafd5bd390337303adc23a3d97fe357b1062f6e1\": rpc error: code = NotFound desc = could not find container \"7148fbdd1c0e248299c7313ecafd5bd390337303adc23a3d97fe357b1062f6e1\": container with ID starting with 7148fbdd1c0e248299c7313ecafd5bd390337303adc23a3d97fe357b1062f6e1 not found: ID does not exist" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.594947 5025 scope.go:117] "RemoveContainer" containerID="7a68b397232883f8b41408f440a7388ce6e6907c314518b807a6e0906db97ca4" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.631132 5025 
scope.go:117] "RemoveContainer" containerID="6a50d194060b952445e87adbfe5b99cb78c05999f4bf5f7b6625b7135d66eafa" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.653820 5025 scope.go:117] "RemoveContainer" containerID="7a68b397232883f8b41408f440a7388ce6e6907c314518b807a6e0906db97ca4" Oct 07 08:38:20 crc kubenswrapper[5025]: E1007 08:38:20.654348 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a68b397232883f8b41408f440a7388ce6e6907c314518b807a6e0906db97ca4\": container with ID starting with 7a68b397232883f8b41408f440a7388ce6e6907c314518b807a6e0906db97ca4 not found: ID does not exist" containerID="7a68b397232883f8b41408f440a7388ce6e6907c314518b807a6e0906db97ca4" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.654390 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a68b397232883f8b41408f440a7388ce6e6907c314518b807a6e0906db97ca4"} err="failed to get container status \"7a68b397232883f8b41408f440a7388ce6e6907c314518b807a6e0906db97ca4\": rpc error: code = NotFound desc = could not find container \"7a68b397232883f8b41408f440a7388ce6e6907c314518b807a6e0906db97ca4\": container with ID starting with 7a68b397232883f8b41408f440a7388ce6e6907c314518b807a6e0906db97ca4 not found: ID does not exist" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.654422 5025 scope.go:117] "RemoveContainer" containerID="6a50d194060b952445e87adbfe5b99cb78c05999f4bf5f7b6625b7135d66eafa" Oct 07 08:38:20 crc kubenswrapper[5025]: E1007 08:38:20.654762 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a50d194060b952445e87adbfe5b99cb78c05999f4bf5f7b6625b7135d66eafa\": container with ID starting with 6a50d194060b952445e87adbfe5b99cb78c05999f4bf5f7b6625b7135d66eafa not found: ID does not exist" containerID="6a50d194060b952445e87adbfe5b99cb78c05999f4bf5f7b6625b7135d66eafa" Oct 07 
08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.654796 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a50d194060b952445e87adbfe5b99cb78c05999f4bf5f7b6625b7135d66eafa"} err="failed to get container status \"6a50d194060b952445e87adbfe5b99cb78c05999f4bf5f7b6625b7135d66eafa\": rpc error: code = NotFound desc = could not find container \"6a50d194060b952445e87adbfe5b99cb78c05999f4bf5f7b6625b7135d66eafa\": container with ID starting with 6a50d194060b952445e87adbfe5b99cb78c05999f4bf5f7b6625b7135d66eafa not found: ID does not exist" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.674174 5025 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.708715 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_dd481497-9604-41b6-918a-a8ec9fd6ad92/ovn-northd/0.log" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.708794 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.738796 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"073c6d0a-3fea-4e0b-8f8f-f58613b4bf23","Type":"ContainerDied","Data":"ca6bec0365e791f9961afde97c8eb9e181b2043eded1992f8ad467c13e2a43ac"} Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.739162 5025 scope.go:117] "RemoveContainer" containerID="8fadf8cac92c695c60339b695323e8872f1a70c0fa6297fd6b45f17e85f81906" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.739279 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.753792 5025 generic.go:334] "Generic (PLEG): container finished" podID="8691f972-205f-470b-b300-40c32106704b" containerID="8b00f65fe97b0c9eded824f24e8ddfa13e3e54ab7d2347fbe976f327251fe700" exitCode=0 Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.753845 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8691f972-205f-470b-b300-40c32106704b","Type":"ContainerDied","Data":"8b00f65fe97b0c9eded824f24e8ddfa13e3e54ab7d2347fbe976f327251fe700"} Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.753868 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8691f972-205f-470b-b300-40c32106704b","Type":"ContainerDied","Data":"861bed56c80e581c2932c0abae9710153d8f70c1568b37c7027b67d001030917"} Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.753907 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.773288 5025 generic.go:334] "Generic (PLEG): container finished" podID="007c50ab-22f3-4b11-a7e9-3890acbe7e03" containerID="7aac254398470b37b4db38fadefb4db89196365fd7352f5a4e2ee90ac0aa37b8" exitCode=0 Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.773333 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"007c50ab-22f3-4b11-a7e9-3890acbe7e03","Type":"ContainerDied","Data":"7aac254398470b37b4db38fadefb4db89196365fd7352f5a4e2ee90ac0aa37b8"} Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.778409 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"007c50ab-22f3-4b11-a7e9-3890acbe7e03","Type":"ContainerDied","Data":"3bf4a6b44cf327941278ecd9e4d8995e1457287a1f40777d3444505e068c5c94"} Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.778688 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5495b78bc8-wbf5t" event={"ID":"51f36c18-61cd-43d1-98a6-b569197c9382","Type":"ContainerDied","Data":"25dae93d3e9a16a849ae543e7e7ae79456846f66e20eb8989152f27dc41f9228"} Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.775030 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd481497-9604-41b6-918a-a8ec9fd6ad92-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "dd481497-9604-41b6-918a-a8ec9fd6ad92" (UID: "dd481497-9604-41b6-918a-a8ec9fd6ad92"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.776821 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5495b78bc8-wbf5t" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.773355 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.774696 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dd481497-9604-41b6-918a-a8ec9fd6ad92-ovn-rundir\") pod \"dd481497-9604-41b6-918a-a8ec9fd6ad92\" (UID: \"dd481497-9604-41b6-918a-a8ec9fd6ad92\") " Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.779332 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd481497-9604-41b6-918a-a8ec9fd6ad92-config\") pod \"dd481497-9604-41b6-918a-a8ec9fd6ad92\" (UID: \"dd481497-9604-41b6-918a-a8ec9fd6ad92\") " Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.779375 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd481497-9604-41b6-918a-a8ec9fd6ad92-ovn-northd-tls-certs\") pod \"dd481497-9604-41b6-918a-a8ec9fd6ad92\" (UID: \"dd481497-9604-41b6-918a-a8ec9fd6ad92\") " Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.779407 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khlvp\" (UniqueName: \"kubernetes.io/projected/dd481497-9604-41b6-918a-a8ec9fd6ad92-kube-api-access-khlvp\") pod \"dd481497-9604-41b6-918a-a8ec9fd6ad92\" (UID: \"dd481497-9604-41b6-918a-a8ec9fd6ad92\") " Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.779445 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd481497-9604-41b6-918a-a8ec9fd6ad92-metrics-certs-tls-certs\") pod \"dd481497-9604-41b6-918a-a8ec9fd6ad92\" (UID: \"dd481497-9604-41b6-918a-a8ec9fd6ad92\") " Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.779480 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/dd481497-9604-41b6-918a-a8ec9fd6ad92-scripts\") pod \"dd481497-9604-41b6-918a-a8ec9fd6ad92\" (UID: \"dd481497-9604-41b6-918a-a8ec9fd6ad92\") " Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.779527 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd481497-9604-41b6-918a-a8ec9fd6ad92-combined-ca-bundle\") pod \"dd481497-9604-41b6-918a-a8ec9fd6ad92\" (UID: \"dd481497-9604-41b6-918a-a8ec9fd6ad92\") " Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.779987 5025 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dd481497-9604-41b6-918a-a8ec9fd6ad92-ovn-rundir\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.784152 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd481497-9604-41b6-918a-a8ec9fd6ad92-scripts" (OuterVolumeSpecName: "scripts") pod "dd481497-9604-41b6-918a-a8ec9fd6ad92" (UID: "dd481497-9604-41b6-918a-a8ec9fd6ad92"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.784752 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd481497-9604-41b6-918a-a8ec9fd6ad92-config" (OuterVolumeSpecName: "config") pod "dd481497-9604-41b6-918a-a8ec9fd6ad92" (UID: "dd481497-9604-41b6-918a-a8ec9fd6ad92"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.794931 5025 scope.go:117] "RemoveContainer" containerID="8b00f65fe97b0c9eded824f24e8ddfa13e3e54ab7d2347fbe976f327251fe700" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.795464 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd481497-9604-41b6-918a-a8ec9fd6ad92-kube-api-access-khlvp" (OuterVolumeSpecName: "kube-api-access-khlvp") pod "dd481497-9604-41b6-918a-a8ec9fd6ad92" (UID: "dd481497-9604-41b6-918a-a8ec9fd6ad92"). InnerVolumeSpecName "kube-api-access-khlvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.808987 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.824633 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.824844 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd481497-9604-41b6-918a-a8ec9fd6ad92-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd481497-9604-41b6-918a-a8ec9fd6ad92" (UID: "dd481497-9604-41b6-918a-a8ec9fd6ad92"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.846295 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.852286 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba0f744b-7d32-4337-be25-f4f6326aaec2","Type":"ContainerDied","Data":"69c77d646d60cef40121eee839e1d3d2a631574a1f61722adde93696addfd418"} Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.852438 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.860477 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.875971 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_dd481497-9604-41b6-918a-a8ec9fd6ad92/ovn-northd/0.log" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.876011 5025 generic.go:334] "Generic (PLEG): container finished" podID="dd481497-9604-41b6-918a-a8ec9fd6ad92" containerID="ec2f854063009f0fa5535dfd83f2d60df934fb8c26cc1fa50754cfaf068846d7" exitCode=139 Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.876054 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"dd481497-9604-41b6-918a-a8ec9fd6ad92","Type":"ContainerDied","Data":"ec2f854063009f0fa5535dfd83f2d60df934fb8c26cc1fa50754cfaf068846d7"} Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.876079 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"dd481497-9604-41b6-918a-a8ec9fd6ad92","Type":"ContainerDied","Data":"676f8f36d75f61b1116c182d98be523441bd0a5a39ef78e376a8037326db1f1e"} Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.876154 5025 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.881061 5025 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd481497-9604-41b6-918a-a8ec9fd6ad92-config\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.881089 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khlvp\" (UniqueName: \"kubernetes.io/projected/dd481497-9604-41b6-918a-a8ec9fd6ad92-kube-api-access-khlvp\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.881102 5025 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd481497-9604-41b6-918a-a8ec9fd6ad92-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.881111 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd481497-9604-41b6-918a-a8ec9fd6ad92-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.883285 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone1b46-account-delete-sszts" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.883685 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7c8774d5b7-qj8g5" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.885017 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7c8774d5b7-qj8g5" event={"ID":"6a8356fc-f4bd-4853-a3f3-0d44ab20612b","Type":"ContainerDied","Data":"ecf9c0eb66f8b326535013cd9b64e868308df29c17feb09d2ddbf3f6f14c7fdc"} Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.888528 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd481497-9604-41b6-918a-a8ec9fd6ad92-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "dd481497-9604-41b6-918a-a8ec9fd6ad92" (UID: "dd481497-9604-41b6-918a-a8ec9fd6ad92"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.899469 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd481497-9604-41b6-918a-a8ec9fd6ad92-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "dd481497-9604-41b6-918a-a8ec9fd6ad92" (UID: "dd481497-9604-41b6-918a-a8ec9fd6ad92"). InnerVolumeSpecName "ovn-northd-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.982343 5025 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd481497-9604-41b6-918a-a8ec9fd6ad92-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.982380 5025 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd481497-9604-41b6-918a-a8ec9fd6ad92-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.998691 5025 scope.go:117] "RemoveContainer" containerID="8b00f65fe97b0c9eded824f24e8ddfa13e3e54ab7d2347fbe976f327251fe700" Oct 07 08:38:20 crc kubenswrapper[5025]: E1007 08:38:20.999034 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b00f65fe97b0c9eded824f24e8ddfa13e3e54ab7d2347fbe976f327251fe700\": container with ID starting with 8b00f65fe97b0c9eded824f24e8ddfa13e3e54ab7d2347fbe976f327251fe700 not found: ID does not exist" containerID="8b00f65fe97b0c9eded824f24e8ddfa13e3e54ab7d2347fbe976f327251fe700" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.999071 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b00f65fe97b0c9eded824f24e8ddfa13e3e54ab7d2347fbe976f327251fe700"} err="failed to get container status \"8b00f65fe97b0c9eded824f24e8ddfa13e3e54ab7d2347fbe976f327251fe700\": rpc error: code = NotFound desc = could not find container \"8b00f65fe97b0c9eded824f24e8ddfa13e3e54ab7d2347fbe976f327251fe700\": container with ID starting with 8b00f65fe97b0c9eded824f24e8ddfa13e3e54ab7d2347fbe976f327251fe700 not found: ID does not exist" Oct 07 08:38:20 crc kubenswrapper[5025]: I1007 08:38:20.999087 5025 scope.go:117] "RemoveContainer" 
containerID="7aac254398470b37b4db38fadefb4db89196365fd7352f5a4e2ee90ac0aa37b8" Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.015926 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.025815 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.030685 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.033238 5025 scope.go:117] "RemoveContainer" containerID="aba1ffe1ebad18949637fe137e6f2b9ad04cbb81bbe6f3e97644dbcb4b80f98f" Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.043004 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.068219 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone1b46-account-delete-sszts"] Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.074427 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone1b46-account-delete-sszts"] Oct 07 08:38:21 crc kubenswrapper[5025]: E1007 08:38:21.083816 5025 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 07 08:38:21 crc kubenswrapper[5025]: E1007 08:38:21.083879 5025 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d46577dd-b38b-4b80-ad57-577629e648b8-config-data podName:d46577dd-b38b-4b80-ad57-577629e648b8 nodeName:}" failed. No retries permitted until 2025-10-07 08:38:29.083865199 +0000 UTC m=+1315.893179343 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/d46577dd-b38b-4b80-ad57-577629e648b8-config-data") pod "rabbitmq-server-0" (UID: "d46577dd-b38b-4b80-ad57-577629e648b8") : configmap "rabbitmq-config-data" not found Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.086930 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7c8774d5b7-qj8g5"] Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.090126 5025 scope.go:117] "RemoveContainer" containerID="7aac254398470b37b4db38fadefb4db89196365fd7352f5a4e2ee90ac0aa37b8" Oct 07 08:38:21 crc kubenswrapper[5025]: E1007 08:38:21.090485 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7aac254398470b37b4db38fadefb4db89196365fd7352f5a4e2ee90ac0aa37b8\": container with ID starting with 7aac254398470b37b4db38fadefb4db89196365fd7352f5a4e2ee90ac0aa37b8 not found: ID does not exist" containerID="7aac254398470b37b4db38fadefb4db89196365fd7352f5a4e2ee90ac0aa37b8" Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.090510 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7aac254398470b37b4db38fadefb4db89196365fd7352f5a4e2ee90ac0aa37b8"} err="failed to get container status \"7aac254398470b37b4db38fadefb4db89196365fd7352f5a4e2ee90ac0aa37b8\": rpc error: code = NotFound desc = could not find container \"7aac254398470b37b4db38fadefb4db89196365fd7352f5a4e2ee90ac0aa37b8\": container with ID starting with 7aac254398470b37b4db38fadefb4db89196365fd7352f5a4e2ee90ac0aa37b8 not found: ID does not exist" Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.090528 5025 scope.go:117] "RemoveContainer" containerID="aba1ffe1ebad18949637fe137e6f2b9ad04cbb81bbe6f3e97644dbcb4b80f98f" Oct 07 08:38:21 crc kubenswrapper[5025]: E1007 08:38:21.090759 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"aba1ffe1ebad18949637fe137e6f2b9ad04cbb81bbe6f3e97644dbcb4b80f98f\": container with ID starting with aba1ffe1ebad18949637fe137e6f2b9ad04cbb81bbe6f3e97644dbcb4b80f98f not found: ID does not exist" containerID="aba1ffe1ebad18949637fe137e6f2b9ad04cbb81bbe6f3e97644dbcb4b80f98f" Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.090779 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aba1ffe1ebad18949637fe137e6f2b9ad04cbb81bbe6f3e97644dbcb4b80f98f"} err="failed to get container status \"aba1ffe1ebad18949637fe137e6f2b9ad04cbb81bbe6f3e97644dbcb4b80f98f\": rpc error: code = NotFound desc = could not find container \"aba1ffe1ebad18949637fe137e6f2b9ad04cbb81bbe6f3e97644dbcb4b80f98f\": container with ID starting with aba1ffe1ebad18949637fe137e6f2b9ad04cbb81bbe6f3e97644dbcb4b80f98f not found: ID does not exist" Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.090791 5025 scope.go:117] "RemoveContainer" containerID="541d0073c514c96dd102a8451f807366e43a748b3da042e4f4deb995f526e4db" Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.095895 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-7c8774d5b7-qj8g5"] Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.097680 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-5495b78bc8-wbf5t"] Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.102216 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-5495b78bc8-wbf5t"] Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.145438 5025 scope.go:117] "RemoveContainer" containerID="b7e10267acefb906d0890b494854e97353644178df770bee745cfcda0052c6b4" Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.171528 5025 scope.go:117] "RemoveContainer" containerID="a1a35e09f930c853a6483e0e5e192b2b3f584932be09beeb5bd496d456b573a1" Oct 07 08:38:21 crc 
kubenswrapper[5025]: I1007 08:38:21.185198 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2k27\" (UniqueName: \"kubernetes.io/projected/4211d4e2-eae9-4754-9bd1-42197c704bc6-kube-api-access-f2k27\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.215364 5025 scope.go:117] "RemoveContainer" containerID="7b52de27036b56d8d85f64dd500bd9547ec47d9718e92cc57aa2c682ce41fe24" Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.218721 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.224850 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.230928 5025 scope.go:117] "RemoveContainer" containerID="a4937f814a2ec01bb17a146ab80071941530367ba22e6ebb3a6adb3bf515dd0a" Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.249291 5025 scope.go:117] "RemoveContainer" containerID="0d4721c409ed563195fffda2f9874618fd3f3cc7746c05dc5decaf962347f14b" Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.296258 5025 scope.go:117] "RemoveContainer" containerID="abdf43c41da1eadb0bcc52432e2289bb3f13803711e150dbffe1f1e1034c1ede" Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.338248 5025 scope.go:117] "RemoveContainer" containerID="ec2f854063009f0fa5535dfd83f2d60df934fb8c26cc1fa50754cfaf068846d7" Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.359754 5025 scope.go:117] "RemoveContainer" containerID="abdf43c41da1eadb0bcc52432e2289bb3f13803711e150dbffe1f1e1034c1ede" Oct 07 08:38:21 crc kubenswrapper[5025]: E1007 08:38:21.361805 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abdf43c41da1eadb0bcc52432e2289bb3f13803711e150dbffe1f1e1034c1ede\": container with ID starting with abdf43c41da1eadb0bcc52432e2289bb3f13803711e150dbffe1f1e1034c1ede not found: 
ID does not exist" containerID="abdf43c41da1eadb0bcc52432e2289bb3f13803711e150dbffe1f1e1034c1ede" Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.361847 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abdf43c41da1eadb0bcc52432e2289bb3f13803711e150dbffe1f1e1034c1ede"} err="failed to get container status \"abdf43c41da1eadb0bcc52432e2289bb3f13803711e150dbffe1f1e1034c1ede\": rpc error: code = NotFound desc = could not find container \"abdf43c41da1eadb0bcc52432e2289bb3f13803711e150dbffe1f1e1034c1ede\": container with ID starting with abdf43c41da1eadb0bcc52432e2289bb3f13803711e150dbffe1f1e1034c1ede not found: ID does not exist" Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.361876 5025 scope.go:117] "RemoveContainer" containerID="ec2f854063009f0fa5535dfd83f2d60df934fb8c26cc1fa50754cfaf068846d7" Oct 07 08:38:21 crc kubenswrapper[5025]: E1007 08:38:21.362281 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec2f854063009f0fa5535dfd83f2d60df934fb8c26cc1fa50754cfaf068846d7\": container with ID starting with ec2f854063009f0fa5535dfd83f2d60df934fb8c26cc1fa50754cfaf068846d7 not found: ID does not exist" containerID="ec2f854063009f0fa5535dfd83f2d60df934fb8c26cc1fa50754cfaf068846d7" Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.362312 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec2f854063009f0fa5535dfd83f2d60df934fb8c26cc1fa50754cfaf068846d7"} err="failed to get container status \"ec2f854063009f0fa5535dfd83f2d60df934fb8c26cc1fa50754cfaf068846d7\": rpc error: code = NotFound desc = could not find container \"ec2f854063009f0fa5535dfd83f2d60df934fb8c26cc1fa50754cfaf068846d7\": container with ID starting with ec2f854063009f0fa5535dfd83f2d60df934fb8c26cc1fa50754cfaf068846d7 not found: ID does not exist" Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.362331 5025 
scope.go:117] "RemoveContainer" containerID="1d466d120650f5aa29db6aa35c48e75103701a48199f9d6ad44171efa028e65a" Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.381177 5025 scope.go:117] "RemoveContainer" containerID="878aa4d7a336162ac22f34d19599bf58c73742f602e586a96453e05637fc3934" Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.446107 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6f574885d6-269v8" Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.489751 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef-fernet-keys\") pod \"a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef\" (UID: \"a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef\") " Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.489800 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef-internal-tls-certs\") pod \"a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef\" (UID: \"a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef\") " Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.489920 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef-config-data\") pod \"a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef\" (UID: \"a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef\") " Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.489945 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef-scripts\") pod \"a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef\" (UID: \"a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef\") " Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.490008 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-tsjnd\" (UniqueName: \"kubernetes.io/projected/a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef-kube-api-access-tsjnd\") pod \"a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef\" (UID: \"a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef\") " Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.490091 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef-credential-keys\") pod \"a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef\" (UID: \"a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef\") " Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.490123 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef-combined-ca-bundle\") pod \"a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef\" (UID: \"a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef\") " Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.490145 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef-public-tls-certs\") pod \"a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef\" (UID: \"a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef\") " Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.495189 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef" (UID: "a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.495445 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef-kube-api-access-tsjnd" (OuterVolumeSpecName: "kube-api-access-tsjnd") pod "a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef" (UID: "a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef"). InnerVolumeSpecName "kube-api-access-tsjnd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.496674 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef-scripts" (OuterVolumeSpecName: "scripts") pod "a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef" (UID: "a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.498381 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef" (UID: "a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.515394 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef-config-data" (OuterVolumeSpecName: "config-data") pod "a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef" (UID: "a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.528629 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef" (UID: "a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.556842 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef" (UID: "a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.572106 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef" (UID: "a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.593505 5025 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.593535 5025 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.593561 5025 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.593569 5025 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.593579 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsjnd\" (UniqueName: \"kubernetes.io/projected/a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef-kube-api-access-tsjnd\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.593587 5025 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.593595 5025 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.593604 5025 
reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.833495 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.909530 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-erlang-cookie-secret\") pod \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\" (UID: \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\") "
Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.909631 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-config-data\") pod \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\" (UID: \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\") "
Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.909669 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-rabbitmq-confd\") pod \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\" (UID: \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\") "
Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.909708 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\" (UID: \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\") "
Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.909743 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-rabbitmq-erlang-cookie\") pod \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\" (UID: \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\") "
Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.909780 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-plugins-conf\") pod \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\" (UID: \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\") "
Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.909810 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-server-conf\") pod \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\" (UID: \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\") "
Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.909844 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-rabbitmq-plugins\") pod \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\" (UID: \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\") "
Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.909864 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-rabbitmq-tls\") pod \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\" (UID: \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\") "
Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.909940 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-pod-info\") pod \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\" (UID: \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\") "
Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.909962 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xprqv\" (UniqueName: \"kubernetes.io/projected/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-kube-api-access-xprqv\") pod \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\" (UID: \"e5a4ae5a-0b64-481f-a54d-2263de0eae8e\") "
Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.918283 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "e5a4ae5a-0b64-481f-a54d-2263de0eae8e" (UID: "e5a4ae5a-0b64-481f-a54d-2263de0eae8e"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.918764 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "e5a4ae5a-0b64-481f-a54d-2263de0eae8e" (UID: "e5a4ae5a-0b64-481f-a54d-2263de0eae8e"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.919218 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "e5a4ae5a-0b64-481f-a54d-2263de0eae8e" (UID: "e5a4ae5a-0b64-481f-a54d-2263de0eae8e"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.919282 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "persistence") pod "e5a4ae5a-0b64-481f-a54d-2263de0eae8e" (UID: "e5a4ae5a-0b64-481f-a54d-2263de0eae8e"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.919319 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "e5a4ae5a-0b64-481f-a54d-2263de0eae8e" (UID: "e5a4ae5a-0b64-481f-a54d-2263de0eae8e"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.919435 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "e5a4ae5a-0b64-481f-a54d-2263de0eae8e" (UID: "e5a4ae5a-0b64-481f-a54d-2263de0eae8e"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.923832 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-pod-info" (OuterVolumeSpecName: "pod-info") pod "e5a4ae5a-0b64-481f-a54d-2263de0eae8e" (UID: "e5a4ae5a-0b64-481f-a54d-2263de0eae8e"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.924229 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-kube-api-access-xprqv" (OuterVolumeSpecName: "kube-api-access-xprqv") pod "e5a4ae5a-0b64-481f-a54d-2263de0eae8e" (UID: "e5a4ae5a-0b64-481f-a54d-2263de0eae8e"). InnerVolumeSpecName "kube-api-access-xprqv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.930600 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="007c50ab-22f3-4b11-a7e9-3890acbe7e03" path="/var/lib/kubelet/pods/007c50ab-22f3-4b11-a7e9-3890acbe7e03/volumes"
Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.931238 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="073c6d0a-3fea-4e0b-8f8f-f58613b4bf23" path="/var/lib/kubelet/pods/073c6d0a-3fea-4e0b-8f8f-f58613b4bf23/volumes"
Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.931805 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19d1598f-6725-40cd-99bd-5ee1bf699225" path="/var/lib/kubelet/pods/19d1598f-6725-40cd-99bd-5ee1bf699225/volumes"
Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.934281 5025 generic.go:334] "Generic (PLEG): container finished" podID="a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef" containerID="765cf6b91f229f95c44f7ed7fc5d4c222450bb09dc9b1216f217141cfe04545c" exitCode=0
Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.934558 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6f574885d6-269v8"
Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.934866 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39a4256e-db59-4623-aefb-bad0c1412bf1" path="/var/lib/kubelet/pods/39a4256e-db59-4623-aefb-bad0c1412bf1/volumes"
Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.935476 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4211d4e2-eae9-4754-9bd1-42197c704bc6" path="/var/lib/kubelet/pods/4211d4e2-eae9-4754-9bd1-42197c704bc6/volumes"
Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.935840 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a892055-924e-4e4f-a625-b15b8a39c4bf" path="/var/lib/kubelet/pods/4a892055-924e-4e4f-a625-b15b8a39c4bf/volumes"
Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.936930 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51f36c18-61cd-43d1-98a6-b569197c9382" path="/var/lib/kubelet/pods/51f36c18-61cd-43d1-98a6-b569197c9382/volumes"
Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.937507 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a8356fc-f4bd-4853-a3f3-0d44ab20612b" path="/var/lib/kubelet/pods/6a8356fc-f4bd-4853-a3f3-0d44ab20612b/volumes"
Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.938127 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-config-data" (OuterVolumeSpecName: "config-data") pod "e5a4ae5a-0b64-481f-a54d-2263de0eae8e" (UID: "e5a4ae5a-0b64-481f-a54d-2263de0eae8e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.938357 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72a63b89-4f79-41a7-9e9a-e3c464d015a4" path="/var/lib/kubelet/pods/72a63b89-4f79-41a7-9e9a-e3c464d015a4/volumes"
Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.947507 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8691f972-205f-470b-b300-40c32106704b" path="/var/lib/kubelet/pods/8691f972-205f-470b-b300-40c32106704b/volumes"
Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.949786 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba0f744b-7d32-4337-be25-f4f6326aaec2" path="/var/lib/kubelet/pods/ba0f744b-7d32-4337-be25-f4f6326aaec2/volumes"
Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.950648 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c87571c0-a1b9-4187-a60c-dd66d9fbb301" path="/var/lib/kubelet/pods/c87571c0-a1b9-4187-a60c-dd66d9fbb301/volumes"
Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.952336 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd481497-9604-41b6-918a-a8ec9fd6ad92" path="/var/lib/kubelet/pods/dd481497-9604-41b6-918a-a8ec9fd6ad92/volumes"
Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.953749 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8" path="/var/lib/kubelet/pods/f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8/volumes"
Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.955240 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f66554ef-2742-4c2b-a6e4-58b8377f03fa" path="/var/lib/kubelet/pods/f66554ef-2742-4c2b-a6e4-58b8377f03fa/volumes"
Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.957356 5025 generic.go:334] "Generic (PLEG): container finished" podID="e5a4ae5a-0b64-481f-a54d-2263de0eae8e" containerID="36236520dcf8fa30b3e1930eff40a80e74047bab4d5ce7c7754f7977352da9ef" exitCode=0
Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.957627 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Oct 07 08:38:21 crc kubenswrapper[5025]: I1007 08:38:21.969824 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-server-conf" (OuterVolumeSpecName: "server-conf") pod "e5a4ae5a-0b64-481f-a54d-2263de0eae8e" (UID: "e5a4ae5a-0b64-481f-a54d-2263de0eae8e"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.002935 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "e5a4ae5a-0b64-481f-a54d-2263de0eae8e" (UID: "e5a4ae5a-0b64-481f-a54d-2263de0eae8e"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.011998 5025 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.012029 5025 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-config-data\") on node \"crc\" DevicePath \"\""
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.012078 5025 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.012105 5025 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" "
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.012116 5025 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.012130 5025 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-plugins-conf\") on node \"crc\" DevicePath \"\""
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.012171 5025 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-server-conf\") on node \"crc\" DevicePath \"\""
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.012182 5025 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.012192 5025 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.012203 5025 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-pod-info\") on node \"crc\" DevicePath \"\""
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.012214 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xprqv\" (UniqueName: \"kubernetes.io/projected/e5a4ae5a-0b64-481f-a54d-2263de0eae8e-kube-api-access-xprqv\") on node \"crc\" DevicePath \"\""
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.034936 5025 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc"
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.113803 5025 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\""
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.120335 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6f574885d6-269v8" event={"ID":"a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef","Type":"ContainerDied","Data":"765cf6b91f229f95c44f7ed7fc5d4c222450bb09dc9b1216f217141cfe04545c"}
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.120381 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6f574885d6-269v8" event={"ID":"a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef","Type":"ContainerDied","Data":"f12618ad34f2d0b545e6a747fc1c02cb8856d3c51ea493bbea188fd7690bd34e"}
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.120410 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e5a4ae5a-0b64-481f-a54d-2263de0eae8e","Type":"ContainerDied","Data":"36236520dcf8fa30b3e1930eff40a80e74047bab4d5ce7c7754f7977352da9ef"}
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.120425 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e5a4ae5a-0b64-481f-a54d-2263de0eae8e","Type":"ContainerDied","Data":"51d681999300e850220197bc7751e646ddb9d25bcc355393219c1ada22f907ed"}
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.120447 5025 scope.go:117] "RemoveContainer" containerID="765cf6b91f229f95c44f7ed7fc5d4c222450bb09dc9b1216f217141cfe04545c"
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.157966 5025 scope.go:117] "RemoveContainer" containerID="765cf6b91f229f95c44f7ed7fc5d4c222450bb09dc9b1216f217141cfe04545c"
Oct 07 08:38:22 crc kubenswrapper[5025]: E1007 08:38:22.163927 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"765cf6b91f229f95c44f7ed7fc5d4c222450bb09dc9b1216f217141cfe04545c\": container with ID starting with 765cf6b91f229f95c44f7ed7fc5d4c222450bb09dc9b1216f217141cfe04545c not found: ID does not exist" containerID="765cf6b91f229f95c44f7ed7fc5d4c222450bb09dc9b1216f217141cfe04545c"
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.163988 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"765cf6b91f229f95c44f7ed7fc5d4c222450bb09dc9b1216f217141cfe04545c"} err="failed to get container status \"765cf6b91f229f95c44f7ed7fc5d4c222450bb09dc9b1216f217141cfe04545c\": rpc error: code = NotFound desc = could not find container \"765cf6b91f229f95c44f7ed7fc5d4c222450bb09dc9b1216f217141cfe04545c\": container with ID starting with 765cf6b91f229f95c44f7ed7fc5d4c222450bb09dc9b1216f217141cfe04545c not found: ID does not exist"
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.164018 5025 scope.go:117] "RemoveContainer" containerID="36236520dcf8fa30b3e1930eff40a80e74047bab4d5ce7c7754f7977352da9ef"
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.198325 5025 scope.go:117] "RemoveContainer" containerID="058a5de955c114ee2c0505ce1d7a528c8b35245bc45298934de08a29873f5115"
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.221738 5025 scope.go:117] "RemoveContainer" containerID="36236520dcf8fa30b3e1930eff40a80e74047bab4d5ce7c7754f7977352da9ef"
Oct 07 08:38:22 crc kubenswrapper[5025]: E1007 08:38:22.222240 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36236520dcf8fa30b3e1930eff40a80e74047bab4d5ce7c7754f7977352da9ef\": container with ID starting with 36236520dcf8fa30b3e1930eff40a80e74047bab4d5ce7c7754f7977352da9ef not found: ID does not exist" containerID="36236520dcf8fa30b3e1930eff40a80e74047bab4d5ce7c7754f7977352da9ef"
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.222288 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36236520dcf8fa30b3e1930eff40a80e74047bab4d5ce7c7754f7977352da9ef"} err="failed to get container status \"36236520dcf8fa30b3e1930eff40a80e74047bab4d5ce7c7754f7977352da9ef\": rpc error: code = NotFound desc = could not find container \"36236520dcf8fa30b3e1930eff40a80e74047bab4d5ce7c7754f7977352da9ef\": container with ID starting with 36236520dcf8fa30b3e1930eff40a80e74047bab4d5ce7c7754f7977352da9ef not found: ID does not exist"
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.222309 5025 scope.go:117] "RemoveContainer" containerID="058a5de955c114ee2c0505ce1d7a528c8b35245bc45298934de08a29873f5115"
Oct 07 08:38:22 crc kubenswrapper[5025]: E1007 08:38:22.222705 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"058a5de955c114ee2c0505ce1d7a528c8b35245bc45298934de08a29873f5115\": container with ID starting with 058a5de955c114ee2c0505ce1d7a528c8b35245bc45298934de08a29873f5115 not found: ID does not exist" containerID="058a5de955c114ee2c0505ce1d7a528c8b35245bc45298934de08a29873f5115"
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.222727 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"058a5de955c114ee2c0505ce1d7a528c8b35245bc45298934de08a29873f5115"} err="failed to get container status \"058a5de955c114ee2c0505ce1d7a528c8b35245bc45298934de08a29873f5115\": rpc error: code = NotFound desc = could not find container \"058a5de955c114ee2c0505ce1d7a528c8b35245bc45298934de08a29873f5115\": container with ID starting with 058a5de955c114ee2c0505ce1d7a528c8b35245bc45298934de08a29873f5115 not found: ID does not exist"
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.258164 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.307700 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.311225 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.316284 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d46577dd-b38b-4b80-ad57-577629e648b8-config-data\") pod \"d46577dd-b38b-4b80-ad57-577629e648b8\" (UID: \"d46577dd-b38b-4b80-ad57-577629e648b8\") "
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.316352 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d46577dd-b38b-4b80-ad57-577629e648b8-rabbitmq-erlang-cookie\") pod \"d46577dd-b38b-4b80-ad57-577629e648b8\" (UID: \"d46577dd-b38b-4b80-ad57-577629e648b8\") "
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.316409 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54kct\" (UniqueName: \"kubernetes.io/projected/d46577dd-b38b-4b80-ad57-577629e648b8-kube-api-access-54kct\") pod \"d46577dd-b38b-4b80-ad57-577629e648b8\" (UID: \"d46577dd-b38b-4b80-ad57-577629e648b8\") "
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.316450 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d46577dd-b38b-4b80-ad57-577629e648b8-erlang-cookie-secret\") pod \"d46577dd-b38b-4b80-ad57-577629e648b8\" (UID: \"d46577dd-b38b-4b80-ad57-577629e648b8\") "
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.316518 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d46577dd-b38b-4b80-ad57-577629e648b8-rabbitmq-plugins\") pod \"d46577dd-b38b-4b80-ad57-577629e648b8\" (UID: \"d46577dd-b38b-4b80-ad57-577629e648b8\") "
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.316604 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d46577dd-b38b-4b80-ad57-577629e648b8-rabbitmq-confd\") pod \"d46577dd-b38b-4b80-ad57-577629e648b8\" (UID: \"d46577dd-b38b-4b80-ad57-577629e648b8\") "
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.316646 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d46577dd-b38b-4b80-ad57-577629e648b8-server-conf\") pod \"d46577dd-b38b-4b80-ad57-577629e648b8\" (UID: \"d46577dd-b38b-4b80-ad57-577629e648b8\") "
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.316673 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d46577dd-b38b-4b80-ad57-577629e648b8-pod-info\") pod \"d46577dd-b38b-4b80-ad57-577629e648b8\" (UID: \"d46577dd-b38b-4b80-ad57-577629e648b8\") "
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.316699 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"d46577dd-b38b-4b80-ad57-577629e648b8\" (UID: \"d46577dd-b38b-4b80-ad57-577629e648b8\") "
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.316719 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d46577dd-b38b-4b80-ad57-577629e648b8-plugins-conf\") pod \"d46577dd-b38b-4b80-ad57-577629e648b8\" (UID: \"d46577dd-b38b-4b80-ad57-577629e648b8\") "
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.316736 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d46577dd-b38b-4b80-ad57-577629e648b8-rabbitmq-tls\") pod \"d46577dd-b38b-4b80-ad57-577629e648b8\" (UID: \"d46577dd-b38b-4b80-ad57-577629e648b8\") "
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.317056 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d46577dd-b38b-4b80-ad57-577629e648b8-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "d46577dd-b38b-4b80-ad57-577629e648b8" (UID: "d46577dd-b38b-4b80-ad57-577629e648b8"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.317305 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d46577dd-b38b-4b80-ad57-577629e648b8-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "d46577dd-b38b-4b80-ad57-577629e648b8" (UID: "d46577dd-b38b-4b80-ad57-577629e648b8"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.320287 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "persistence") pod "d46577dd-b38b-4b80-ad57-577629e648b8" (UID: "d46577dd-b38b-4b80-ad57-577629e648b8"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.320356 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d46577dd-b38b-4b80-ad57-577629e648b8-kube-api-access-54kct" (OuterVolumeSpecName: "kube-api-access-54kct") pod "d46577dd-b38b-4b80-ad57-577629e648b8" (UID: "d46577dd-b38b-4b80-ad57-577629e648b8"). InnerVolumeSpecName "kube-api-access-54kct". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.320744 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d46577dd-b38b-4b80-ad57-577629e648b8-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "d46577dd-b38b-4b80-ad57-577629e648b8" (UID: "d46577dd-b38b-4b80-ad57-577629e648b8"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.320813 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d46577dd-b38b-4b80-ad57-577629e648b8-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "d46577dd-b38b-4b80-ad57-577629e648b8" (UID: "d46577dd-b38b-4b80-ad57-577629e648b8"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.331313 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/d46577dd-b38b-4b80-ad57-577629e648b8-pod-info" (OuterVolumeSpecName: "pod-info") pod "d46577dd-b38b-4b80-ad57-577629e648b8" (UID: "d46577dd-b38b-4b80-ad57-577629e648b8"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.340324 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d46577dd-b38b-4b80-ad57-577629e648b8-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "d46577dd-b38b-4b80-ad57-577629e648b8" (UID: "d46577dd-b38b-4b80-ad57-577629e648b8"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.344986 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d46577dd-b38b-4b80-ad57-577629e648b8-config-data" (OuterVolumeSpecName: "config-data") pod "d46577dd-b38b-4b80-ad57-577629e648b8" (UID: "d46577dd-b38b-4b80-ad57-577629e648b8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.370716 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d46577dd-b38b-4b80-ad57-577629e648b8-server-conf" (OuterVolumeSpecName: "server-conf") pod "d46577dd-b38b-4b80-ad57-577629e648b8" (UID: "d46577dd-b38b-4b80-ad57-577629e648b8"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.400867 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d46577dd-b38b-4b80-ad57-577629e648b8-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "d46577dd-b38b-4b80-ad57-577629e648b8" (UID: "d46577dd-b38b-4b80-ad57-577629e648b8"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.418966 5025 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d46577dd-b38b-4b80-ad57-577629e648b8-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.419007 5025 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d46577dd-b38b-4b80-ad57-577629e648b8-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.419024 5025 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d46577dd-b38b-4b80-ad57-577629e648b8-server-conf\") on node \"crc\" DevicePath \"\""
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.419034 5025 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d46577dd-b38b-4b80-ad57-577629e648b8-pod-info\") on node \"crc\" DevicePath \"\""
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.419075 5025 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" "
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.419088 5025 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d46577dd-b38b-4b80-ad57-577629e648b8-plugins-conf\") on node \"crc\" DevicePath \"\""
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.419099 5025 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d46577dd-b38b-4b80-ad57-577629e648b8-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.419111 5025 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d46577dd-b38b-4b80-ad57-577629e648b8-config-data\") on node \"crc\" DevicePath \"\""
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.419125 5025 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d46577dd-b38b-4b80-ad57-577629e648b8-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.419138 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54kct\" (UniqueName: \"kubernetes.io/projected/d46577dd-b38b-4b80-ad57-577629e648b8-kube-api-access-54kct\") on node \"crc\" DevicePath \"\""
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.419150 5025 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d46577dd-b38b-4b80-ad57-577629e648b8-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.433787 5025 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc"
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.530497 5025 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\""
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.975763 5025 generic.go:334] "Generic (PLEG): container finished" podID="d46577dd-b38b-4b80-ad57-577629e648b8" containerID="73c121fe654fa077d5329867a73a7be6e5bdaa3741b8e1893d602e776c660431" exitCode=0
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.975836 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d46577dd-b38b-4b80-ad57-577629e648b8","Type":"ContainerDied","Data":"73c121fe654fa077d5329867a73a7be6e5bdaa3741b8e1893d602e776c660431"}
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.976290 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d46577dd-b38b-4b80-ad57-577629e648b8","Type":"ContainerDied","Data":"b2b0b0fc44802fdc90d87c910fb92166510eeeddfd8898e51afe656526841124"}
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.975928 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.976342 5025 scope.go:117] "RemoveContainer" containerID="73c121fe654fa077d5329867a73a7be6e5bdaa3741b8e1893d602e776c660431"
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.982455 5025 generic.go:334] "Generic (PLEG): container finished" podID="fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0" containerID="d0cadcb14b122e4544f81e647b5313ffc159d240c42a8c28574e6aa18ad57dbf" exitCode=0
Oct 07 08:38:22 crc kubenswrapper[5025]: I1007 08:38:22.982513 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5899768569-tz2vj" event={"ID":"fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0","Type":"ContainerDied","Data":"d0cadcb14b122e4544f81e647b5313ffc159d240c42a8c28574e6aa18ad57dbf"}
Oct 07 08:38:23 crc kubenswrapper[5025]: I1007 08:38:23.036121 5025 scope.go:117] "RemoveContainer" containerID="17e3123b66e2436655a1e556aca3840177699357ec8d9644216f4e1e376db84e"
Oct 07 08:38:23 crc kubenswrapper[5025]: I1007 08:38:23.036812 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Oct 07 08:38:23 crc kubenswrapper[5025]: I1007 08:38:23.048475 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Oct 07 08:38:23 crc kubenswrapper[5025]: I1007 08:38:23.107920 5025 scope.go:117] "RemoveContainer" containerID="73c121fe654fa077d5329867a73a7be6e5bdaa3741b8e1893d602e776c660431"
Oct 07 08:38:23 crc kubenswrapper[5025]: E1007 08:38:23.108456 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73c121fe654fa077d5329867a73a7be6e5bdaa3741b8e1893d602e776c660431\": container with ID starting with 73c121fe654fa077d5329867a73a7be6e5bdaa3741b8e1893d602e776c660431 not found: ID does not exist" containerID="73c121fe654fa077d5329867a73a7be6e5bdaa3741b8e1893d602e776c660431"
Oct 07 08:38:23 crc kubenswrapper[5025]: I1007 08:38:23.108496 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73c121fe654fa077d5329867a73a7be6e5bdaa3741b8e1893d602e776c660431"} err="failed to get container status \"73c121fe654fa077d5329867a73a7be6e5bdaa3741b8e1893d602e776c660431\": rpc error: code = NotFound desc = could not find container \"73c121fe654fa077d5329867a73a7be6e5bdaa3741b8e1893d602e776c660431\": container with ID starting with 73c121fe654fa077d5329867a73a7be6e5bdaa3741b8e1893d602e776c660431 not found: ID does not exist"
Oct 07 08:38:23 crc kubenswrapper[5025]: I1007 08:38:23.108523 5025 scope.go:117] "RemoveContainer" containerID="17e3123b66e2436655a1e556aca3840177699357ec8d9644216f4e1e376db84e"
Oct 07 08:38:23 crc kubenswrapper[5025]: E1007 08:38:23.108910 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17e3123b66e2436655a1e556aca3840177699357ec8d9644216f4e1e376db84e\": container with ID starting with 17e3123b66e2436655a1e556aca3840177699357ec8d9644216f4e1e376db84e not found: ID does not exist" containerID="17e3123b66e2436655a1e556aca3840177699357ec8d9644216f4e1e376db84e"
Oct 07 08:38:23 crc kubenswrapper[5025]: I1007 08:38:23.108943 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17e3123b66e2436655a1e556aca3840177699357ec8d9644216f4e1e376db84e"} err="failed to get container status \"17e3123b66e2436655a1e556aca3840177699357ec8d9644216f4e1e376db84e\": rpc error: code = NotFound desc = could not find container \"17e3123b66e2436655a1e556aca3840177699357ec8d9644216f4e1e376db84e\": container with ID starting with 17e3123b66e2436655a1e556aca3840177699357ec8d9644216f4e1e376db84e not found: ID does not exist"
Oct 07 08:38:23 crc kubenswrapper[5025]: I1007 08:38:23.323591 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5899768569-tz2vj"
Oct 07 08:38:23 crc kubenswrapper[5025]: I1007 08:38:23.441184 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0-internal-tls-certs\") pod \"fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0\" (UID: \"fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0\") "
Oct 07 08:38:23 crc kubenswrapper[5025]: I1007 08:38:23.441264 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0-combined-ca-bundle\") pod \"fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0\" (UID: \"fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0\") "
Oct 07 08:38:23 crc kubenswrapper[5025]: I1007 08:38:23.441300 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0-ovndb-tls-certs\") pod \"fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0\" (UID: \"fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0\") "
Oct 07 08:38:23 crc kubenswrapper[5025]: I1007 08:38:23.441333 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0-public-tls-certs\") pod
\"fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0\" (UID: \"fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0\") " Oct 07 08:38:23 crc kubenswrapper[5025]: I1007 08:38:23.441416 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0-config\") pod \"fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0\" (UID: \"fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0\") " Oct 07 08:38:23 crc kubenswrapper[5025]: I1007 08:38:23.441460 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skk72\" (UniqueName: \"kubernetes.io/projected/fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0-kube-api-access-skk72\") pod \"fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0\" (UID: \"fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0\") " Oct 07 08:38:23 crc kubenswrapper[5025]: I1007 08:38:23.441583 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0-httpd-config\") pod \"fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0\" (UID: \"fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0\") " Oct 07 08:38:23 crc kubenswrapper[5025]: I1007 08:38:23.446578 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0" (UID: "fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:23 crc kubenswrapper[5025]: I1007 08:38:23.450747 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0-kube-api-access-skk72" (OuterVolumeSpecName: "kube-api-access-skk72") pod "fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0" (UID: "fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0"). InnerVolumeSpecName "kube-api-access-skk72". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:38:23 crc kubenswrapper[5025]: I1007 08:38:23.479307 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0-config" (OuterVolumeSpecName: "config") pod "fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0" (UID: "fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:23 crc kubenswrapper[5025]: I1007 08:38:23.479713 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0" (UID: "fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:23 crc kubenswrapper[5025]: I1007 08:38:23.481789 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0" (UID: "fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:23 crc kubenswrapper[5025]: I1007 08:38:23.497904 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0" (UID: "fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:23 crc kubenswrapper[5025]: I1007 08:38:23.502374 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0" (UID: "fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 08:38:23 crc kubenswrapper[5025]: I1007 08:38:23.543602 5025 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:23 crc kubenswrapper[5025]: I1007 08:38:23.543666 5025 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:23 crc kubenswrapper[5025]: I1007 08:38:23.543676 5025 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:23 crc kubenswrapper[5025]: I1007 08:38:23.543684 5025 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:23 crc kubenswrapper[5025]: I1007 08:38:23.543696 5025 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0-config\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:23 crc kubenswrapper[5025]: I1007 08:38:23.543707 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skk72\" (UniqueName: 
\"kubernetes.io/projected/fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0-kube-api-access-skk72\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:23 crc kubenswrapper[5025]: I1007 08:38:23.543716 5025 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:23 crc kubenswrapper[5025]: I1007 08:38:23.930207 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d46577dd-b38b-4b80-ad57-577629e648b8" path="/var/lib/kubelet/pods/d46577dd-b38b-4b80-ad57-577629e648b8/volumes" Oct 07 08:38:23 crc kubenswrapper[5025]: I1007 08:38:23.930965 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5a4ae5a-0b64-481f-a54d-2263de0eae8e" path="/var/lib/kubelet/pods/e5a4ae5a-0b64-481f-a54d-2263de0eae8e/volumes" Oct 07 08:38:23 crc kubenswrapper[5025]: I1007 08:38:23.997156 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5899768569-tz2vj" event={"ID":"fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0","Type":"ContainerDied","Data":"4e3ca449f35b47e8fffe908f1a523367a196377df1605e518e8e83e7428304e3"} Oct 07 08:38:23 crc kubenswrapper[5025]: I1007 08:38:23.997202 5025 scope.go:117] "RemoveContainer" containerID="76af13f38bf57b12ea10ac6d1ec8e591e15b727a96e03aeaffeda063e26481de" Oct 07 08:38:23 crc kubenswrapper[5025]: I1007 08:38:23.997291 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5899768569-tz2vj" Oct 07 08:38:24 crc kubenswrapper[5025]: I1007 08:38:24.019157 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5899768569-tz2vj"] Oct 07 08:38:24 crc kubenswrapper[5025]: I1007 08:38:24.024771 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5899768569-tz2vj"] Oct 07 08:38:24 crc kubenswrapper[5025]: I1007 08:38:24.025491 5025 scope.go:117] "RemoveContainer" containerID="d0cadcb14b122e4544f81e647b5313ffc159d240c42a8c28574e6aa18ad57dbf" Oct 07 08:38:24 crc kubenswrapper[5025]: E1007 08:38:24.432553 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b49c2c24aaa3c06dfaa088d160cbeffb506d46373bed58aa7224f155e0405e49 is running failed: container process not found" containerID="b49c2c24aaa3c06dfaa088d160cbeffb506d46373bed58aa7224f155e0405e49" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 07 08:38:24 crc kubenswrapper[5025]: E1007 08:38:24.433359 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b49c2c24aaa3c06dfaa088d160cbeffb506d46373bed58aa7224f155e0405e49 is running failed: container process not found" containerID="b49c2c24aaa3c06dfaa088d160cbeffb506d46373bed58aa7224f155e0405e49" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 07 08:38:24 crc kubenswrapper[5025]: E1007 08:38:24.434488 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b49c2c24aaa3c06dfaa088d160cbeffb506d46373bed58aa7224f155e0405e49 is running failed: container process not found" containerID="b49c2c24aaa3c06dfaa088d160cbeffb506d46373bed58aa7224f155e0405e49" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 07 
08:38:24 crc kubenswrapper[5025]: E1007 08:38:24.434559 5025 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b49c2c24aaa3c06dfaa088d160cbeffb506d46373bed58aa7224f155e0405e49 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-lms8w" podUID="76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b" containerName="ovsdb-server" Oct 07 08:38:24 crc kubenswrapper[5025]: E1007 08:38:24.436319 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0929ddc4d9ba4610db985a4406a4161fc27bb318c68977125005e9b777d1e0ac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 07 08:38:24 crc kubenswrapper[5025]: E1007 08:38:24.438275 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0929ddc4d9ba4610db985a4406a4161fc27bb318c68977125005e9b777d1e0ac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 07 08:38:24 crc kubenswrapper[5025]: E1007 08:38:24.440161 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0929ddc4d9ba4610db985a4406a4161fc27bb318c68977125005e9b777d1e0ac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 07 08:38:24 crc kubenswrapper[5025]: E1007 08:38:24.440217 5025 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-lms8w" podUID="76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b" 
containerName="ovs-vswitchd" Oct 07 08:38:25 crc kubenswrapper[5025]: I1007 08:38:25.924892 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0" path="/var/lib/kubelet/pods/fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0/volumes" Oct 07 08:38:29 crc kubenswrapper[5025]: E1007 08:38:29.432486 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b49c2c24aaa3c06dfaa088d160cbeffb506d46373bed58aa7224f155e0405e49 is running failed: container process not found" containerID="b49c2c24aaa3c06dfaa088d160cbeffb506d46373bed58aa7224f155e0405e49" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 07 08:38:29 crc kubenswrapper[5025]: E1007 08:38:29.433285 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b49c2c24aaa3c06dfaa088d160cbeffb506d46373bed58aa7224f155e0405e49 is running failed: container process not found" containerID="b49c2c24aaa3c06dfaa088d160cbeffb506d46373bed58aa7224f155e0405e49" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 07 08:38:29 crc kubenswrapper[5025]: E1007 08:38:29.433574 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b49c2c24aaa3c06dfaa088d160cbeffb506d46373bed58aa7224f155e0405e49 is running failed: container process not found" containerID="b49c2c24aaa3c06dfaa088d160cbeffb506d46373bed58aa7224f155e0405e49" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 07 08:38:29 crc kubenswrapper[5025]: E1007 08:38:29.433634 5025 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b49c2c24aaa3c06dfaa088d160cbeffb506d46373bed58aa7224f155e0405e49 is running failed: container 
process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-lms8w" podUID="76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b" containerName="ovsdb-server" Oct 07 08:38:29 crc kubenswrapper[5025]: E1007 08:38:29.433753 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0929ddc4d9ba4610db985a4406a4161fc27bb318c68977125005e9b777d1e0ac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 07 08:38:29 crc kubenswrapper[5025]: E1007 08:38:29.447967 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0929ddc4d9ba4610db985a4406a4161fc27bb318c68977125005e9b777d1e0ac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 07 08:38:29 crc kubenswrapper[5025]: E1007 08:38:29.457416 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0929ddc4d9ba4610db985a4406a4161fc27bb318c68977125005e9b777d1e0ac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 07 08:38:29 crc kubenswrapper[5025]: E1007 08:38:29.457484 5025 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-lms8w" podUID="76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b" containerName="ovs-vswitchd" Oct 07 08:38:34 crc kubenswrapper[5025]: E1007 08:38:34.433311 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
b49c2c24aaa3c06dfaa088d160cbeffb506d46373bed58aa7224f155e0405e49 is running failed: container process not found" containerID="b49c2c24aaa3c06dfaa088d160cbeffb506d46373bed58aa7224f155e0405e49" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 07 08:38:34 crc kubenswrapper[5025]: E1007 08:38:34.435157 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b49c2c24aaa3c06dfaa088d160cbeffb506d46373bed58aa7224f155e0405e49 is running failed: container process not found" containerID="b49c2c24aaa3c06dfaa088d160cbeffb506d46373bed58aa7224f155e0405e49" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 07 08:38:34 crc kubenswrapper[5025]: E1007 08:38:34.435183 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0929ddc4d9ba4610db985a4406a4161fc27bb318c68977125005e9b777d1e0ac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 07 08:38:34 crc kubenswrapper[5025]: E1007 08:38:34.435514 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b49c2c24aaa3c06dfaa088d160cbeffb506d46373bed58aa7224f155e0405e49 is running failed: container process not found" containerID="b49c2c24aaa3c06dfaa088d160cbeffb506d46373bed58aa7224f155e0405e49" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 07 08:38:34 crc kubenswrapper[5025]: E1007 08:38:34.435564 5025 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b49c2c24aaa3c06dfaa088d160cbeffb506d46373bed58aa7224f155e0405e49 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-lms8w" 
podUID="76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b" containerName="ovsdb-server" Oct 07 08:38:34 crc kubenswrapper[5025]: E1007 08:38:34.438732 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0929ddc4d9ba4610db985a4406a4161fc27bb318c68977125005e9b777d1e0ac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 07 08:38:34 crc kubenswrapper[5025]: E1007 08:38:34.440503 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0929ddc4d9ba4610db985a4406a4161fc27bb318c68977125005e9b777d1e0ac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 07 08:38:34 crc kubenswrapper[5025]: E1007 08:38:34.440585 5025 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-lms8w" podUID="76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b" containerName="ovs-vswitchd" Oct 07 08:38:39 crc kubenswrapper[5025]: E1007 08:38:39.432795 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b49c2c24aaa3c06dfaa088d160cbeffb506d46373bed58aa7224f155e0405e49 is running failed: container process not found" containerID="b49c2c24aaa3c06dfaa088d160cbeffb506d46373bed58aa7224f155e0405e49" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 07 08:38:39 crc kubenswrapper[5025]: E1007 08:38:39.434144 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="0929ddc4d9ba4610db985a4406a4161fc27bb318c68977125005e9b777d1e0ac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 07 08:38:39 crc kubenswrapper[5025]: E1007 08:38:39.435027 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b49c2c24aaa3c06dfaa088d160cbeffb506d46373bed58aa7224f155e0405e49 is running failed: container process not found" containerID="b49c2c24aaa3c06dfaa088d160cbeffb506d46373bed58aa7224f155e0405e49" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 07 08:38:39 crc kubenswrapper[5025]: E1007 08:38:39.435332 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0929ddc4d9ba4610db985a4406a4161fc27bb318c68977125005e9b777d1e0ac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 07 08:38:39 crc kubenswrapper[5025]: E1007 08:38:39.435478 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b49c2c24aaa3c06dfaa088d160cbeffb506d46373bed58aa7224f155e0405e49 is running failed: container process not found" containerID="b49c2c24aaa3c06dfaa088d160cbeffb506d46373bed58aa7224f155e0405e49" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 07 08:38:39 crc kubenswrapper[5025]: E1007 08:38:39.435617 5025 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b49c2c24aaa3c06dfaa088d160cbeffb506d46373bed58aa7224f155e0405e49 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-lms8w" podUID="76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b" containerName="ovsdb-server" Oct 07 08:38:39 crc kubenswrapper[5025]: E1007 
08:38:39.436512 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0929ddc4d9ba4610db985a4406a4161fc27bb318c68977125005e9b777d1e0ac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 07 08:38:39 crc kubenswrapper[5025]: E1007 08:38:39.436563 5025 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-lms8w" podUID="76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b" containerName="ovs-vswitchd" Oct 07 08:38:44 crc kubenswrapper[5025]: I1007 08:38:44.185689 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-lms8w_76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b/ovs-vswitchd/0.log" Oct 07 08:38:44 crc kubenswrapper[5025]: I1007 08:38:44.187003 5025 generic.go:334] "Generic (PLEG): container finished" podID="76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b" containerID="0929ddc4d9ba4610db985a4406a4161fc27bb318c68977125005e9b777d1e0ac" exitCode=137 Oct 07 08:38:44 crc kubenswrapper[5025]: I1007 08:38:44.187052 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-lms8w" event={"ID":"76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b","Type":"ContainerDied","Data":"0929ddc4d9ba4610db985a4406a4161fc27bb318c68977125005e9b777d1e0ac"} Oct 07 08:38:44 crc kubenswrapper[5025]: E1007 08:38:44.432588 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b49c2c24aaa3c06dfaa088d160cbeffb506d46373bed58aa7224f155e0405e49 is running failed: container process not found" containerID="b49c2c24aaa3c06dfaa088d160cbeffb506d46373bed58aa7224f155e0405e49" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 07 
08:38:44 crc kubenswrapper[5025]: E1007 08:38:44.432591 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0929ddc4d9ba4610db985a4406a4161fc27bb318c68977125005e9b777d1e0ac is running failed: container process not found" containerID="0929ddc4d9ba4610db985a4406a4161fc27bb318c68977125005e9b777d1e0ac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 07 08:38:44 crc kubenswrapper[5025]: E1007 08:38:44.432967 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0929ddc4d9ba4610db985a4406a4161fc27bb318c68977125005e9b777d1e0ac is running failed: container process not found" containerID="0929ddc4d9ba4610db985a4406a4161fc27bb318c68977125005e9b777d1e0ac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 07 08:38:44 crc kubenswrapper[5025]: E1007 08:38:44.432988 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b49c2c24aaa3c06dfaa088d160cbeffb506d46373bed58aa7224f155e0405e49 is running failed: container process not found" containerID="b49c2c24aaa3c06dfaa088d160cbeffb506d46373bed58aa7224f155e0405e49" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 07 08:38:44 crc kubenswrapper[5025]: E1007 08:38:44.433412 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b49c2c24aaa3c06dfaa088d160cbeffb506d46373bed58aa7224f155e0405e49 is running failed: container process not found" containerID="b49c2c24aaa3c06dfaa088d160cbeffb506d46373bed58aa7224f155e0405e49" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 07 08:38:44 crc kubenswrapper[5025]: E1007 08:38:44.433451 5025 prober.go:104] "Probe errored" err="rpc 
error: code = NotFound desc = container is not created or running: checking if PID of b49c2c24aaa3c06dfaa088d160cbeffb506d46373bed58aa7224f155e0405e49 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-lms8w" podUID="76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b" containerName="ovsdb-server" Oct 07 08:38:44 crc kubenswrapper[5025]: E1007 08:38:44.433490 5025 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0929ddc4d9ba4610db985a4406a4161fc27bb318c68977125005e9b777d1e0ac is running failed: container process not found" containerID="0929ddc4d9ba4610db985a4406a4161fc27bb318c68977125005e9b777d1e0ac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 07 08:38:44 crc kubenswrapper[5025]: E1007 08:38:44.433519 5025 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0929ddc4d9ba4610db985a4406a4161fc27bb318c68977125005e9b777d1e0ac is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-lms8w" podUID="76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b" containerName="ovs-vswitchd" Oct 07 08:38:44 crc kubenswrapper[5025]: I1007 08:38:44.533305 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-lms8w_76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b/ovs-vswitchd/0.log" Oct 07 08:38:44 crc kubenswrapper[5025]: I1007 08:38:44.534150 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-lms8w" Oct 07 08:38:44 crc kubenswrapper[5025]: I1007 08:38:44.648903 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b-var-log\") pod \"76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b\" (UID: \"76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b\") " Oct 07 08:38:44 crc kubenswrapper[5025]: I1007 08:38:44.648983 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b-var-log" (OuterVolumeSpecName: "var-log") pod "76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b" (UID: "76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 08:38:44 crc kubenswrapper[5025]: I1007 08:38:44.649490 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b-var-run\") pod \"76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b\" (UID: \"76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b\") " Oct 07 08:38:44 crc kubenswrapper[5025]: I1007 08:38:44.649502 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b-var-run" (OuterVolumeSpecName: "var-run") pod "76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b" (UID: "76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 08:38:44 crc kubenswrapper[5025]: I1007 08:38:44.649647 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b-etc-ovs\") pod \"76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b\" (UID: \"76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b\") " Oct 07 08:38:44 crc kubenswrapper[5025]: I1007 08:38:44.649801 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b-var-lib\") pod \"76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b\" (UID: \"76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b\") " Oct 07 08:38:44 crc kubenswrapper[5025]: I1007 08:38:44.649853 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b-scripts\") pod \"76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b\" (UID: \"76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b\") " Oct 07 08:38:44 crc kubenswrapper[5025]: I1007 08:38:44.649870 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b-var-lib" (OuterVolumeSpecName: "var-lib") pod "76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b" (UID: "76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 08:38:44 crc kubenswrapper[5025]: I1007 08:38:44.649819 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b" (UID: "76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b"). InnerVolumeSpecName "etc-ovs". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 08:38:44 crc kubenswrapper[5025]: I1007 08:38:44.650040 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g85bb\" (UniqueName: \"kubernetes.io/projected/76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b-kube-api-access-g85bb\") pod \"76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b\" (UID: \"76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b\") " Oct 07 08:38:44 crc kubenswrapper[5025]: I1007 08:38:44.651014 5025 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b-var-lib\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:44 crc kubenswrapper[5025]: I1007 08:38:44.651106 5025 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b-var-log\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:44 crc kubenswrapper[5025]: I1007 08:38:44.651182 5025 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b-var-run\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:44 crc kubenswrapper[5025]: I1007 08:38:44.651259 5025 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b-etc-ovs\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:44 crc kubenswrapper[5025]: I1007 08:38:44.651818 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b-scripts" (OuterVolumeSpecName: "scripts") pod "76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b" (UID: "76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 08:38:44 crc kubenswrapper[5025]: I1007 08:38:44.656529 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b-kube-api-access-g85bb" (OuterVolumeSpecName: "kube-api-access-g85bb") pod "76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b" (UID: "76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b"). InnerVolumeSpecName "kube-api-access-g85bb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:38:44 crc kubenswrapper[5025]: I1007 08:38:44.752118 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g85bb\" (UniqueName: \"kubernetes.io/projected/76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b-kube-api-access-g85bb\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:44 crc kubenswrapper[5025]: I1007 08:38:44.752155 5025 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:45 crc kubenswrapper[5025]: I1007 08:38:45.195999 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-lms8w_76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b/ovs-vswitchd/0.log" Oct 07 08:38:45 crc kubenswrapper[5025]: I1007 08:38:45.197066 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-lms8w" event={"ID":"76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b","Type":"ContainerDied","Data":"38b34840d30687489771c62719a5b494f8356d6c64c64ddd50a27149c0dd60d7"} Oct 07 08:38:45 crc kubenswrapper[5025]: I1007 08:38:45.197105 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-lms8w" Oct 07 08:38:45 crc kubenswrapper[5025]: I1007 08:38:45.197130 5025 scope.go:117] "RemoveContainer" containerID="0929ddc4d9ba4610db985a4406a4161fc27bb318c68977125005e9b777d1e0ac" Oct 07 08:38:45 crc kubenswrapper[5025]: I1007 08:38:45.210138 5025 generic.go:334] "Generic (PLEG): container finished" podID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerID="2af861f7f90c2cae8a8ce332baaee8d44d3ff01b9e44974ad1dfe79434f2684b" exitCode=137 Oct 07 08:38:45 crc kubenswrapper[5025]: I1007 08:38:45.210188 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9d7aa421-d9db-4ba6-882a-432d2e8b840f","Type":"ContainerDied","Data":"2af861f7f90c2cae8a8ce332baaee8d44d3ff01b9e44974ad1dfe79434f2684b"} Oct 07 08:38:45 crc kubenswrapper[5025]: I1007 08:38:45.221029 5025 scope.go:117] "RemoveContainer" containerID="b49c2c24aaa3c06dfaa088d160cbeffb506d46373bed58aa7224f155e0405e49" Oct 07 08:38:45 crc kubenswrapper[5025]: I1007 08:38:45.233700 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-lms8w"] Oct 07 08:38:45 crc kubenswrapper[5025]: I1007 08:38:45.238101 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-lms8w"] Oct 07 08:38:45 crc kubenswrapper[5025]: I1007 08:38:45.294806 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 07 08:38:45 crc kubenswrapper[5025]: I1007 08:38:45.299619 5025 scope.go:117] "RemoveContainer" containerID="25b1b4a37d5c5d5383c1c84c83a961dfa99f38923300c6f52f2fb768bcc4765f" Oct 07 08:38:45 crc kubenswrapper[5025]: I1007 08:38:45.358575 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9d7aa421-d9db-4ba6-882a-432d2e8b840f-etc-swift\") pod \"9d7aa421-d9db-4ba6-882a-432d2e8b840f\" (UID: \"9d7aa421-d9db-4ba6-882a-432d2e8b840f\") " Oct 07 08:38:45 crc kubenswrapper[5025]: I1007 08:38:45.358643 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"9d7aa421-d9db-4ba6-882a-432d2e8b840f\" (UID: \"9d7aa421-d9db-4ba6-882a-432d2e8b840f\") " Oct 07 08:38:45 crc kubenswrapper[5025]: I1007 08:38:45.358788 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/9d7aa421-d9db-4ba6-882a-432d2e8b840f-cache\") pod \"9d7aa421-d9db-4ba6-882a-432d2e8b840f\" (UID: \"9d7aa421-d9db-4ba6-882a-432d2e8b840f\") " Oct 07 08:38:45 crc kubenswrapper[5025]: I1007 08:38:45.358807 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26x8x\" (UniqueName: \"kubernetes.io/projected/9d7aa421-d9db-4ba6-882a-432d2e8b840f-kube-api-access-26x8x\") pod \"9d7aa421-d9db-4ba6-882a-432d2e8b840f\" (UID: \"9d7aa421-d9db-4ba6-882a-432d2e8b840f\") " Oct 07 08:38:45 crc kubenswrapper[5025]: I1007 08:38:45.358833 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/9d7aa421-d9db-4ba6-882a-432d2e8b840f-lock\") pod \"9d7aa421-d9db-4ba6-882a-432d2e8b840f\" (UID: \"9d7aa421-d9db-4ba6-882a-432d2e8b840f\") " Oct 07 08:38:45 crc kubenswrapper[5025]: I1007 08:38:45.359197 
5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d7aa421-d9db-4ba6-882a-432d2e8b840f-cache" (OuterVolumeSpecName: "cache") pod "9d7aa421-d9db-4ba6-882a-432d2e8b840f" (UID: "9d7aa421-d9db-4ba6-882a-432d2e8b840f"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:38:45 crc kubenswrapper[5025]: I1007 08:38:45.359731 5025 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/9d7aa421-d9db-4ba6-882a-432d2e8b840f-cache\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:45 crc kubenswrapper[5025]: I1007 08:38:45.359817 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d7aa421-d9db-4ba6-882a-432d2e8b840f-lock" (OuterVolumeSpecName: "lock") pod "9d7aa421-d9db-4ba6-882a-432d2e8b840f" (UID: "9d7aa421-d9db-4ba6-882a-432d2e8b840f"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:38:45 crc kubenswrapper[5025]: I1007 08:38:45.362092 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "swift") pod "9d7aa421-d9db-4ba6-882a-432d2e8b840f" (UID: "9d7aa421-d9db-4ba6-882a-432d2e8b840f"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 08:38:45 crc kubenswrapper[5025]: I1007 08:38:45.362737 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d7aa421-d9db-4ba6-882a-432d2e8b840f-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "9d7aa421-d9db-4ba6-882a-432d2e8b840f" (UID: "9d7aa421-d9db-4ba6-882a-432d2e8b840f"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:38:45 crc kubenswrapper[5025]: I1007 08:38:45.362921 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d7aa421-d9db-4ba6-882a-432d2e8b840f-kube-api-access-26x8x" (OuterVolumeSpecName: "kube-api-access-26x8x") pod "9d7aa421-d9db-4ba6-882a-432d2e8b840f" (UID: "9d7aa421-d9db-4ba6-882a-432d2e8b840f"). InnerVolumeSpecName "kube-api-access-26x8x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:38:45 crc kubenswrapper[5025]: I1007 08:38:45.460795 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26x8x\" (UniqueName: \"kubernetes.io/projected/9d7aa421-d9db-4ba6-882a-432d2e8b840f-kube-api-access-26x8x\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:45 crc kubenswrapper[5025]: I1007 08:38:45.460826 5025 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/9d7aa421-d9db-4ba6-882a-432d2e8b840f-lock\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:45 crc kubenswrapper[5025]: I1007 08:38:45.460836 5025 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9d7aa421-d9db-4ba6-882a-432d2e8b840f-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:45 crc kubenswrapper[5025]: I1007 08:38:45.460866 5025 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Oct 07 08:38:45 crc kubenswrapper[5025]: I1007 08:38:45.475017 5025 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Oct 07 08:38:45 crc kubenswrapper[5025]: I1007 08:38:45.562023 5025 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Oct 07 08:38:45 crc kubenswrapper[5025]: I1007 08:38:45.924926 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b" path="/var/lib/kubelet/pods/76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b/volumes" Oct 07 08:38:46 crc kubenswrapper[5025]: E1007 08:38:46.077780 5025 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d7aa421_d9db_4ba6_882a_432d2e8b840f.slice/crio-06959559e1f7691a3fdc68c518f2fed5c35975781aa36606814617ec5e2bc943\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d7aa421_d9db_4ba6_882a_432d2e8b840f.slice\": RecentStats: unable to find data in memory cache]" Oct 07 08:38:46 crc kubenswrapper[5025]: I1007 08:38:46.226949 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9d7aa421-d9db-4ba6-882a-432d2e8b840f","Type":"ContainerDied","Data":"06959559e1f7691a3fdc68c518f2fed5c35975781aa36606814617ec5e2bc943"} Oct 07 08:38:46 crc kubenswrapper[5025]: I1007 08:38:46.227018 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 07 08:38:46 crc kubenswrapper[5025]: I1007 08:38:46.227271 5025 scope.go:117] "RemoveContainer" containerID="2af861f7f90c2cae8a8ce332baaee8d44d3ff01b9e44974ad1dfe79434f2684b" Oct 07 08:38:46 crc kubenswrapper[5025]: I1007 08:38:46.252262 5025 scope.go:117] "RemoveContainer" containerID="9d3a9df9333411c34fcd1b1b33c246c5dc54f969d8f7b94b07b7feea250b14fb" Oct 07 08:38:46 crc kubenswrapper[5025]: I1007 08:38:46.284743 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Oct 07 08:38:46 crc kubenswrapper[5025]: I1007 08:38:46.289599 5025 scope.go:117] "RemoveContainer" containerID="a89cc0bbd617b24a619834b36ce895c374cd0e5df0288f882c9728d9252cd609" Oct 07 08:38:46 crc kubenswrapper[5025]: I1007 08:38:46.292596 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Oct 07 08:38:46 crc kubenswrapper[5025]: I1007 08:38:46.312387 5025 scope.go:117] "RemoveContainer" containerID="17e6f1244bd23756c57aa4df7196524e152e5a068f7ec5464a9d47135f222dc7" Oct 07 08:38:46 crc kubenswrapper[5025]: I1007 08:38:46.361244 5025 scope.go:117] "RemoveContainer" containerID="99ce4f2eebf24b560f31691c8f1a2da1a5df6b8c8ed3d6626051ed204aea839a" Oct 07 08:38:46 crc kubenswrapper[5025]: I1007 08:38:46.395658 5025 scope.go:117] "RemoveContainer" containerID="aff21227028210b4f08d1a87f2fe84e1d162230abc5f64cc544641bfc6120b15" Oct 07 08:38:46 crc kubenswrapper[5025]: I1007 08:38:46.424980 5025 scope.go:117] "RemoveContainer" containerID="5f05c90b85b312fbeb24954f3c57716c7cd5932e4f7ceedff1daaffd861ae631" Oct 07 08:38:46 crc kubenswrapper[5025]: I1007 08:38:46.453993 5025 scope.go:117] "RemoveContainer" containerID="76ba63d82ad08b0153c89278b305db830f608f9bed7c534bb409b72c53c3a06c" Oct 07 08:38:46 crc kubenswrapper[5025]: I1007 08:38:46.477436 5025 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" 
cgroupName=["kubepods","besteffort","podd6889a88-68ef-4bf3-9e7e-78c6d84785ae"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podd6889a88-68ef-4bf3-9e7e-78c6d84785ae] : Timed out while waiting for systemd to remove kubepods-besteffort-podd6889a88_68ef_4bf3_9e7e_78c6d84785ae.slice" Oct 07 08:38:46 crc kubenswrapper[5025]: I1007 08:38:46.481874 5025 scope.go:117] "RemoveContainer" containerID="266ae7663ff2f3c8ca4fdece1db841dc278426b4fca4509d0d0233a384d325dd" Oct 07 08:38:46 crc kubenswrapper[5025]: I1007 08:38:46.501083 5025 scope.go:117] "RemoveContainer" containerID="49443cb71b40bc6abe49e9f54690d77bd06b9d3970bc17a852e8a4e30f9ba6b7" Oct 07 08:38:46 crc kubenswrapper[5025]: I1007 08:38:46.521843 5025 scope.go:117] "RemoveContainer" containerID="c93808ebcf734eeb951e224e6dd45872e982d63f46ec041bfc4d33193be2b705" Oct 07 08:38:46 crc kubenswrapper[5025]: I1007 08:38:46.539154 5025 scope.go:117] "RemoveContainer" containerID="15d39914545a4a069a9ea2c3664be41c34cbfd52c1e69df82087b4f1a1940cdd" Oct 07 08:38:46 crc kubenswrapper[5025]: I1007 08:38:46.555933 5025 scope.go:117] "RemoveContainer" containerID="8c0c209e0771ee6bccd63b38c1d1aaf2d92866662d731f3cc5eccf0667f3306a" Oct 07 08:38:46 crc kubenswrapper[5025]: I1007 08:38:46.573549 5025 scope.go:117] "RemoveContainer" containerID="9640e93cbab8bc8d914777065ae812b65e462076e84f96dabb2f52c7a65f973a" Oct 07 08:38:46 crc kubenswrapper[5025]: I1007 08:38:46.590940 5025 scope.go:117] "RemoveContainer" containerID="4fd99f99ec9c844da88a9f393a6696daa800bfa88f16e03ae533ceadefd9d877" Oct 07 08:38:47 crc kubenswrapper[5025]: I1007 08:38:47.930506 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" path="/var/lib/kubelet/pods/9d7aa421-d9db-4ba6-882a-432d2e8b840f/volumes" Oct 07 08:38:52 crc kubenswrapper[5025]: I1007 08:38:52.119923 5025 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" 
cgroupName=["kubepods","besteffort","poda4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort poda4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef] : Timed out while waiting for systemd to remove kubepods-besteffort-poda4d92a87_d0f4_40c4_a370_1bf0fb8fe5ef.slice" Oct 07 08:38:52 crc kubenswrapper[5025]: E1007 08:38:52.120376 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort poda4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef] : unable to destroy cgroup paths for cgroup [kubepods besteffort poda4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef] : Timed out while waiting for systemd to remove kubepods-besteffort-poda4d92a87_d0f4_40c4_a370_1bf0fb8fe5ef.slice" pod="openstack/keystone-6f574885d6-269v8" podUID="a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef" Oct 07 08:38:52 crc kubenswrapper[5025]: I1007 08:38:52.287416 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6f574885d6-269v8" Oct 07 08:38:52 crc kubenswrapper[5025]: I1007 08:38:52.322371 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-6f574885d6-269v8"] Oct 07 08:38:52 crc kubenswrapper[5025]: I1007 08:38:52.328749 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-6f574885d6-269v8"] Oct 07 08:38:53 crc kubenswrapper[5025]: I1007 08:38:53.929005 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef" path="/var/lib/kubelet/pods/a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef/volumes" Oct 07 08:39:55 crc kubenswrapper[5025]: I1007 08:39:55.934633 5025 patch_prober.go:28] interesting pod/machine-config-daemon-2dj2t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 08:39:55 crc kubenswrapper[5025]: I1007 
08:39:55.935191 5025 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 08:40:25 crc kubenswrapper[5025]: I1007 08:40:25.934798 5025 patch_prober.go:28] interesting pod/machine-config-daemon-2dj2t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 08:40:25 crc kubenswrapper[5025]: I1007 08:40:25.935967 5025 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 08:40:36 crc kubenswrapper[5025]: I1007 08:40:36.543065 5025 scope.go:117] "RemoveContainer" containerID="e24b883eadd9e8460cb557c6932fec97c79242558a35c3b28613b86a5ee5e691" Oct 07 08:40:36 crc kubenswrapper[5025]: I1007 08:40:36.569576 5025 scope.go:117] "RemoveContainer" containerID="abeeead2c75322aa86d4cafa6fe54a54de8867bb4c28dfa4b2ad7f6eba21afe7" Oct 07 08:40:36 crc kubenswrapper[5025]: I1007 08:40:36.615167 5025 scope.go:117] "RemoveContainer" containerID="8cc6b65c04574b0ccb31f07b887838e143249f7a2e3d197999fac731f4f5cdf3" Oct 07 08:40:36 crc kubenswrapper[5025]: I1007 08:40:36.656135 5025 scope.go:117] "RemoveContainer" containerID="35487aef47de0def75e3382a7a0b3d3512d8dc866a0cacf75aef7c41baabaca9" Oct 07 08:40:36 crc kubenswrapper[5025]: I1007 08:40:36.672698 5025 scope.go:117] "RemoveContainer" containerID="4ae8406675d0d507373d46e28498fc2f978edf6ad339490e4cd0940feb93a36a" Oct 07 
08:40:36 crc kubenswrapper[5025]: I1007 08:40:36.710194 5025 scope.go:117] "RemoveContainer" containerID="cf4deced3355c59cc056a03b7fc99482fa8455897aeebbd5372e6c7fe67ff136" Oct 07 08:40:36 crc kubenswrapper[5025]: I1007 08:40:36.744777 5025 scope.go:117] "RemoveContainer" containerID="fbf155f64e72192c229eb5829876b2b2a52e4defa95588a3abec0cd6525eb31d" Oct 07 08:40:36 crc kubenswrapper[5025]: I1007 08:40:36.771990 5025 scope.go:117] "RemoveContainer" containerID="bba8208e2d1a5187922ed405a4067ff412a5ef5ece681e295db2104ffdac07b4" Oct 07 08:40:36 crc kubenswrapper[5025]: I1007 08:40:36.787147 5025 scope.go:117] "RemoveContainer" containerID="ba6d5fae88b902587605c5a5955ad73f4e04cfb1fb6234c0f017dfcfe8efd280" Oct 07 08:40:36 crc kubenswrapper[5025]: I1007 08:40:36.804905 5025 scope.go:117] "RemoveContainer" containerID="503c88cbfff97597d46cee5bfce8f453317fb9225c7e0c1875ee907541a00991" Oct 07 08:40:36 crc kubenswrapper[5025]: I1007 08:40:36.824324 5025 scope.go:117] "RemoveContainer" containerID="f5f91a3a15b66c0647f9504a10bc37bb459966180b4d1e14f9fb21cb10d63d80" Oct 07 08:40:36 crc kubenswrapper[5025]: I1007 08:40:36.859321 5025 scope.go:117] "RemoveContainer" containerID="19747fc3a062a51335fa4145a9a98d5ced6ebfa63f05ed7f31be5e02d828f525" Oct 07 08:40:36 crc kubenswrapper[5025]: I1007 08:40:36.883351 5025 scope.go:117] "RemoveContainer" containerID="623ad4cfbe31a78123038509fd96a2cd137909f0a4c517859ab31753d48f5f79" Oct 07 08:40:36 crc kubenswrapper[5025]: I1007 08:40:36.909975 5025 scope.go:117] "RemoveContainer" containerID="7ab426fa5e4d1132168e5853d0fb94d09b71ae76e12935bbb94a56b4cfdc5f2b" Oct 07 08:40:36 crc kubenswrapper[5025]: I1007 08:40:36.946187 5025 scope.go:117] "RemoveContainer" containerID="6562b6087c3ad21549c75a1f9f5c20930c4a9243c932457cb82207595e4442e2" Oct 07 08:40:36 crc kubenswrapper[5025]: I1007 08:40:36.977377 5025 scope.go:117] "RemoveContainer" containerID="843d4136d3328eeac05dabc41a4cb8468846a3512ab6b8c6f9ddc40fb150090d" Oct 07 08:40:40 crc 
kubenswrapper[5025]: I1007 08:40:40.265355 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9phkk"] Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.266074 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28643c48-ee4c-4880-8901-9e3231c70b70" containerName="mariadb-account-delete" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.266089 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="28643c48-ee4c-4880-8901-9e3231c70b70" containerName="mariadb-account-delete" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.266112 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c87571c0-a1b9-4187-a60c-dd66d9fbb301" containerName="mariadb-account-delete" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.266119 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="c87571c0-a1b9-4187-a60c-dd66d9fbb301" containerName="mariadb-account-delete" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.266134 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="582efaa6-0e81-462a-813f-7ba1cf9c6fc2" containerName="galera" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.266142 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="582efaa6-0e81-462a-813f-7ba1cf9c6fc2" containerName="galera" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.266155 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="object-updater" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.266162 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="object-updater" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.266179 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="988f0f87-7a01-4aac-a874-8e885b2f83cb" containerName="glance-httpd" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 
08:40:40.266186 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="988f0f87-7a01-4aac-a874-8e885b2f83cb" containerName="glance-httpd" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.266198 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="account-auditor" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.266205 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="account-auditor" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.266218 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="object-replicator" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.266225 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="object-replicator" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.266236 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39a4256e-db59-4623-aefb-bad0c1412bf1" containerName="glance-httpd" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.266243 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="39a4256e-db59-4623-aefb-bad0c1412bf1" containerName="glance-httpd" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.266256 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="container-server" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.266264 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="container-server" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.266274 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="container-replicator" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 
08:40:40.266282 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="container-replicator" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.266295 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="007c50ab-22f3-4b11-a7e9-3890acbe7e03" containerName="mysql-bootstrap" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.266303 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="007c50ab-22f3-4b11-a7e9-3890acbe7e03" containerName="mysql-bootstrap" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.266315 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="041675fd-b2c1-4e9c-9b05-4a2aef6d329f" containerName="mariadb-account-delete" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.266323 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="041675fd-b2c1-4e9c-9b05-4a2aef6d329f" containerName="mariadb-account-delete" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.266332 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="account-replicator" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.266339 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="account-replicator" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.266352 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="object-expirer" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.266361 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="object-expirer" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.266372 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef" containerName="keystone-api" Oct 07 08:40:40 crc kubenswrapper[5025]: 
I1007 08:40:40.266378 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef" containerName="keystone-api" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.266390 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="007c50ab-22f3-4b11-a7e9-3890acbe7e03" containerName="galera" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.266396 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="007c50ab-22f3-4b11-a7e9-3890acbe7e03" containerName="galera" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.266411 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f66554ef-2742-4c2b-a6e4-58b8377f03fa" containerName="barbican-api" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.266418 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="f66554ef-2742-4c2b-a6e4-58b8377f03fa" containerName="barbican-api" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.266427 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78" containerName="cinder-api" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.266434 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78" containerName="cinder-api" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.266441 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b" containerName="ovsdb-server-init" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.266449 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b" containerName="ovsdb-server-init" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.266464 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="object-server" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.266470 5025 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="object-server" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.266484 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="account-reaper" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.266491 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="account-reaper" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.266503 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51f36c18-61cd-43d1-98a6-b569197c9382" containerName="barbican-keystone-listener" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.266510 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="51f36c18-61cd-43d1-98a6-b569197c9382" containerName="barbican-keystone-listener" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.266523 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a892055-924e-4e4f-a625-b15b8a39c4bf" containerName="nova-metadata-log" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.266530 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a892055-924e-4e4f-a625-b15b8a39c4bf" containerName="nova-metadata-log" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.266563 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="account-server" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.266570 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="account-server" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.266581 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="582efaa6-0e81-462a-813f-7ba1cf9c6fc2" containerName="mysql-bootstrap" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.266589 5025 
state_mem.go:107] "Deleted CPUSet assignment" podUID="582efaa6-0e81-462a-813f-7ba1cf9c6fc2" containerName="mysql-bootstrap" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.266597 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="rsync" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.266603 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="rsync" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.266616 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78" containerName="cinder-api-log" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.266623 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78" containerName="cinder-api-log" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.266631 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8691f972-205f-470b-b300-40c32106704b" containerName="nova-scheduler-scheduler" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.266638 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="8691f972-205f-470b-b300-40c32106704b" containerName="nova-scheduler-scheduler" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.266649 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="swift-recon-cron" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.266656 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="swift-recon-cron" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.266669 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba0f744b-7d32-4337-be25-f4f6326aaec2" containerName="ceilometer-notification-agent" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.266675 5025 
state_mem.go:107] "Deleted CPUSet assignment" podUID="ba0f744b-7d32-4337-be25-f4f6326aaec2" containerName="ceilometer-notification-agent" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.266683 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a892055-924e-4e4f-a625-b15b8a39c4bf" containerName="nova-metadata-metadata" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.266690 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a892055-924e-4e4f-a625-b15b8a39c4bf" containerName="nova-metadata-metadata" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.266697 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="073c6d0a-3fea-4e0b-8f8f-f58613b4bf23" containerName="nova-cell0-conductor-conductor" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.266705 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="073c6d0a-3fea-4e0b-8f8f-f58613b4bf23" containerName="nova-cell0-conductor-conductor" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.266718 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39a4256e-db59-4623-aefb-bad0c1412bf1" containerName="glance-log" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.266725 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="39a4256e-db59-4623-aefb-bad0c1412bf1" containerName="glance-log" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.266738 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b956a819-1c8f-471c-a738-fe4d78fbbb98" containerName="mariadb-account-delete" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.266745 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="b956a819-1c8f-471c-a738-fe4d78fbbb98" containerName="mariadb-account-delete" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.266758 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b" containerName="ovs-vswitchd" Oct 07 08:40:40 crc 
kubenswrapper[5025]: I1007 08:40:40.266764 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b" containerName="ovs-vswitchd" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.266773 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19d1598f-6725-40cd-99bd-5ee1bf699225" containerName="memcached" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.266780 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="19d1598f-6725-40cd-99bd-5ee1bf699225" containerName="memcached" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.266788 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5a4ae5a-0b64-481f-a54d-2263de0eae8e" containerName="rabbitmq" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.266795 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5a4ae5a-0b64-481f-a54d-2263de0eae8e" containerName="rabbitmq" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.266807 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="container-updater" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.266813 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="container-updater" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.266823 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd481497-9604-41b6-918a-a8ec9fd6ad92" containerName="ovn-northd" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.266830 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd481497-9604-41b6-918a-a8ec9fd6ad92" containerName="ovn-northd" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.266843 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9740b484-8952-4253-9158-17164236ffcc" containerName="mariadb-account-delete" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.266852 
5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="9740b484-8952-4253-9158-17164236ffcc" containerName="mariadb-account-delete" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.266860 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51f36c18-61cd-43d1-98a6-b569197c9382" containerName="barbican-keystone-listener-log" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.266868 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="51f36c18-61cd-43d1-98a6-b569197c9382" containerName="barbican-keystone-listener-log" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.266878 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="988f0f87-7a01-4aac-a874-8e885b2f83cb" containerName="glance-log" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.266884 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="988f0f87-7a01-4aac-a874-8e885b2f83cb" containerName="glance-log" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.266892 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd481497-9604-41b6-918a-a8ec9fd6ad92" containerName="openstack-network-exporter" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.266900 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd481497-9604-41b6-918a-a8ec9fd6ad92" containerName="openstack-network-exporter" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.266911 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d46577dd-b38b-4b80-ad57-577629e648b8" containerName="rabbitmq" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.266918 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="d46577dd-b38b-4b80-ad57-577629e648b8" containerName="rabbitmq" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.266926 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b" containerName="ovsdb-server" Oct 07 08:40:40 crc kubenswrapper[5025]: 
I1007 08:40:40.266934 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b" containerName="ovsdb-server" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.266946 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a8356fc-f4bd-4853-a3f3-0d44ab20612b" containerName="barbican-worker" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.266953 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a8356fc-f4bd-4853-a3f3-0d44ab20612b" containerName="barbican-worker" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.266964 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d46577dd-b38b-4b80-ad57-577629e648b8" containerName="setup-container" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.266970 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="d46577dd-b38b-4b80-ad57-577629e648b8" containerName="setup-container" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.266983 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8" containerName="placement-log" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.266989 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8" containerName="placement-log" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.267001 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba0f744b-7d32-4337-be25-f4f6326aaec2" containerName="sg-core" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267008 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba0f744b-7d32-4337-be25-f4f6326aaec2" containerName="sg-core" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.267018 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f66554ef-2742-4c2b-a6e4-58b8377f03fa" containerName="barbican-api-log" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267024 5025 
state_mem.go:107] "Deleted CPUSet assignment" podUID="f66554ef-2742-4c2b-a6e4-58b8377f03fa" containerName="barbican-api-log" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.267047 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a8356fc-f4bd-4853-a3f3-0d44ab20612b" containerName="barbican-worker-log" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267055 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a8356fc-f4bd-4853-a3f3-0d44ab20612b" containerName="barbican-worker-log" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.267065 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72a63b89-4f79-41a7-9e9a-e3c464d015a4" containerName="nova-api-api" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267072 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a63b89-4f79-41a7-9e9a-e3c464d015a4" containerName="nova-api-api" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.267079 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72a63b89-4f79-41a7-9e9a-e3c464d015a4" containerName="nova-api-log" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267086 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a63b89-4f79-41a7-9e9a-e3c464d015a4" containerName="nova-api-log" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.267095 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5a4ae5a-0b64-481f-a54d-2263de0eae8e" containerName="setup-container" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267101 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5a4ae5a-0b64-481f-a54d-2263de0eae8e" containerName="setup-container" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.267114 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0" containerName="neutron-httpd" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267121 5025 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0" containerName="neutron-httpd" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.267131 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="732b0bbc-f02e-4ced-9230-af95c9654b93" containerName="kube-state-metrics" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267138 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="732b0bbc-f02e-4ced-9230-af95c9654b93" containerName="kube-state-metrics" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.267148 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8" containerName="placement-api" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267155 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8" containerName="placement-api" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.267163 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0" containerName="neutron-api" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267169 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0" containerName="neutron-api" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.267180 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba0f744b-7d32-4337-be25-f4f6326aaec2" containerName="ceilometer-central-agent" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267187 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba0f744b-7d32-4337-be25-f4f6326aaec2" containerName="ceilometer-central-agent" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.267199 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="container-auditor" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267206 5025 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="container-auditor" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.267218 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="object-auditor" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267225 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="object-auditor" Oct 07 08:40:40 crc kubenswrapper[5025]: E1007 08:40:40.267237 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba0f744b-7d32-4337-be25-f4f6326aaec2" containerName="proxy-httpd" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267244 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba0f744b-7d32-4337-be25-f4f6326aaec2" containerName="proxy-httpd" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267408 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba0f744b-7d32-4337-be25-f4f6326aaec2" containerName="sg-core" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267421 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a892055-924e-4e4f-a625-b15b8a39c4bf" containerName="nova-metadata-metadata" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267435 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="d46577dd-b38b-4b80-ad57-577629e648b8" containerName="rabbitmq" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267443 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="72a63b89-4f79-41a7-9e9a-e3c464d015a4" containerName="nova-api-api" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267451 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="9740b484-8952-4253-9158-17164236ffcc" containerName="mariadb-account-delete" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267462 5025 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="f66554ef-2742-4c2b-a6e4-58b8377f03fa" containerName="barbican-api" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267473 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="073c6d0a-3fea-4e0b-8f8f-f58613b4bf23" containerName="nova-cell0-conductor-conductor" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267482 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="988f0f87-7a01-4aac-a874-8e885b2f83cb" containerName="glance-httpd" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267493 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5a4ae5a-0b64-481f-a54d-2263de0eae8e" containerName="rabbitmq" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267505 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="object-auditor" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267519 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="container-auditor" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267531 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4d92a87-d0f4-40c4-a370-1bf0fb8fe5ef" containerName="keystone-api" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267558 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="object-expirer" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267567 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="account-auditor" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267579 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="51f36c18-61cd-43d1-98a6-b569197c9382" containerName="barbican-keystone-listener" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 
08:40:40.267590 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="19d1598f-6725-40cd-99bd-5ee1bf699225" containerName="memcached" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267602 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="container-server" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267610 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="object-replicator" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267623 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="object-updater" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267631 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="041675fd-b2c1-4e9c-9b05-4a2aef6d329f" containerName="mariadb-account-delete" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267643 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="container-replicator" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267654 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="rsync" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267662 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8" containerName="placement-log" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267673 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba0f744b-7d32-4337-be25-f4f6326aaec2" containerName="ceilometer-central-agent" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267682 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="b956a819-1c8f-471c-a738-fe4d78fbbb98" containerName="mariadb-account-delete" Oct 
07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267692 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="8691f972-205f-470b-b300-40c32106704b" containerName="nova-scheduler-scheduler" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267700 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba0f744b-7d32-4337-be25-f4f6326aaec2" containerName="ceilometer-notification-agent" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267712 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b" containerName="ovsdb-server" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267723 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd481497-9604-41b6-918a-a8ec9fd6ad92" containerName="openstack-network-exporter" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267734 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="39a4256e-db59-4623-aefb-bad0c1412bf1" containerName="glance-httpd" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267742 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0" containerName="neutron-httpd" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267754 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="f09e8173-e1c3-4a48-8bc6-e9205b1dd6d8" containerName="placement-api" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267763 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a892055-924e-4e4f-a625-b15b8a39c4bf" containerName="nova-metadata-log" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267772 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="c87571c0-a1b9-4187-a60c-dd66d9fbb301" containerName="mariadb-account-delete" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267781 5025 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="account-replicator" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267792 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="76d5c1b3-b23a-4ee4-ac74-fd5c1c075a9b" containerName="ovs-vswitchd" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267802 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="988f0f87-7a01-4aac-a874-8e885b2f83cb" containerName="glance-log" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267815 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="72a63b89-4f79-41a7-9e9a-e3c464d015a4" containerName="nova-api-log" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267825 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a8356fc-f4bd-4853-a3f3-0d44ab20612b" containerName="barbican-worker-log" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267833 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd481497-9604-41b6-918a-a8ec9fd6ad92" containerName="ovn-northd" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267841 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="swift-recon-cron" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267852 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="account-reaper" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267865 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="container-updater" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267880 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba0f744b-7d32-4337-be25-f4f6326aaec2" containerName="proxy-httpd" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267890 5025 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="fc6c6b9c-b8b4-443c-a1d9-d829b8f8e9c0" containerName="neutron-api" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267899 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="732b0bbc-f02e-4ced-9230-af95c9654b93" containerName="kube-state-metrics" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267906 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="28643c48-ee4c-4880-8901-9e3231c70b70" containerName="mariadb-account-delete" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267918 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="007c50ab-22f3-4b11-a7e9-3890acbe7e03" containerName="galera" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267930 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78" containerName="cinder-api" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267941 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a8356fc-f4bd-4853-a3f3-0d44ab20612b" containerName="barbican-worker" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267951 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="39a4256e-db59-4623-aefb-bad0c1412bf1" containerName="glance-log" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267960 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="582efaa6-0e81-462a-813f-7ba1cf9c6fc2" containerName="galera" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267970 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="object-server" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267981 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="51f36c18-61cd-43d1-98a6-b569197c9382" containerName="barbican-keystone-listener-log" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267988 5025 
memory_manager.go:354] "RemoveStaleState removing state" podUID="fe6c7af8-d4b3-4f6a-aaa8-775c843e9a78" containerName="cinder-api-log" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.267997 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="f66554ef-2742-4c2b-a6e4-58b8377f03fa" containerName="barbican-api-log" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.268005 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d7aa421-d9db-4ba6-882a-432d2e8b840f" containerName="account-server" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.269267 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9phkk" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.271731 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9phkk"] Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.399420 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a255cce-0a0e-44dd-a23f-90550dfce7ab-catalog-content\") pod \"certified-operators-9phkk\" (UID: \"1a255cce-0a0e-44dd-a23f-90550dfce7ab\") " pod="openshift-marketplace/certified-operators-9phkk" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.399464 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd9gz\" (UniqueName: \"kubernetes.io/projected/1a255cce-0a0e-44dd-a23f-90550dfce7ab-kube-api-access-xd9gz\") pod \"certified-operators-9phkk\" (UID: \"1a255cce-0a0e-44dd-a23f-90550dfce7ab\") " pod="openshift-marketplace/certified-operators-9phkk" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.399560 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/1a255cce-0a0e-44dd-a23f-90550dfce7ab-utilities\") pod \"certified-operators-9phkk\" (UID: \"1a255cce-0a0e-44dd-a23f-90550dfce7ab\") " pod="openshift-marketplace/certified-operators-9phkk" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.500375 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a255cce-0a0e-44dd-a23f-90550dfce7ab-utilities\") pod \"certified-operators-9phkk\" (UID: \"1a255cce-0a0e-44dd-a23f-90550dfce7ab\") " pod="openshift-marketplace/certified-operators-9phkk" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.500482 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a255cce-0a0e-44dd-a23f-90550dfce7ab-catalog-content\") pod \"certified-operators-9phkk\" (UID: \"1a255cce-0a0e-44dd-a23f-90550dfce7ab\") " pod="openshift-marketplace/certified-operators-9phkk" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.500555 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xd9gz\" (UniqueName: \"kubernetes.io/projected/1a255cce-0a0e-44dd-a23f-90550dfce7ab-kube-api-access-xd9gz\") pod \"certified-operators-9phkk\" (UID: \"1a255cce-0a0e-44dd-a23f-90550dfce7ab\") " pod="openshift-marketplace/certified-operators-9phkk" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.500876 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a255cce-0a0e-44dd-a23f-90550dfce7ab-utilities\") pod \"certified-operators-9phkk\" (UID: \"1a255cce-0a0e-44dd-a23f-90550dfce7ab\") " pod="openshift-marketplace/certified-operators-9phkk" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.501014 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/1a255cce-0a0e-44dd-a23f-90550dfce7ab-catalog-content\") pod \"certified-operators-9phkk\" (UID: \"1a255cce-0a0e-44dd-a23f-90550dfce7ab\") " pod="openshift-marketplace/certified-operators-9phkk" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.526665 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd9gz\" (UniqueName: \"kubernetes.io/projected/1a255cce-0a0e-44dd-a23f-90550dfce7ab-kube-api-access-xd9gz\") pod \"certified-operators-9phkk\" (UID: \"1a255cce-0a0e-44dd-a23f-90550dfce7ab\") " pod="openshift-marketplace/certified-operators-9phkk" Oct 07 08:40:40 crc kubenswrapper[5025]: I1007 08:40:40.593011 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9phkk" Oct 07 08:40:41 crc kubenswrapper[5025]: I1007 08:40:41.082749 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9phkk"] Oct 07 08:40:41 crc kubenswrapper[5025]: W1007 08:40:41.094456 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a255cce_0a0e_44dd_a23f_90550dfce7ab.slice/crio-05f33d399ec6763b26cacbb6a0087bc70c9aa6444e2ef8173fce783db679387e WatchSource:0}: Error finding container 05f33d399ec6763b26cacbb6a0087bc70c9aa6444e2ef8173fce783db679387e: Status 404 returned error can't find the container with id 05f33d399ec6763b26cacbb6a0087bc70c9aa6444e2ef8173fce783db679387e Oct 07 08:40:41 crc kubenswrapper[5025]: I1007 08:40:41.294647 5025 generic.go:334] "Generic (PLEG): container finished" podID="1a255cce-0a0e-44dd-a23f-90550dfce7ab" containerID="228821ec0681da14dae65968a3dadda92d5674bce838e0b0942a6dc0af85f343" exitCode=0 Oct 07 08:40:41 crc kubenswrapper[5025]: I1007 08:40:41.294824 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9phkk" 
event={"ID":"1a255cce-0a0e-44dd-a23f-90550dfce7ab","Type":"ContainerDied","Data":"228821ec0681da14dae65968a3dadda92d5674bce838e0b0942a6dc0af85f343"} Oct 07 08:40:41 crc kubenswrapper[5025]: I1007 08:40:41.295057 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9phkk" event={"ID":"1a255cce-0a0e-44dd-a23f-90550dfce7ab","Type":"ContainerStarted","Data":"05f33d399ec6763b26cacbb6a0087bc70c9aa6444e2ef8173fce783db679387e"} Oct 07 08:40:41 crc kubenswrapper[5025]: I1007 08:40:41.298250 5025 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 08:40:42 crc kubenswrapper[5025]: I1007 08:40:42.306874 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9phkk" event={"ID":"1a255cce-0a0e-44dd-a23f-90550dfce7ab","Type":"ContainerStarted","Data":"b3cebeadac08ee6f2f6c8d91201e86989b618610cf0af1e5aa7c7cf332b20124"} Oct 07 08:40:43 crc kubenswrapper[5025]: I1007 08:40:43.317862 5025 generic.go:334] "Generic (PLEG): container finished" podID="1a255cce-0a0e-44dd-a23f-90550dfce7ab" containerID="b3cebeadac08ee6f2f6c8d91201e86989b618610cf0af1e5aa7c7cf332b20124" exitCode=0 Oct 07 08:40:43 crc kubenswrapper[5025]: I1007 08:40:43.317914 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9phkk" event={"ID":"1a255cce-0a0e-44dd-a23f-90550dfce7ab","Type":"ContainerDied","Data":"b3cebeadac08ee6f2f6c8d91201e86989b618610cf0af1e5aa7c7cf332b20124"} Oct 07 08:40:45 crc kubenswrapper[5025]: I1007 08:40:45.342146 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9phkk" event={"ID":"1a255cce-0a0e-44dd-a23f-90550dfce7ab","Type":"ContainerStarted","Data":"bf5dc405b9eac171f0a8d9f5fa8eebd52d697bb48aeabca069d43ac12e465701"} Oct 07 08:40:45 crc kubenswrapper[5025]: I1007 08:40:45.377772 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-9phkk" podStartSLOduration=1.532353481 podStartE2EDuration="5.377739393s" podCreationTimestamp="2025-10-07 08:40:40 +0000 UTC" firstStartedPulling="2025-10-07 08:40:41.297978686 +0000 UTC m=+1448.107292830" lastFinishedPulling="2025-10-07 08:40:45.143364558 +0000 UTC m=+1451.952678742" observedRunningTime="2025-10-07 08:40:45.375398459 +0000 UTC m=+1452.184712663" watchObservedRunningTime="2025-10-07 08:40:45.377739393 +0000 UTC m=+1452.187053537" Oct 07 08:40:50 crc kubenswrapper[5025]: I1007 08:40:50.593358 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9phkk" Oct 07 08:40:50 crc kubenswrapper[5025]: I1007 08:40:50.593962 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9phkk" Oct 07 08:40:50 crc kubenswrapper[5025]: I1007 08:40:50.644083 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9phkk" Oct 07 08:40:51 crc kubenswrapper[5025]: I1007 08:40:51.456552 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9phkk" Oct 07 08:40:51 crc kubenswrapper[5025]: I1007 08:40:51.510943 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9phkk"] Oct 07 08:40:53 crc kubenswrapper[5025]: I1007 08:40:53.415778 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9phkk" podUID="1a255cce-0a0e-44dd-a23f-90550dfce7ab" containerName="registry-server" containerID="cri-o://bf5dc405b9eac171f0a8d9f5fa8eebd52d697bb48aeabca069d43ac12e465701" gracePeriod=2 Oct 07 08:40:54 crc kubenswrapper[5025]: I1007 08:40:54.424238 5025 generic.go:334] "Generic (PLEG): container finished" podID="1a255cce-0a0e-44dd-a23f-90550dfce7ab" 
containerID="bf5dc405b9eac171f0a8d9f5fa8eebd52d697bb48aeabca069d43ac12e465701" exitCode=0 Oct 07 08:40:54 crc kubenswrapper[5025]: I1007 08:40:54.424314 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9phkk" event={"ID":"1a255cce-0a0e-44dd-a23f-90550dfce7ab","Type":"ContainerDied","Data":"bf5dc405b9eac171f0a8d9f5fa8eebd52d697bb48aeabca069d43ac12e465701"} Oct 07 08:40:54 crc kubenswrapper[5025]: I1007 08:40:54.424653 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9phkk" event={"ID":"1a255cce-0a0e-44dd-a23f-90550dfce7ab","Type":"ContainerDied","Data":"05f33d399ec6763b26cacbb6a0087bc70c9aa6444e2ef8173fce783db679387e"} Oct 07 08:40:54 crc kubenswrapper[5025]: I1007 08:40:54.424669 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05f33d399ec6763b26cacbb6a0087bc70c9aa6444e2ef8173fce783db679387e" Oct 07 08:40:54 crc kubenswrapper[5025]: I1007 08:40:54.439178 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9phkk" Oct 07 08:40:54 crc kubenswrapper[5025]: I1007 08:40:54.611270 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a255cce-0a0e-44dd-a23f-90550dfce7ab-catalog-content\") pod \"1a255cce-0a0e-44dd-a23f-90550dfce7ab\" (UID: \"1a255cce-0a0e-44dd-a23f-90550dfce7ab\") " Oct 07 08:40:54 crc kubenswrapper[5025]: I1007 08:40:54.611354 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a255cce-0a0e-44dd-a23f-90550dfce7ab-utilities\") pod \"1a255cce-0a0e-44dd-a23f-90550dfce7ab\" (UID: \"1a255cce-0a0e-44dd-a23f-90550dfce7ab\") " Oct 07 08:40:54 crc kubenswrapper[5025]: I1007 08:40:54.611432 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xd9gz\" (UniqueName: \"kubernetes.io/projected/1a255cce-0a0e-44dd-a23f-90550dfce7ab-kube-api-access-xd9gz\") pod \"1a255cce-0a0e-44dd-a23f-90550dfce7ab\" (UID: \"1a255cce-0a0e-44dd-a23f-90550dfce7ab\") " Oct 07 08:40:54 crc kubenswrapper[5025]: I1007 08:40:54.612292 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a255cce-0a0e-44dd-a23f-90550dfce7ab-utilities" (OuterVolumeSpecName: "utilities") pod "1a255cce-0a0e-44dd-a23f-90550dfce7ab" (UID: "1a255cce-0a0e-44dd-a23f-90550dfce7ab"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:40:54 crc kubenswrapper[5025]: I1007 08:40:54.617481 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a255cce-0a0e-44dd-a23f-90550dfce7ab-kube-api-access-xd9gz" (OuterVolumeSpecName: "kube-api-access-xd9gz") pod "1a255cce-0a0e-44dd-a23f-90550dfce7ab" (UID: "1a255cce-0a0e-44dd-a23f-90550dfce7ab"). InnerVolumeSpecName "kube-api-access-xd9gz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:40:54 crc kubenswrapper[5025]: I1007 08:40:54.656719 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a255cce-0a0e-44dd-a23f-90550dfce7ab-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1a255cce-0a0e-44dd-a23f-90550dfce7ab" (UID: "1a255cce-0a0e-44dd-a23f-90550dfce7ab"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:40:54 crc kubenswrapper[5025]: I1007 08:40:54.712698 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xd9gz\" (UniqueName: \"kubernetes.io/projected/1a255cce-0a0e-44dd-a23f-90550dfce7ab-kube-api-access-xd9gz\") on node \"crc\" DevicePath \"\"" Oct 07 08:40:54 crc kubenswrapper[5025]: I1007 08:40:54.712723 5025 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a255cce-0a0e-44dd-a23f-90550dfce7ab-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 08:40:54 crc kubenswrapper[5025]: I1007 08:40:54.712732 5025 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a255cce-0a0e-44dd-a23f-90550dfce7ab-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 08:40:55 crc kubenswrapper[5025]: I1007 08:40:55.431222 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9phkk" Oct 07 08:40:55 crc kubenswrapper[5025]: I1007 08:40:55.463618 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9phkk"] Oct 07 08:40:55 crc kubenswrapper[5025]: I1007 08:40:55.470139 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9phkk"] Oct 07 08:40:55 crc kubenswrapper[5025]: I1007 08:40:55.929978 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a255cce-0a0e-44dd-a23f-90550dfce7ab" path="/var/lib/kubelet/pods/1a255cce-0a0e-44dd-a23f-90550dfce7ab/volumes" Oct 07 08:40:55 crc kubenswrapper[5025]: I1007 08:40:55.933946 5025 patch_prober.go:28] interesting pod/machine-config-daemon-2dj2t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 08:40:55 crc kubenswrapper[5025]: I1007 08:40:55.934013 5025 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 08:40:55 crc kubenswrapper[5025]: I1007 08:40:55.934057 5025 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" Oct 07 08:40:55 crc kubenswrapper[5025]: I1007 08:40:55.934759 5025 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"13140e5c73fc6f443f1a7e1d646b5e6ed1f83f36c3c46081a22a0e60d0f1c23e"} pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" containerMessage="Container 
machine-config-daemon failed liveness probe, will be restarted" Oct 07 08:40:55 crc kubenswrapper[5025]: I1007 08:40:55.934819 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" containerID="cri-o://13140e5c73fc6f443f1a7e1d646b5e6ed1f83f36c3c46081a22a0e60d0f1c23e" gracePeriod=600 Oct 07 08:40:56 crc kubenswrapper[5025]: I1007 08:40:56.467899 5025 generic.go:334] "Generic (PLEG): container finished" podID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerID="13140e5c73fc6f443f1a7e1d646b5e6ed1f83f36c3c46081a22a0e60d0f1c23e" exitCode=0 Oct 07 08:40:56 crc kubenswrapper[5025]: I1007 08:40:56.468784 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" event={"ID":"b4849c41-22e1-400e-8e11-096da49ef1b2","Type":"ContainerDied","Data":"13140e5c73fc6f443f1a7e1d646b5e6ed1f83f36c3c46081a22a0e60d0f1c23e"} Oct 07 08:40:56 crc kubenswrapper[5025]: I1007 08:40:56.475312 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" event={"ID":"b4849c41-22e1-400e-8e11-096da49ef1b2","Type":"ContainerStarted","Data":"61f0741cda050d253e1d966c022f5b530b9f0b20bc1389a3192197981eacfef9"} Oct 07 08:40:56 crc kubenswrapper[5025]: I1007 08:40:56.475356 5025 scope.go:117] "RemoveContainer" containerID="1b24ca81daef9a1bfdd6aba62c02272733ea9a1c7e3d4e5bdd90b9e74defc3eb" Oct 07 08:41:18 crc kubenswrapper[5025]: I1007 08:41:18.190508 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xlm8g"] Oct 07 08:41:18 crc kubenswrapper[5025]: E1007 08:41:18.192380 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a255cce-0a0e-44dd-a23f-90550dfce7ab" containerName="extract-content" Oct 07 08:41:18 crc kubenswrapper[5025]: I1007 
08:41:18.192460 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a255cce-0a0e-44dd-a23f-90550dfce7ab" containerName="extract-content" Oct 07 08:41:18 crc kubenswrapper[5025]: E1007 08:41:18.192612 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a255cce-0a0e-44dd-a23f-90550dfce7ab" containerName="extract-utilities" Oct 07 08:41:18 crc kubenswrapper[5025]: I1007 08:41:18.192634 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a255cce-0a0e-44dd-a23f-90550dfce7ab" containerName="extract-utilities" Oct 07 08:41:18 crc kubenswrapper[5025]: E1007 08:41:18.192737 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a255cce-0a0e-44dd-a23f-90550dfce7ab" containerName="registry-server" Oct 07 08:41:18 crc kubenswrapper[5025]: I1007 08:41:18.192810 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a255cce-0a0e-44dd-a23f-90550dfce7ab" containerName="registry-server" Oct 07 08:41:18 crc kubenswrapper[5025]: I1007 08:41:18.193380 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a255cce-0a0e-44dd-a23f-90550dfce7ab" containerName="registry-server" Oct 07 08:41:18 crc kubenswrapper[5025]: I1007 08:41:18.196066 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xlm8g" Oct 07 08:41:18 crc kubenswrapper[5025]: I1007 08:41:18.200708 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xlm8g"] Oct 07 08:41:18 crc kubenswrapper[5025]: I1007 08:41:18.261073 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44c1fddf-320b-403f-8602-a7a93e5053d4-utilities\") pod \"redhat-marketplace-xlm8g\" (UID: \"44c1fddf-320b-403f-8602-a7a93e5053d4\") " pod="openshift-marketplace/redhat-marketplace-xlm8g" Oct 07 08:41:18 crc kubenswrapper[5025]: I1007 08:41:18.261190 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmwsj\" (UniqueName: \"kubernetes.io/projected/44c1fddf-320b-403f-8602-a7a93e5053d4-kube-api-access-fmwsj\") pod \"redhat-marketplace-xlm8g\" (UID: \"44c1fddf-320b-403f-8602-a7a93e5053d4\") " pod="openshift-marketplace/redhat-marketplace-xlm8g" Oct 07 08:41:18 crc kubenswrapper[5025]: I1007 08:41:18.261260 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44c1fddf-320b-403f-8602-a7a93e5053d4-catalog-content\") pod \"redhat-marketplace-xlm8g\" (UID: \"44c1fddf-320b-403f-8602-a7a93e5053d4\") " pod="openshift-marketplace/redhat-marketplace-xlm8g" Oct 07 08:41:18 crc kubenswrapper[5025]: I1007 08:41:18.362425 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmwsj\" (UniqueName: \"kubernetes.io/projected/44c1fddf-320b-403f-8602-a7a93e5053d4-kube-api-access-fmwsj\") pod \"redhat-marketplace-xlm8g\" (UID: \"44c1fddf-320b-403f-8602-a7a93e5053d4\") " pod="openshift-marketplace/redhat-marketplace-xlm8g" Oct 07 08:41:18 crc kubenswrapper[5025]: I1007 08:41:18.362499 5025 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44c1fddf-320b-403f-8602-a7a93e5053d4-catalog-content\") pod \"redhat-marketplace-xlm8g\" (UID: \"44c1fddf-320b-403f-8602-a7a93e5053d4\") " pod="openshift-marketplace/redhat-marketplace-xlm8g" Oct 07 08:41:18 crc kubenswrapper[5025]: I1007 08:41:18.362980 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44c1fddf-320b-403f-8602-a7a93e5053d4-catalog-content\") pod \"redhat-marketplace-xlm8g\" (UID: \"44c1fddf-320b-403f-8602-a7a93e5053d4\") " pod="openshift-marketplace/redhat-marketplace-xlm8g" Oct 07 08:41:18 crc kubenswrapper[5025]: I1007 08:41:18.363099 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44c1fddf-320b-403f-8602-a7a93e5053d4-utilities\") pod \"redhat-marketplace-xlm8g\" (UID: \"44c1fddf-320b-403f-8602-a7a93e5053d4\") " pod="openshift-marketplace/redhat-marketplace-xlm8g" Oct 07 08:41:18 crc kubenswrapper[5025]: I1007 08:41:18.363352 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44c1fddf-320b-403f-8602-a7a93e5053d4-utilities\") pod \"redhat-marketplace-xlm8g\" (UID: \"44c1fddf-320b-403f-8602-a7a93e5053d4\") " pod="openshift-marketplace/redhat-marketplace-xlm8g" Oct 07 08:41:18 crc kubenswrapper[5025]: I1007 08:41:18.393117 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmwsj\" (UniqueName: \"kubernetes.io/projected/44c1fddf-320b-403f-8602-a7a93e5053d4-kube-api-access-fmwsj\") pod \"redhat-marketplace-xlm8g\" (UID: \"44c1fddf-320b-403f-8602-a7a93e5053d4\") " pod="openshift-marketplace/redhat-marketplace-xlm8g" Oct 07 08:41:18 crc kubenswrapper[5025]: I1007 08:41:18.556463 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xlm8g" Oct 07 08:41:19 crc kubenswrapper[5025]: I1007 08:41:19.042061 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xlm8g"] Oct 07 08:41:19 crc kubenswrapper[5025]: W1007 08:41:19.079795 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44c1fddf_320b_403f_8602_a7a93e5053d4.slice/crio-f714ec47afb9ff850a99389f3cc24d89e6632dfe0a6ac3afdba90caa2f7a6003 WatchSource:0}: Error finding container f714ec47afb9ff850a99389f3cc24d89e6632dfe0a6ac3afdba90caa2f7a6003: Status 404 returned error can't find the container with id f714ec47afb9ff850a99389f3cc24d89e6632dfe0a6ac3afdba90caa2f7a6003 Oct 07 08:41:19 crc kubenswrapper[5025]: I1007 08:41:19.689242 5025 generic.go:334] "Generic (PLEG): container finished" podID="44c1fddf-320b-403f-8602-a7a93e5053d4" containerID="3ecbfc7a0dc54f61cc1687c4c756e60bede81db197723d321fc88fd6043d5f4b" exitCode=0 Oct 07 08:41:19 crc kubenswrapper[5025]: I1007 08:41:19.689295 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xlm8g" event={"ID":"44c1fddf-320b-403f-8602-a7a93e5053d4","Type":"ContainerDied","Data":"3ecbfc7a0dc54f61cc1687c4c756e60bede81db197723d321fc88fd6043d5f4b"} Oct 07 08:41:19 crc kubenswrapper[5025]: I1007 08:41:19.689649 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xlm8g" event={"ID":"44c1fddf-320b-403f-8602-a7a93e5053d4","Type":"ContainerStarted","Data":"f714ec47afb9ff850a99389f3cc24d89e6632dfe0a6ac3afdba90caa2f7a6003"} Oct 07 08:41:21 crc kubenswrapper[5025]: I1007 08:41:21.706197 5025 generic.go:334] "Generic (PLEG): container finished" podID="44c1fddf-320b-403f-8602-a7a93e5053d4" containerID="5476ee525e2e7a4c983d1ba9dff6285ea392364f656ad1e3fadd6d37c8e01e6d" exitCode=0 Oct 07 08:41:21 crc kubenswrapper[5025]: I1007 
08:41:21.706244 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xlm8g" event={"ID":"44c1fddf-320b-403f-8602-a7a93e5053d4","Type":"ContainerDied","Data":"5476ee525e2e7a4c983d1ba9dff6285ea392364f656ad1e3fadd6d37c8e01e6d"} Oct 07 08:41:22 crc kubenswrapper[5025]: I1007 08:41:22.717565 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xlm8g" event={"ID":"44c1fddf-320b-403f-8602-a7a93e5053d4","Type":"ContainerStarted","Data":"82cd775f6472d689101404ddb6baa2e70ffd17dbd6142c11ba54a606f5d6ad05"} Oct 07 08:41:22 crc kubenswrapper[5025]: I1007 08:41:22.738125 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xlm8g" podStartSLOduration=1.958356333 podStartE2EDuration="4.738100671s" podCreationTimestamp="2025-10-07 08:41:18 +0000 UTC" firstStartedPulling="2025-10-07 08:41:19.691555295 +0000 UTC m=+1486.500869459" lastFinishedPulling="2025-10-07 08:41:22.471299633 +0000 UTC m=+1489.280613797" observedRunningTime="2025-10-07 08:41:22.732589258 +0000 UTC m=+1489.541903402" watchObservedRunningTime="2025-10-07 08:41:22.738100671 +0000 UTC m=+1489.547414855" Oct 07 08:41:28 crc kubenswrapper[5025]: I1007 08:41:28.557625 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xlm8g" Oct 07 08:41:28 crc kubenswrapper[5025]: I1007 08:41:28.558252 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xlm8g" Oct 07 08:41:28 crc kubenswrapper[5025]: I1007 08:41:28.609841 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xlm8g" Oct 07 08:41:28 crc kubenswrapper[5025]: I1007 08:41:28.810023 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xlm8g" Oct 07 
08:41:28 crc kubenswrapper[5025]: I1007 08:41:28.854974 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xlm8g"] Oct 07 08:41:30 crc kubenswrapper[5025]: I1007 08:41:30.786340 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xlm8g" podUID="44c1fddf-320b-403f-8602-a7a93e5053d4" containerName="registry-server" containerID="cri-o://82cd775f6472d689101404ddb6baa2e70ffd17dbd6142c11ba54a606f5d6ad05" gracePeriod=2 Oct 07 08:41:31 crc kubenswrapper[5025]: I1007 08:41:31.213167 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xlm8g" Oct 07 08:41:31 crc kubenswrapper[5025]: I1007 08:41:31.345717 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44c1fddf-320b-403f-8602-a7a93e5053d4-catalog-content\") pod \"44c1fddf-320b-403f-8602-a7a93e5053d4\" (UID: \"44c1fddf-320b-403f-8602-a7a93e5053d4\") " Oct 07 08:41:31 crc kubenswrapper[5025]: I1007 08:41:31.345774 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44c1fddf-320b-403f-8602-a7a93e5053d4-utilities\") pod \"44c1fddf-320b-403f-8602-a7a93e5053d4\" (UID: \"44c1fddf-320b-403f-8602-a7a93e5053d4\") " Oct 07 08:41:31 crc kubenswrapper[5025]: I1007 08:41:31.345875 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmwsj\" (UniqueName: \"kubernetes.io/projected/44c1fddf-320b-403f-8602-a7a93e5053d4-kube-api-access-fmwsj\") pod \"44c1fddf-320b-403f-8602-a7a93e5053d4\" (UID: \"44c1fddf-320b-403f-8602-a7a93e5053d4\") " Oct 07 08:41:31 crc kubenswrapper[5025]: I1007 08:41:31.346916 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/44c1fddf-320b-403f-8602-a7a93e5053d4-utilities" (OuterVolumeSpecName: "utilities") pod "44c1fddf-320b-403f-8602-a7a93e5053d4" (UID: "44c1fddf-320b-403f-8602-a7a93e5053d4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:41:31 crc kubenswrapper[5025]: I1007 08:41:31.351689 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44c1fddf-320b-403f-8602-a7a93e5053d4-kube-api-access-fmwsj" (OuterVolumeSpecName: "kube-api-access-fmwsj") pod "44c1fddf-320b-403f-8602-a7a93e5053d4" (UID: "44c1fddf-320b-403f-8602-a7a93e5053d4"). InnerVolumeSpecName "kube-api-access-fmwsj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:41:31 crc kubenswrapper[5025]: I1007 08:41:31.362277 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44c1fddf-320b-403f-8602-a7a93e5053d4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "44c1fddf-320b-403f-8602-a7a93e5053d4" (UID: "44c1fddf-320b-403f-8602-a7a93e5053d4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:41:31 crc kubenswrapper[5025]: I1007 08:41:31.447781 5025 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44c1fddf-320b-403f-8602-a7a93e5053d4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 08:41:31 crc kubenswrapper[5025]: I1007 08:41:31.448406 5025 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44c1fddf-320b-403f-8602-a7a93e5053d4-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 08:41:31 crc kubenswrapper[5025]: I1007 08:41:31.448439 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmwsj\" (UniqueName: \"kubernetes.io/projected/44c1fddf-320b-403f-8602-a7a93e5053d4-kube-api-access-fmwsj\") on node \"crc\" DevicePath \"\"" Oct 07 08:41:31 crc kubenswrapper[5025]: I1007 08:41:31.795953 5025 generic.go:334] "Generic (PLEG): container finished" podID="44c1fddf-320b-403f-8602-a7a93e5053d4" containerID="82cd775f6472d689101404ddb6baa2e70ffd17dbd6142c11ba54a606f5d6ad05" exitCode=0 Oct 07 08:41:31 crc kubenswrapper[5025]: I1007 08:41:31.796001 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xlm8g" event={"ID":"44c1fddf-320b-403f-8602-a7a93e5053d4","Type":"ContainerDied","Data":"82cd775f6472d689101404ddb6baa2e70ffd17dbd6142c11ba54a606f5d6ad05"} Oct 07 08:41:31 crc kubenswrapper[5025]: I1007 08:41:31.796038 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xlm8g" event={"ID":"44c1fddf-320b-403f-8602-a7a93e5053d4","Type":"ContainerDied","Data":"f714ec47afb9ff850a99389f3cc24d89e6632dfe0a6ac3afdba90caa2f7a6003"} Oct 07 08:41:31 crc kubenswrapper[5025]: I1007 08:41:31.796058 5025 scope.go:117] "RemoveContainer" containerID="82cd775f6472d689101404ddb6baa2e70ffd17dbd6142c11ba54a606f5d6ad05" Oct 07 08:41:31 crc kubenswrapper[5025]: I1007 
08:41:31.796052 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xlm8g" Oct 07 08:41:31 crc kubenswrapper[5025]: I1007 08:41:31.834219 5025 scope.go:117] "RemoveContainer" containerID="5476ee525e2e7a4c983d1ba9dff6285ea392364f656ad1e3fadd6d37c8e01e6d" Oct 07 08:41:31 crc kubenswrapper[5025]: I1007 08:41:31.836027 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xlm8g"] Oct 07 08:41:31 crc kubenswrapper[5025]: I1007 08:41:31.840101 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xlm8g"] Oct 07 08:41:31 crc kubenswrapper[5025]: I1007 08:41:31.853841 5025 scope.go:117] "RemoveContainer" containerID="3ecbfc7a0dc54f61cc1687c4c756e60bede81db197723d321fc88fd6043d5f4b" Oct 07 08:41:31 crc kubenswrapper[5025]: I1007 08:41:31.876696 5025 scope.go:117] "RemoveContainer" containerID="82cd775f6472d689101404ddb6baa2e70ffd17dbd6142c11ba54a606f5d6ad05" Oct 07 08:41:31 crc kubenswrapper[5025]: E1007 08:41:31.877909 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82cd775f6472d689101404ddb6baa2e70ffd17dbd6142c11ba54a606f5d6ad05\": container with ID starting with 82cd775f6472d689101404ddb6baa2e70ffd17dbd6142c11ba54a606f5d6ad05 not found: ID does not exist" containerID="82cd775f6472d689101404ddb6baa2e70ffd17dbd6142c11ba54a606f5d6ad05" Oct 07 08:41:31 crc kubenswrapper[5025]: I1007 08:41:31.877958 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82cd775f6472d689101404ddb6baa2e70ffd17dbd6142c11ba54a606f5d6ad05"} err="failed to get container status \"82cd775f6472d689101404ddb6baa2e70ffd17dbd6142c11ba54a606f5d6ad05\": rpc error: code = NotFound desc = could not find container \"82cd775f6472d689101404ddb6baa2e70ffd17dbd6142c11ba54a606f5d6ad05\": container with ID starting with 
82cd775f6472d689101404ddb6baa2e70ffd17dbd6142c11ba54a606f5d6ad05 not found: ID does not exist" Oct 07 08:41:31 crc kubenswrapper[5025]: I1007 08:41:31.878013 5025 scope.go:117] "RemoveContainer" containerID="5476ee525e2e7a4c983d1ba9dff6285ea392364f656ad1e3fadd6d37c8e01e6d" Oct 07 08:41:31 crc kubenswrapper[5025]: E1007 08:41:31.878749 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5476ee525e2e7a4c983d1ba9dff6285ea392364f656ad1e3fadd6d37c8e01e6d\": container with ID starting with 5476ee525e2e7a4c983d1ba9dff6285ea392364f656ad1e3fadd6d37c8e01e6d not found: ID does not exist" containerID="5476ee525e2e7a4c983d1ba9dff6285ea392364f656ad1e3fadd6d37c8e01e6d" Oct 07 08:41:31 crc kubenswrapper[5025]: I1007 08:41:31.878775 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5476ee525e2e7a4c983d1ba9dff6285ea392364f656ad1e3fadd6d37c8e01e6d"} err="failed to get container status \"5476ee525e2e7a4c983d1ba9dff6285ea392364f656ad1e3fadd6d37c8e01e6d\": rpc error: code = NotFound desc = could not find container \"5476ee525e2e7a4c983d1ba9dff6285ea392364f656ad1e3fadd6d37c8e01e6d\": container with ID starting with 5476ee525e2e7a4c983d1ba9dff6285ea392364f656ad1e3fadd6d37c8e01e6d not found: ID does not exist" Oct 07 08:41:31 crc kubenswrapper[5025]: I1007 08:41:31.878791 5025 scope.go:117] "RemoveContainer" containerID="3ecbfc7a0dc54f61cc1687c4c756e60bede81db197723d321fc88fd6043d5f4b" Oct 07 08:41:31 crc kubenswrapper[5025]: E1007 08:41:31.879383 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ecbfc7a0dc54f61cc1687c4c756e60bede81db197723d321fc88fd6043d5f4b\": container with ID starting with 3ecbfc7a0dc54f61cc1687c4c756e60bede81db197723d321fc88fd6043d5f4b not found: ID does not exist" containerID="3ecbfc7a0dc54f61cc1687c4c756e60bede81db197723d321fc88fd6043d5f4b" Oct 07 08:41:31 crc 
kubenswrapper[5025]: I1007 08:41:31.879438 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ecbfc7a0dc54f61cc1687c4c756e60bede81db197723d321fc88fd6043d5f4b"} err="failed to get container status \"3ecbfc7a0dc54f61cc1687c4c756e60bede81db197723d321fc88fd6043d5f4b\": rpc error: code = NotFound desc = could not find container \"3ecbfc7a0dc54f61cc1687c4c756e60bede81db197723d321fc88fd6043d5f4b\": container with ID starting with 3ecbfc7a0dc54f61cc1687c4c756e60bede81db197723d321fc88fd6043d5f4b not found: ID does not exist" Oct 07 08:41:31 crc kubenswrapper[5025]: I1007 08:41:31.929356 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44c1fddf-320b-403f-8602-a7a93e5053d4" path="/var/lib/kubelet/pods/44c1fddf-320b-403f-8602-a7a93e5053d4/volumes" Oct 07 08:41:34 crc kubenswrapper[5025]: I1007 08:41:34.487311 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qxkcv"] Oct 07 08:41:34 crc kubenswrapper[5025]: E1007 08:41:34.496610 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44c1fddf-320b-403f-8602-a7a93e5053d4" containerName="extract-content" Oct 07 08:41:34 crc kubenswrapper[5025]: I1007 08:41:34.496713 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="44c1fddf-320b-403f-8602-a7a93e5053d4" containerName="extract-content" Oct 07 08:41:34 crc kubenswrapper[5025]: E1007 08:41:34.496816 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44c1fddf-320b-403f-8602-a7a93e5053d4" containerName="extract-utilities" Oct 07 08:41:34 crc kubenswrapper[5025]: I1007 08:41:34.496888 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="44c1fddf-320b-403f-8602-a7a93e5053d4" containerName="extract-utilities" Oct 07 08:41:34 crc kubenswrapper[5025]: E1007 08:41:34.496977 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44c1fddf-320b-403f-8602-a7a93e5053d4" containerName="registry-server" Oct 
07 08:41:34 crc kubenswrapper[5025]: I1007 08:41:34.497058 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="44c1fddf-320b-403f-8602-a7a93e5053d4" containerName="registry-server" Oct 07 08:41:34 crc kubenswrapper[5025]: I1007 08:41:34.497324 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="44c1fddf-320b-403f-8602-a7a93e5053d4" containerName="registry-server" Oct 07 08:41:34 crc kubenswrapper[5025]: I1007 08:41:34.498652 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qxkcv" Oct 07 08:41:34 crc kubenswrapper[5025]: I1007 08:41:34.503034 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qxkcv"] Oct 07 08:41:34 crc kubenswrapper[5025]: I1007 08:41:34.587987 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhgqz\" (UniqueName: \"kubernetes.io/projected/7a7e2ed0-632f-412b-9b9a-50ca4925b881-kube-api-access-vhgqz\") pod \"community-operators-qxkcv\" (UID: \"7a7e2ed0-632f-412b-9b9a-50ca4925b881\") " pod="openshift-marketplace/community-operators-qxkcv" Oct 07 08:41:34 crc kubenswrapper[5025]: I1007 08:41:34.588230 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a7e2ed0-632f-412b-9b9a-50ca4925b881-utilities\") pod \"community-operators-qxkcv\" (UID: \"7a7e2ed0-632f-412b-9b9a-50ca4925b881\") " pod="openshift-marketplace/community-operators-qxkcv" Oct 07 08:41:34 crc kubenswrapper[5025]: I1007 08:41:34.588298 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a7e2ed0-632f-412b-9b9a-50ca4925b881-catalog-content\") pod \"community-operators-qxkcv\" (UID: \"7a7e2ed0-632f-412b-9b9a-50ca4925b881\") " 
pod="openshift-marketplace/community-operators-qxkcv" Oct 07 08:41:34 crc kubenswrapper[5025]: I1007 08:41:34.690094 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a7e2ed0-632f-412b-9b9a-50ca4925b881-utilities\") pod \"community-operators-qxkcv\" (UID: \"7a7e2ed0-632f-412b-9b9a-50ca4925b881\") " pod="openshift-marketplace/community-operators-qxkcv" Oct 07 08:41:34 crc kubenswrapper[5025]: I1007 08:41:34.690165 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a7e2ed0-632f-412b-9b9a-50ca4925b881-catalog-content\") pod \"community-operators-qxkcv\" (UID: \"7a7e2ed0-632f-412b-9b9a-50ca4925b881\") " pod="openshift-marketplace/community-operators-qxkcv" Oct 07 08:41:34 crc kubenswrapper[5025]: I1007 08:41:34.690230 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhgqz\" (UniqueName: \"kubernetes.io/projected/7a7e2ed0-632f-412b-9b9a-50ca4925b881-kube-api-access-vhgqz\") pod \"community-operators-qxkcv\" (UID: \"7a7e2ed0-632f-412b-9b9a-50ca4925b881\") " pod="openshift-marketplace/community-operators-qxkcv" Oct 07 08:41:34 crc kubenswrapper[5025]: I1007 08:41:34.690783 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a7e2ed0-632f-412b-9b9a-50ca4925b881-catalog-content\") pod \"community-operators-qxkcv\" (UID: \"7a7e2ed0-632f-412b-9b9a-50ca4925b881\") " pod="openshift-marketplace/community-operators-qxkcv" Oct 07 08:41:34 crc kubenswrapper[5025]: I1007 08:41:34.691122 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a7e2ed0-632f-412b-9b9a-50ca4925b881-utilities\") pod \"community-operators-qxkcv\" (UID: \"7a7e2ed0-632f-412b-9b9a-50ca4925b881\") " 
pod="openshift-marketplace/community-operators-qxkcv" Oct 07 08:41:34 crc kubenswrapper[5025]: I1007 08:41:34.718691 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhgqz\" (UniqueName: \"kubernetes.io/projected/7a7e2ed0-632f-412b-9b9a-50ca4925b881-kube-api-access-vhgqz\") pod \"community-operators-qxkcv\" (UID: \"7a7e2ed0-632f-412b-9b9a-50ca4925b881\") " pod="openshift-marketplace/community-operators-qxkcv" Oct 07 08:41:34 crc kubenswrapper[5025]: I1007 08:41:34.818600 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qxkcv" Oct 07 08:41:35 crc kubenswrapper[5025]: I1007 08:41:35.296290 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qxkcv"] Oct 07 08:41:35 crc kubenswrapper[5025]: I1007 08:41:35.831194 5025 generic.go:334] "Generic (PLEG): container finished" podID="7a7e2ed0-632f-412b-9b9a-50ca4925b881" containerID="afd08de70c9f3479ddd394f08dc80c0f97148a730b5d4b60389fa6490612a683" exitCode=0 Oct 07 08:41:35 crc kubenswrapper[5025]: I1007 08:41:35.831326 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qxkcv" event={"ID":"7a7e2ed0-632f-412b-9b9a-50ca4925b881","Type":"ContainerDied","Data":"afd08de70c9f3479ddd394f08dc80c0f97148a730b5d4b60389fa6490612a683"} Oct 07 08:41:35 crc kubenswrapper[5025]: I1007 08:41:35.831440 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qxkcv" event={"ID":"7a7e2ed0-632f-412b-9b9a-50ca4925b881","Type":"ContainerStarted","Data":"b9a2ac96dc329e1dc60f4fc3bddd02d898268cfacd7314f2d3d5d073ca9d83bd"} Oct 07 08:41:36 crc kubenswrapper[5025]: I1007 08:41:36.841457 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qxkcv" 
event={"ID":"7a7e2ed0-632f-412b-9b9a-50ca4925b881","Type":"ContainerStarted","Data":"9fd5c68ebf3eb27b8ddefd4cf933cdf682b077f05a0aed62dde7a17040da9168"} Oct 07 08:41:37 crc kubenswrapper[5025]: I1007 08:41:37.244975 5025 scope.go:117] "RemoveContainer" containerID="562e7d2d588d95b2f6a1c903ed0b4db2a9e3f4bd3a63c1bfff1f21cc4e9ec999" Oct 07 08:41:37 crc kubenswrapper[5025]: I1007 08:41:37.274848 5025 scope.go:117] "RemoveContainer" containerID="c660c36e2bb2d6b75ff1117399aae70f2a2e059a387005f729f8ffcb0f430141" Oct 07 08:41:37 crc kubenswrapper[5025]: I1007 08:41:37.319878 5025 scope.go:117] "RemoveContainer" containerID="fcc59fa4ee6efb3343e06efb0358177c91fcb632bb8cbd5cc15a1c3116b88117" Oct 07 08:41:37 crc kubenswrapper[5025]: I1007 08:41:37.357747 5025 scope.go:117] "RemoveContainer" containerID="be9144fe694a0b08748194b5643337f23d2434df55bb180b2c8ade9c750a985a" Oct 07 08:41:37 crc kubenswrapper[5025]: I1007 08:41:37.383829 5025 scope.go:117] "RemoveContainer" containerID="1ace7a9d2660c1d0a09f90632c3c4b67d4bfd75ba7e66b931bc3aae8d78e1fac" Oct 07 08:41:37 crc kubenswrapper[5025]: I1007 08:41:37.398806 5025 scope.go:117] "RemoveContainer" containerID="44882153253c9e88843124cf0fa7157412b7cdee9616d43b7d45e44435fde32a" Oct 07 08:41:37 crc kubenswrapper[5025]: I1007 08:41:37.425903 5025 scope.go:117] "RemoveContainer" containerID="68d1e4766470c867bae53bb3d98998e23095f3e94bb406fab13104d2f4ce5885" Oct 07 08:41:37 crc kubenswrapper[5025]: I1007 08:41:37.851778 5025 generic.go:334] "Generic (PLEG): container finished" podID="7a7e2ed0-632f-412b-9b9a-50ca4925b881" containerID="9fd5c68ebf3eb27b8ddefd4cf933cdf682b077f05a0aed62dde7a17040da9168" exitCode=0 Oct 07 08:41:37 crc kubenswrapper[5025]: I1007 08:41:37.851819 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qxkcv" event={"ID":"7a7e2ed0-632f-412b-9b9a-50ca4925b881","Type":"ContainerDied","Data":"9fd5c68ebf3eb27b8ddefd4cf933cdf682b077f05a0aed62dde7a17040da9168"} Oct 07 08:41:38 
crc kubenswrapper[5025]: I1007 08:41:38.862481 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qxkcv" event={"ID":"7a7e2ed0-632f-412b-9b9a-50ca4925b881","Type":"ContainerStarted","Data":"5ce0b01b37843128c1569a038982288166ef3db77e3d1eed64c83a4dd27bd08d"} Oct 07 08:41:38 crc kubenswrapper[5025]: I1007 08:41:38.882570 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qxkcv" podStartSLOduration=2.197063236 podStartE2EDuration="4.882533559s" podCreationTimestamp="2025-10-07 08:41:34 +0000 UTC" firstStartedPulling="2025-10-07 08:41:35.832452221 +0000 UTC m=+1502.641766365" lastFinishedPulling="2025-10-07 08:41:38.517922524 +0000 UTC m=+1505.327236688" observedRunningTime="2025-10-07 08:41:38.881762045 +0000 UTC m=+1505.691076189" watchObservedRunningTime="2025-10-07 08:41:38.882533559 +0000 UTC m=+1505.691847703" Oct 07 08:41:44 crc kubenswrapper[5025]: I1007 08:41:44.818970 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qxkcv" Oct 07 08:41:44 crc kubenswrapper[5025]: I1007 08:41:44.819663 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qxkcv" Oct 07 08:41:44 crc kubenswrapper[5025]: I1007 08:41:44.875025 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qxkcv" Oct 07 08:41:44 crc kubenswrapper[5025]: I1007 08:41:44.959653 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qxkcv" Oct 07 08:41:45 crc kubenswrapper[5025]: I1007 08:41:45.116619 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qxkcv"] Oct 07 08:41:46 crc kubenswrapper[5025]: I1007 08:41:46.931129 5025 kuberuntime_container.go:808] "Killing container with 
a grace period" pod="openshift-marketplace/community-operators-qxkcv" podUID="7a7e2ed0-632f-412b-9b9a-50ca4925b881" containerName="registry-server" containerID="cri-o://5ce0b01b37843128c1569a038982288166ef3db77e3d1eed64c83a4dd27bd08d" gracePeriod=2 Oct 07 08:41:47 crc kubenswrapper[5025]: I1007 08:41:47.372259 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qxkcv" Oct 07 08:41:47 crc kubenswrapper[5025]: I1007 08:41:47.472331 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a7e2ed0-632f-412b-9b9a-50ca4925b881-utilities\") pod \"7a7e2ed0-632f-412b-9b9a-50ca4925b881\" (UID: \"7a7e2ed0-632f-412b-9b9a-50ca4925b881\") " Oct 07 08:41:47 crc kubenswrapper[5025]: I1007 08:41:47.472395 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhgqz\" (UniqueName: \"kubernetes.io/projected/7a7e2ed0-632f-412b-9b9a-50ca4925b881-kube-api-access-vhgqz\") pod \"7a7e2ed0-632f-412b-9b9a-50ca4925b881\" (UID: \"7a7e2ed0-632f-412b-9b9a-50ca4925b881\") " Oct 07 08:41:47 crc kubenswrapper[5025]: I1007 08:41:47.472593 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a7e2ed0-632f-412b-9b9a-50ca4925b881-catalog-content\") pod \"7a7e2ed0-632f-412b-9b9a-50ca4925b881\" (UID: \"7a7e2ed0-632f-412b-9b9a-50ca4925b881\") " Oct 07 08:41:47 crc kubenswrapper[5025]: I1007 08:41:47.473959 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a7e2ed0-632f-412b-9b9a-50ca4925b881-utilities" (OuterVolumeSpecName: "utilities") pod "7a7e2ed0-632f-412b-9b9a-50ca4925b881" (UID: "7a7e2ed0-632f-412b-9b9a-50ca4925b881"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:41:47 crc kubenswrapper[5025]: I1007 08:41:47.478969 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a7e2ed0-632f-412b-9b9a-50ca4925b881-kube-api-access-vhgqz" (OuterVolumeSpecName: "kube-api-access-vhgqz") pod "7a7e2ed0-632f-412b-9b9a-50ca4925b881" (UID: "7a7e2ed0-632f-412b-9b9a-50ca4925b881"). InnerVolumeSpecName "kube-api-access-vhgqz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:41:47 crc kubenswrapper[5025]: I1007 08:41:47.574644 5025 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a7e2ed0-632f-412b-9b9a-50ca4925b881-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 08:41:47 crc kubenswrapper[5025]: I1007 08:41:47.574675 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhgqz\" (UniqueName: \"kubernetes.io/projected/7a7e2ed0-632f-412b-9b9a-50ca4925b881-kube-api-access-vhgqz\") on node \"crc\" DevicePath \"\"" Oct 07 08:41:47 crc kubenswrapper[5025]: I1007 08:41:47.894011 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a7e2ed0-632f-412b-9b9a-50ca4925b881-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7a7e2ed0-632f-412b-9b9a-50ca4925b881" (UID: "7a7e2ed0-632f-412b-9b9a-50ca4925b881"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:41:47 crc kubenswrapper[5025]: I1007 08:41:47.944335 5025 generic.go:334] "Generic (PLEG): container finished" podID="7a7e2ed0-632f-412b-9b9a-50ca4925b881" containerID="5ce0b01b37843128c1569a038982288166ef3db77e3d1eed64c83a4dd27bd08d" exitCode=0 Oct 07 08:41:47 crc kubenswrapper[5025]: I1007 08:41:47.944447 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qxkcv" event={"ID":"7a7e2ed0-632f-412b-9b9a-50ca4925b881","Type":"ContainerDied","Data":"5ce0b01b37843128c1569a038982288166ef3db77e3d1eed64c83a4dd27bd08d"} Oct 07 08:41:47 crc kubenswrapper[5025]: I1007 08:41:47.944482 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qxkcv" event={"ID":"7a7e2ed0-632f-412b-9b9a-50ca4925b881","Type":"ContainerDied","Data":"b9a2ac96dc329e1dc60f4fc3bddd02d898268cfacd7314f2d3d5d073ca9d83bd"} Oct 07 08:41:47 crc kubenswrapper[5025]: I1007 08:41:47.944644 5025 scope.go:117] "RemoveContainer" containerID="5ce0b01b37843128c1569a038982288166ef3db77e3d1eed64c83a4dd27bd08d" Oct 07 08:41:47 crc kubenswrapper[5025]: I1007 08:41:47.944647 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qxkcv" Oct 07 08:41:47 crc kubenswrapper[5025]: I1007 08:41:47.982570 5025 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a7e2ed0-632f-412b-9b9a-50ca4925b881-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 08:41:47 crc kubenswrapper[5025]: I1007 08:41:47.996033 5025 scope.go:117] "RemoveContainer" containerID="9fd5c68ebf3eb27b8ddefd4cf933cdf682b077f05a0aed62dde7a17040da9168" Oct 07 08:41:47 crc kubenswrapper[5025]: I1007 08:41:47.999375 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qxkcv"] Oct 07 08:41:48 crc kubenswrapper[5025]: I1007 08:41:48.009375 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qxkcv"] Oct 07 08:41:48 crc kubenswrapper[5025]: I1007 08:41:48.023031 5025 scope.go:117] "RemoveContainer" containerID="afd08de70c9f3479ddd394f08dc80c0f97148a730b5d4b60389fa6490612a683" Oct 07 08:41:48 crc kubenswrapper[5025]: I1007 08:41:48.047667 5025 scope.go:117] "RemoveContainer" containerID="5ce0b01b37843128c1569a038982288166ef3db77e3d1eed64c83a4dd27bd08d" Oct 07 08:41:48 crc kubenswrapper[5025]: E1007 08:41:48.047969 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ce0b01b37843128c1569a038982288166ef3db77e3d1eed64c83a4dd27bd08d\": container with ID starting with 5ce0b01b37843128c1569a038982288166ef3db77e3d1eed64c83a4dd27bd08d not found: ID does not exist" containerID="5ce0b01b37843128c1569a038982288166ef3db77e3d1eed64c83a4dd27bd08d" Oct 07 08:41:48 crc kubenswrapper[5025]: I1007 08:41:48.048010 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ce0b01b37843128c1569a038982288166ef3db77e3d1eed64c83a4dd27bd08d"} err="failed to get container status 
\"5ce0b01b37843128c1569a038982288166ef3db77e3d1eed64c83a4dd27bd08d\": rpc error: code = NotFound desc = could not find container \"5ce0b01b37843128c1569a038982288166ef3db77e3d1eed64c83a4dd27bd08d\": container with ID starting with 5ce0b01b37843128c1569a038982288166ef3db77e3d1eed64c83a4dd27bd08d not found: ID does not exist" Oct 07 08:41:48 crc kubenswrapper[5025]: I1007 08:41:48.048036 5025 scope.go:117] "RemoveContainer" containerID="9fd5c68ebf3eb27b8ddefd4cf933cdf682b077f05a0aed62dde7a17040da9168" Oct 07 08:41:48 crc kubenswrapper[5025]: E1007 08:41:48.048379 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fd5c68ebf3eb27b8ddefd4cf933cdf682b077f05a0aed62dde7a17040da9168\": container with ID starting with 9fd5c68ebf3eb27b8ddefd4cf933cdf682b077f05a0aed62dde7a17040da9168 not found: ID does not exist" containerID="9fd5c68ebf3eb27b8ddefd4cf933cdf682b077f05a0aed62dde7a17040da9168" Oct 07 08:41:48 crc kubenswrapper[5025]: I1007 08:41:48.048431 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fd5c68ebf3eb27b8ddefd4cf933cdf682b077f05a0aed62dde7a17040da9168"} err="failed to get container status \"9fd5c68ebf3eb27b8ddefd4cf933cdf682b077f05a0aed62dde7a17040da9168\": rpc error: code = NotFound desc = could not find container \"9fd5c68ebf3eb27b8ddefd4cf933cdf682b077f05a0aed62dde7a17040da9168\": container with ID starting with 9fd5c68ebf3eb27b8ddefd4cf933cdf682b077f05a0aed62dde7a17040da9168 not found: ID does not exist" Oct 07 08:41:48 crc kubenswrapper[5025]: I1007 08:41:48.048468 5025 scope.go:117] "RemoveContainer" containerID="afd08de70c9f3479ddd394f08dc80c0f97148a730b5d4b60389fa6490612a683" Oct 07 08:41:48 crc kubenswrapper[5025]: E1007 08:41:48.048853 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"afd08de70c9f3479ddd394f08dc80c0f97148a730b5d4b60389fa6490612a683\": container with ID starting with afd08de70c9f3479ddd394f08dc80c0f97148a730b5d4b60389fa6490612a683 not found: ID does not exist" containerID="afd08de70c9f3479ddd394f08dc80c0f97148a730b5d4b60389fa6490612a683" Oct 07 08:41:48 crc kubenswrapper[5025]: I1007 08:41:48.048881 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afd08de70c9f3479ddd394f08dc80c0f97148a730b5d4b60389fa6490612a683"} err="failed to get container status \"afd08de70c9f3479ddd394f08dc80c0f97148a730b5d4b60389fa6490612a683\": rpc error: code = NotFound desc = could not find container \"afd08de70c9f3479ddd394f08dc80c0f97148a730b5d4b60389fa6490612a683\": container with ID starting with afd08de70c9f3479ddd394f08dc80c0f97148a730b5d4b60389fa6490612a683 not found: ID does not exist" Oct 07 08:41:49 crc kubenswrapper[5025]: I1007 08:41:49.925342 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a7e2ed0-632f-412b-9b9a-50ca4925b881" path="/var/lib/kubelet/pods/7a7e2ed0-632f-412b-9b9a-50ca4925b881/volumes" Oct 07 08:42:37 crc kubenswrapper[5025]: I1007 08:42:37.554570 5025 scope.go:117] "RemoveContainer" containerID="2935be7adab1bdb854893d962c2c032e6db498d9775a70e4571b34552bcc75c4" Oct 07 08:42:37 crc kubenswrapper[5025]: I1007 08:42:37.590422 5025 scope.go:117] "RemoveContainer" containerID="b203a6d15ef792c4fe41369bbb2b58b37c71bfbaf5b3774eecf764a02b611548" Oct 07 08:42:37 crc kubenswrapper[5025]: I1007 08:42:37.619485 5025 scope.go:117] "RemoveContainer" containerID="77f7516d7962feba3cc7f2520d60c9bad2c5da055f1c07185c88455c01f36edc" Oct 07 08:42:37 crc kubenswrapper[5025]: I1007 08:42:37.643309 5025 scope.go:117] "RemoveContainer" containerID="da198bf061ed5643789e78bdd43ca4f1d884cca12b92765cf7109daf8ef661fa" Oct 07 08:42:37 crc kubenswrapper[5025]: I1007 08:42:37.661846 5025 scope.go:117] "RemoveContainer" 
containerID="1e3e97ecbee1eb5fa2d4e0c99b1a8d7a3e6632904a37d6f0cbcf4e9c9f40fc26" Oct 07 08:42:37 crc kubenswrapper[5025]: I1007 08:42:37.691682 5025 scope.go:117] "RemoveContainer" containerID="0d717114726acf550f907c6257fff0023c7b2d0e41f234c82b856a0d923017ad" Oct 07 08:42:37 crc kubenswrapper[5025]: I1007 08:42:37.725819 5025 scope.go:117] "RemoveContainer" containerID="37a99151a2993186ee570500792634aa31edd00f179d56fe2993e992303f092f" Oct 07 08:42:37 crc kubenswrapper[5025]: I1007 08:42:37.746037 5025 scope.go:117] "RemoveContainer" containerID="1658af060071457045658f1a0669689a5e3ecc8ca296630d6d86cf21a5155bc2" Oct 07 08:42:37 crc kubenswrapper[5025]: I1007 08:42:37.794731 5025 scope.go:117] "RemoveContainer" containerID="30c030347bd0a19a8d78643a4fcff89f0ca1bda4a5dfc49485c83c488909adad" Oct 07 08:42:37 crc kubenswrapper[5025]: I1007 08:42:37.824616 5025 scope.go:117] "RemoveContainer" containerID="6ad308ba6fec9a024cb9472b49f339f44bc5aac28f293abde8c8577bb792d9aa" Oct 07 08:42:37 crc kubenswrapper[5025]: I1007 08:42:37.846683 5025 scope.go:117] "RemoveContainer" containerID="d483b8c35a3ab7b95943ae78b74c71128746cf8491cf043bfc10563ed70df854" Oct 07 08:42:37 crc kubenswrapper[5025]: I1007 08:42:37.897666 5025 scope.go:117] "RemoveContainer" containerID="c59ccdf0da5c1ef53c5b930c654f443ffdf7143d032306c99a4d08861433c863" Oct 07 08:43:23 crc kubenswrapper[5025]: I1007 08:43:23.072187 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qbnkx"] Oct 07 08:43:23 crc kubenswrapper[5025]: E1007 08:43:23.073066 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a7e2ed0-632f-412b-9b9a-50ca4925b881" containerName="registry-server" Oct 07 08:43:23 crc kubenswrapper[5025]: I1007 08:43:23.073084 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a7e2ed0-632f-412b-9b9a-50ca4925b881" containerName="registry-server" Oct 07 08:43:23 crc kubenswrapper[5025]: E1007 08:43:23.073101 5025 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="7a7e2ed0-632f-412b-9b9a-50ca4925b881" containerName="extract-utilities" Oct 07 08:43:23 crc kubenswrapper[5025]: I1007 08:43:23.073111 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a7e2ed0-632f-412b-9b9a-50ca4925b881" containerName="extract-utilities" Oct 07 08:43:23 crc kubenswrapper[5025]: E1007 08:43:23.073123 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a7e2ed0-632f-412b-9b9a-50ca4925b881" containerName="extract-content" Oct 07 08:43:23 crc kubenswrapper[5025]: I1007 08:43:23.073130 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a7e2ed0-632f-412b-9b9a-50ca4925b881" containerName="extract-content" Oct 07 08:43:23 crc kubenswrapper[5025]: I1007 08:43:23.073309 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a7e2ed0-632f-412b-9b9a-50ca4925b881" containerName="registry-server" Oct 07 08:43:23 crc kubenswrapper[5025]: I1007 08:43:23.074549 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qbnkx" Oct 07 08:43:23 crc kubenswrapper[5025]: I1007 08:43:23.087625 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qbnkx"] Oct 07 08:43:23 crc kubenswrapper[5025]: I1007 08:43:23.234032 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5a90ca4-afd3-41c2-b477-8bdfea5da556-catalog-content\") pod \"redhat-operators-qbnkx\" (UID: \"e5a90ca4-afd3-41c2-b477-8bdfea5da556\") " pod="openshift-marketplace/redhat-operators-qbnkx" Oct 07 08:43:23 crc kubenswrapper[5025]: I1007 08:43:23.234097 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8bgg\" (UniqueName: \"kubernetes.io/projected/e5a90ca4-afd3-41c2-b477-8bdfea5da556-kube-api-access-m8bgg\") pod \"redhat-operators-qbnkx\" (UID: \"e5a90ca4-afd3-41c2-b477-8bdfea5da556\") " pod="openshift-marketplace/redhat-operators-qbnkx" Oct 07 08:43:23 crc kubenswrapper[5025]: I1007 08:43:23.234123 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5a90ca4-afd3-41c2-b477-8bdfea5da556-utilities\") pod \"redhat-operators-qbnkx\" (UID: \"e5a90ca4-afd3-41c2-b477-8bdfea5da556\") " pod="openshift-marketplace/redhat-operators-qbnkx" Oct 07 08:43:23 crc kubenswrapper[5025]: I1007 08:43:23.335372 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5a90ca4-afd3-41c2-b477-8bdfea5da556-catalog-content\") pod \"redhat-operators-qbnkx\" (UID: \"e5a90ca4-afd3-41c2-b477-8bdfea5da556\") " pod="openshift-marketplace/redhat-operators-qbnkx" Oct 07 08:43:23 crc kubenswrapper[5025]: I1007 08:43:23.335470 5025 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-m8bgg\" (UniqueName: \"kubernetes.io/projected/e5a90ca4-afd3-41c2-b477-8bdfea5da556-kube-api-access-m8bgg\") pod \"redhat-operators-qbnkx\" (UID: \"e5a90ca4-afd3-41c2-b477-8bdfea5da556\") " pod="openshift-marketplace/redhat-operators-qbnkx" Oct 07 08:43:23 crc kubenswrapper[5025]: I1007 08:43:23.335506 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5a90ca4-afd3-41c2-b477-8bdfea5da556-utilities\") pod \"redhat-operators-qbnkx\" (UID: \"e5a90ca4-afd3-41c2-b477-8bdfea5da556\") " pod="openshift-marketplace/redhat-operators-qbnkx" Oct 07 08:43:23 crc kubenswrapper[5025]: I1007 08:43:23.335908 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5a90ca4-afd3-41c2-b477-8bdfea5da556-catalog-content\") pod \"redhat-operators-qbnkx\" (UID: \"e5a90ca4-afd3-41c2-b477-8bdfea5da556\") " pod="openshift-marketplace/redhat-operators-qbnkx" Oct 07 08:43:23 crc kubenswrapper[5025]: I1007 08:43:23.336069 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5a90ca4-afd3-41c2-b477-8bdfea5da556-utilities\") pod \"redhat-operators-qbnkx\" (UID: \"e5a90ca4-afd3-41c2-b477-8bdfea5da556\") " pod="openshift-marketplace/redhat-operators-qbnkx" Oct 07 08:43:23 crc kubenswrapper[5025]: I1007 08:43:23.355646 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8bgg\" (UniqueName: \"kubernetes.io/projected/e5a90ca4-afd3-41c2-b477-8bdfea5da556-kube-api-access-m8bgg\") pod \"redhat-operators-qbnkx\" (UID: \"e5a90ca4-afd3-41c2-b477-8bdfea5da556\") " pod="openshift-marketplace/redhat-operators-qbnkx" Oct 07 08:43:23 crc kubenswrapper[5025]: I1007 08:43:23.395378 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qbnkx" Oct 07 08:43:23 crc kubenswrapper[5025]: I1007 08:43:23.868139 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qbnkx"] Oct 07 08:43:24 crc kubenswrapper[5025]: I1007 08:43:24.702739 5025 generic.go:334] "Generic (PLEG): container finished" podID="e5a90ca4-afd3-41c2-b477-8bdfea5da556" containerID="73c374ad3f7e9da0e7798b6419a90a80e44e91464f19f8baaf148773d109b0eb" exitCode=0 Oct 07 08:43:24 crc kubenswrapper[5025]: I1007 08:43:24.702907 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qbnkx" event={"ID":"e5a90ca4-afd3-41c2-b477-8bdfea5da556","Type":"ContainerDied","Data":"73c374ad3f7e9da0e7798b6419a90a80e44e91464f19f8baaf148773d109b0eb"} Oct 07 08:43:24 crc kubenswrapper[5025]: I1007 08:43:24.703066 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qbnkx" event={"ID":"e5a90ca4-afd3-41c2-b477-8bdfea5da556","Type":"ContainerStarted","Data":"3092868a45a612716935a8de4d7ab3727dd90387b699693088be1dc2288149ca"} Oct 07 08:43:25 crc kubenswrapper[5025]: I1007 08:43:25.711392 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qbnkx" event={"ID":"e5a90ca4-afd3-41c2-b477-8bdfea5da556","Type":"ContainerStarted","Data":"11f5fa869c485eaf059c67c013c7c8b22312f1fc37a76a46f8de8aa0c0369582"} Oct 07 08:43:25 crc kubenswrapper[5025]: I1007 08:43:25.934419 5025 patch_prober.go:28] interesting pod/machine-config-daemon-2dj2t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 08:43:25 crc kubenswrapper[5025]: I1007 08:43:25.934483 5025 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 08:43:26 crc kubenswrapper[5025]: I1007 08:43:26.734320 5025 generic.go:334] "Generic (PLEG): container finished" podID="e5a90ca4-afd3-41c2-b477-8bdfea5da556" containerID="11f5fa869c485eaf059c67c013c7c8b22312f1fc37a76a46f8de8aa0c0369582" exitCode=0 Oct 07 08:43:26 crc kubenswrapper[5025]: I1007 08:43:26.734375 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qbnkx" event={"ID":"e5a90ca4-afd3-41c2-b477-8bdfea5da556","Type":"ContainerDied","Data":"11f5fa869c485eaf059c67c013c7c8b22312f1fc37a76a46f8de8aa0c0369582"} Oct 07 08:43:27 crc kubenswrapper[5025]: I1007 08:43:27.742957 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qbnkx" event={"ID":"e5a90ca4-afd3-41c2-b477-8bdfea5da556","Type":"ContainerStarted","Data":"0e8279c1dc6c34ed7e401ad65e860112f4a2a6515fd5347a4510b26eed85070e"} Oct 07 08:43:27 crc kubenswrapper[5025]: I1007 08:43:27.764833 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qbnkx" podStartSLOduration=2.314058181 podStartE2EDuration="4.764808328s" podCreationTimestamp="2025-10-07 08:43:23 +0000 UTC" firstStartedPulling="2025-10-07 08:43:24.70524154 +0000 UTC m=+1611.514555684" lastFinishedPulling="2025-10-07 08:43:27.155991687 +0000 UTC m=+1613.965305831" observedRunningTime="2025-10-07 08:43:27.758142758 +0000 UTC m=+1614.567456902" watchObservedRunningTime="2025-10-07 08:43:27.764808328 +0000 UTC m=+1614.574122472" Oct 07 08:43:33 crc kubenswrapper[5025]: I1007 08:43:33.396461 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qbnkx" Oct 07 
08:43:33 crc kubenswrapper[5025]: I1007 08:43:33.397734 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qbnkx"
Oct 07 08:43:33 crc kubenswrapper[5025]: I1007 08:43:33.470373 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qbnkx"
Oct 07 08:43:33 crc kubenswrapper[5025]: I1007 08:43:33.825194 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qbnkx"
Oct 07 08:43:33 crc kubenswrapper[5025]: I1007 08:43:33.871936 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qbnkx"]
Oct 07 08:43:35 crc kubenswrapper[5025]: I1007 08:43:35.793690 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qbnkx" podUID="e5a90ca4-afd3-41c2-b477-8bdfea5da556" containerName="registry-server" containerID="cri-o://0e8279c1dc6c34ed7e401ad65e860112f4a2a6515fd5347a4510b26eed85070e" gracePeriod=2
Oct 07 08:43:38 crc kubenswrapper[5025]: I1007 08:43:38.096631 5025 scope.go:117] "RemoveContainer" containerID="10b9f5700cc4e24590cffcbcde7cac0af1a3631605fcea78d34ab2f307f7a30d"
Oct 07 08:43:38 crc kubenswrapper[5025]: I1007 08:43:38.118062 5025 scope.go:117] "RemoveContainer" containerID="9e3b82bfc7c4a84f47445e6f6f09d41612896aabe03608fa33cd4c6d834bc03a"
Oct 07 08:43:38 crc kubenswrapper[5025]: I1007 08:43:38.136817 5025 scope.go:117] "RemoveContainer" containerID="314b1fef7c16ec7df031546fcac4dc5d738a574b43a61d84da7fecc9a9489dbb"
Oct 07 08:43:38 crc kubenswrapper[5025]: I1007 08:43:38.536448 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qbnkx"
Oct 07 08:43:38 crc kubenswrapper[5025]: I1007 08:43:38.643046 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5a90ca4-afd3-41c2-b477-8bdfea5da556-utilities\") pod \"e5a90ca4-afd3-41c2-b477-8bdfea5da556\" (UID: \"e5a90ca4-afd3-41c2-b477-8bdfea5da556\") "
Oct 07 08:43:38 crc kubenswrapper[5025]: I1007 08:43:38.643199 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8bgg\" (UniqueName: \"kubernetes.io/projected/e5a90ca4-afd3-41c2-b477-8bdfea5da556-kube-api-access-m8bgg\") pod \"e5a90ca4-afd3-41c2-b477-8bdfea5da556\" (UID: \"e5a90ca4-afd3-41c2-b477-8bdfea5da556\") "
Oct 07 08:43:38 crc kubenswrapper[5025]: I1007 08:43:38.643433 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5a90ca4-afd3-41c2-b477-8bdfea5da556-catalog-content\") pod \"e5a90ca4-afd3-41c2-b477-8bdfea5da556\" (UID: \"e5a90ca4-afd3-41c2-b477-8bdfea5da556\") "
Oct 07 08:43:38 crc kubenswrapper[5025]: I1007 08:43:38.644199 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5a90ca4-afd3-41c2-b477-8bdfea5da556-utilities" (OuterVolumeSpecName: "utilities") pod "e5a90ca4-afd3-41c2-b477-8bdfea5da556" (UID: "e5a90ca4-afd3-41c2-b477-8bdfea5da556"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 08:43:38 crc kubenswrapper[5025]: I1007 08:43:38.651390 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5a90ca4-afd3-41c2-b477-8bdfea5da556-kube-api-access-m8bgg" (OuterVolumeSpecName: "kube-api-access-m8bgg") pod "e5a90ca4-afd3-41c2-b477-8bdfea5da556" (UID: "e5a90ca4-afd3-41c2-b477-8bdfea5da556"). InnerVolumeSpecName "kube-api-access-m8bgg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 08:43:38 crc kubenswrapper[5025]: I1007 08:43:38.744874 5025 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5a90ca4-afd3-41c2-b477-8bdfea5da556-utilities\") on node \"crc\" DevicePath \"\""
Oct 07 08:43:38 crc kubenswrapper[5025]: I1007 08:43:38.744911 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8bgg\" (UniqueName: \"kubernetes.io/projected/e5a90ca4-afd3-41c2-b477-8bdfea5da556-kube-api-access-m8bgg\") on node \"crc\" DevicePath \"\""
Oct 07 08:43:38 crc kubenswrapper[5025]: I1007 08:43:38.752719 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5a90ca4-afd3-41c2-b477-8bdfea5da556-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e5a90ca4-afd3-41c2-b477-8bdfea5da556" (UID: "e5a90ca4-afd3-41c2-b477-8bdfea5da556"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 08:43:38 crc kubenswrapper[5025]: I1007 08:43:38.816954 5025 generic.go:334] "Generic (PLEG): container finished" podID="e5a90ca4-afd3-41c2-b477-8bdfea5da556" containerID="0e8279c1dc6c34ed7e401ad65e860112f4a2a6515fd5347a4510b26eed85070e" exitCode=0
Oct 07 08:43:38 crc kubenswrapper[5025]: I1007 08:43:38.817005 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qbnkx" event={"ID":"e5a90ca4-afd3-41c2-b477-8bdfea5da556","Type":"ContainerDied","Data":"0e8279c1dc6c34ed7e401ad65e860112f4a2a6515fd5347a4510b26eed85070e"}
Oct 07 08:43:38 crc kubenswrapper[5025]: I1007 08:43:38.817048 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qbnkx" event={"ID":"e5a90ca4-afd3-41c2-b477-8bdfea5da556","Type":"ContainerDied","Data":"3092868a45a612716935a8de4d7ab3727dd90387b699693088be1dc2288149ca"}
Oct 07 08:43:38 crc kubenswrapper[5025]: I1007 08:43:38.817046 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qbnkx"
Oct 07 08:43:38 crc kubenswrapper[5025]: I1007 08:43:38.817083 5025 scope.go:117] "RemoveContainer" containerID="0e8279c1dc6c34ed7e401ad65e860112f4a2a6515fd5347a4510b26eed85070e"
Oct 07 08:43:38 crc kubenswrapper[5025]: I1007 08:43:38.836811 5025 scope.go:117] "RemoveContainer" containerID="11f5fa869c485eaf059c67c013c7c8b22312f1fc37a76a46f8de8aa0c0369582"
Oct 07 08:43:38 crc kubenswrapper[5025]: I1007 08:43:38.845950 5025 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5a90ca4-afd3-41c2-b477-8bdfea5da556-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 07 08:43:38 crc kubenswrapper[5025]: I1007 08:43:38.847241 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qbnkx"]
Oct 07 08:43:38 crc kubenswrapper[5025]: I1007 08:43:38.853486 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qbnkx"]
Oct 07 08:43:38 crc kubenswrapper[5025]: I1007 08:43:38.860135 5025 scope.go:117] "RemoveContainer" containerID="73c374ad3f7e9da0e7798b6419a90a80e44e91464f19f8baaf148773d109b0eb"
Oct 07 08:43:38 crc kubenswrapper[5025]: I1007 08:43:38.884139 5025 scope.go:117] "RemoveContainer" containerID="0e8279c1dc6c34ed7e401ad65e860112f4a2a6515fd5347a4510b26eed85070e"
Oct 07 08:43:38 crc kubenswrapper[5025]: E1007 08:43:38.885064 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e8279c1dc6c34ed7e401ad65e860112f4a2a6515fd5347a4510b26eed85070e\": container with ID starting with 0e8279c1dc6c34ed7e401ad65e860112f4a2a6515fd5347a4510b26eed85070e not found: ID does not exist" containerID="0e8279c1dc6c34ed7e401ad65e860112f4a2a6515fd5347a4510b26eed85070e"
Oct 07 08:43:38 crc kubenswrapper[5025]: I1007 08:43:38.885114 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e8279c1dc6c34ed7e401ad65e860112f4a2a6515fd5347a4510b26eed85070e"} err="failed to get container status \"0e8279c1dc6c34ed7e401ad65e860112f4a2a6515fd5347a4510b26eed85070e\": rpc error: code = NotFound desc = could not find container \"0e8279c1dc6c34ed7e401ad65e860112f4a2a6515fd5347a4510b26eed85070e\": container with ID starting with 0e8279c1dc6c34ed7e401ad65e860112f4a2a6515fd5347a4510b26eed85070e not found: ID does not exist"
Oct 07 08:43:38 crc kubenswrapper[5025]: I1007 08:43:38.885146 5025 scope.go:117] "RemoveContainer" containerID="11f5fa869c485eaf059c67c013c7c8b22312f1fc37a76a46f8de8aa0c0369582"
Oct 07 08:43:38 crc kubenswrapper[5025]: E1007 08:43:38.885669 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11f5fa869c485eaf059c67c013c7c8b22312f1fc37a76a46f8de8aa0c0369582\": container with ID starting with 11f5fa869c485eaf059c67c013c7c8b22312f1fc37a76a46f8de8aa0c0369582 not found: ID does not exist" containerID="11f5fa869c485eaf059c67c013c7c8b22312f1fc37a76a46f8de8aa0c0369582"
Oct 07 08:43:38 crc kubenswrapper[5025]: I1007 08:43:38.885826 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11f5fa869c485eaf059c67c013c7c8b22312f1fc37a76a46f8de8aa0c0369582"} err="failed to get container status \"11f5fa869c485eaf059c67c013c7c8b22312f1fc37a76a46f8de8aa0c0369582\": rpc error: code = NotFound desc = could not find container \"11f5fa869c485eaf059c67c013c7c8b22312f1fc37a76a46f8de8aa0c0369582\": container with ID starting with 11f5fa869c485eaf059c67c013c7c8b22312f1fc37a76a46f8de8aa0c0369582 not found: ID does not exist"
Oct 07 08:43:38 crc kubenswrapper[5025]: I1007 08:43:38.885951 5025 scope.go:117] "RemoveContainer" containerID="73c374ad3f7e9da0e7798b6419a90a80e44e91464f19f8baaf148773d109b0eb"
Oct 07 08:43:38 crc kubenswrapper[5025]: E1007 08:43:38.886474 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73c374ad3f7e9da0e7798b6419a90a80e44e91464f19f8baaf148773d109b0eb\": container with ID starting with 73c374ad3f7e9da0e7798b6419a90a80e44e91464f19f8baaf148773d109b0eb not found: ID does not exist" containerID="73c374ad3f7e9da0e7798b6419a90a80e44e91464f19f8baaf148773d109b0eb"
Oct 07 08:43:38 crc kubenswrapper[5025]: I1007 08:43:38.886505 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73c374ad3f7e9da0e7798b6419a90a80e44e91464f19f8baaf148773d109b0eb"} err="failed to get container status \"73c374ad3f7e9da0e7798b6419a90a80e44e91464f19f8baaf148773d109b0eb\": rpc error: code = NotFound desc = could not find container \"73c374ad3f7e9da0e7798b6419a90a80e44e91464f19f8baaf148773d109b0eb\": container with ID starting with 73c374ad3f7e9da0e7798b6419a90a80e44e91464f19f8baaf148773d109b0eb not found: ID does not exist"
Oct 07 08:43:39 crc kubenswrapper[5025]: I1007 08:43:39.924107 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5a90ca4-afd3-41c2-b477-8bdfea5da556" path="/var/lib/kubelet/pods/e5a90ca4-afd3-41c2-b477-8bdfea5da556/volumes"
Oct 07 08:43:55 crc kubenswrapper[5025]: I1007 08:43:55.934569 5025 patch_prober.go:28] interesting pod/machine-config-daemon-2dj2t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 08:43:55 crc kubenswrapper[5025]: I1007 08:43:55.935161 5025 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 08:44:25 crc kubenswrapper[5025]: I1007 08:44:25.934368 5025 patch_prober.go:28] interesting pod/machine-config-daemon-2dj2t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 08:44:25 crc kubenswrapper[5025]: I1007 08:44:25.935823 5025 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 08:44:25 crc kubenswrapper[5025]: I1007 08:44:25.935950 5025 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t"
Oct 07 08:44:25 crc kubenswrapper[5025]: I1007 08:44:25.936629 5025 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"61f0741cda050d253e1d966c022f5b530b9f0b20bc1389a3192197981eacfef9"} pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 07 08:44:25 crc kubenswrapper[5025]: I1007 08:44:25.936685 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" containerID="cri-o://61f0741cda050d253e1d966c022f5b530b9f0b20bc1389a3192197981eacfef9" gracePeriod=600
Oct 07 08:44:26 crc kubenswrapper[5025]: E1007 08:44:26.068681 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2"
Oct 07 08:44:26 crc kubenswrapper[5025]: I1007 08:44:26.173129 5025 generic.go:334] "Generic (PLEG): container finished" podID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerID="61f0741cda050d253e1d966c022f5b530b9f0b20bc1389a3192197981eacfef9" exitCode=0
Oct 07 08:44:26 crc kubenswrapper[5025]: I1007 08:44:26.173183 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" event={"ID":"b4849c41-22e1-400e-8e11-096da49ef1b2","Type":"ContainerDied","Data":"61f0741cda050d253e1d966c022f5b530b9f0b20bc1389a3192197981eacfef9"}
Oct 07 08:44:26 crc kubenswrapper[5025]: I1007 08:44:26.173259 5025 scope.go:117] "RemoveContainer" containerID="13140e5c73fc6f443f1a7e1d646b5e6ed1f83f36c3c46081a22a0e60d0f1c23e"
Oct 07 08:44:26 crc kubenswrapper[5025]: I1007 08:44:26.173969 5025 scope.go:117] "RemoveContainer" containerID="61f0741cda050d253e1d966c022f5b530b9f0b20bc1389a3192197981eacfef9"
Oct 07 08:44:26 crc kubenswrapper[5025]: E1007 08:44:26.174386 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2"
Oct 07 08:44:36 crc kubenswrapper[5025]: I1007 08:44:36.914267 5025 scope.go:117] "RemoveContainer" containerID="61f0741cda050d253e1d966c022f5b530b9f0b20bc1389a3192197981eacfef9"
Oct 07 08:44:36 crc kubenswrapper[5025]: E1007 08:44:36.914989 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2"
Oct 07 08:44:38 crc kubenswrapper[5025]: I1007 08:44:38.218238 5025 scope.go:117] "RemoveContainer" containerID="3dea8fbfe81c9050731ad9beb8566abea1bfd95a8e42f4666c9d09f77e5b122b"
Oct 07 08:44:38 crc kubenswrapper[5025]: I1007 08:44:38.253731 5025 scope.go:117] "RemoveContainer" containerID="24e26a1c9452261b7222faac9bd3c8759890a7e885ed4f0a26471b46a6648ae1"
Oct 07 08:44:38 crc kubenswrapper[5025]: I1007 08:44:38.279496 5025 scope.go:117] "RemoveContainer" containerID="d89c8dd7b87c42dab76a0b14969e3fb390ccaaeddad4b9fda9391555044db71a"
Oct 07 08:44:38 crc kubenswrapper[5025]: I1007 08:44:38.314712 5025 scope.go:117] "RemoveContainer" containerID="399bca0216396656c867a3f5730b1d3a96e7ba8bd1d46dd81ca0fc9e3d104c68"
Oct 07 08:44:51 crc kubenswrapper[5025]: I1007 08:44:51.915405 5025 scope.go:117] "RemoveContainer" containerID="61f0741cda050d253e1d966c022f5b530b9f0b20bc1389a3192197981eacfef9"
Oct 07 08:44:51 crc kubenswrapper[5025]: E1007 08:44:51.916242 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2"
Oct 07 08:45:00 crc kubenswrapper[5025]: I1007 08:45:00.148815 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330445-mdrrk"]
Oct 07 08:45:00 crc kubenswrapper[5025]: E1007 08:45:00.149761 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5a90ca4-afd3-41c2-b477-8bdfea5da556" containerName="extract-content"
Oct 07 08:45:00 crc kubenswrapper[5025]: I1007 08:45:00.149778 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5a90ca4-afd3-41c2-b477-8bdfea5da556" containerName="extract-content"
Oct 07 08:45:00 crc kubenswrapper[5025]: E1007 08:45:00.149810 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5a90ca4-afd3-41c2-b477-8bdfea5da556" containerName="extract-utilities"
Oct 07 08:45:00 crc kubenswrapper[5025]: I1007 08:45:00.149818 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5a90ca4-afd3-41c2-b477-8bdfea5da556" containerName="extract-utilities"
Oct 07 08:45:00 crc kubenswrapper[5025]: E1007 08:45:00.149839 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5a90ca4-afd3-41c2-b477-8bdfea5da556" containerName="registry-server"
Oct 07 08:45:00 crc kubenswrapper[5025]: I1007 08:45:00.149846 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5a90ca4-afd3-41c2-b477-8bdfea5da556" containerName="registry-server"
Oct 07 08:45:00 crc kubenswrapper[5025]: I1007 08:45:00.149995 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5a90ca4-afd3-41c2-b477-8bdfea5da556" containerName="registry-server"
Oct 07 08:45:00 crc kubenswrapper[5025]: I1007 08:45:00.150571 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330445-mdrrk"
Oct 07 08:45:00 crc kubenswrapper[5025]: I1007 08:45:00.153906 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 07 08:45:00 crc kubenswrapper[5025]: I1007 08:45:00.154208 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 07 08:45:00 crc kubenswrapper[5025]: I1007 08:45:00.182024 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330445-mdrrk"]
Oct 07 08:45:00 crc kubenswrapper[5025]: I1007 08:45:00.289856 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cnph\" (UniqueName: \"kubernetes.io/projected/a1583866-07df-4b61-a0ce-4c1e8a22a9d2-kube-api-access-9cnph\") pod \"collect-profiles-29330445-mdrrk\" (UID: \"a1583866-07df-4b61-a0ce-4c1e8a22a9d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330445-mdrrk"
Oct 07 08:45:00 crc kubenswrapper[5025]: I1007 08:45:00.289936 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a1583866-07df-4b61-a0ce-4c1e8a22a9d2-config-volume\") pod \"collect-profiles-29330445-mdrrk\" (UID: \"a1583866-07df-4b61-a0ce-4c1e8a22a9d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330445-mdrrk"
Oct 07 08:45:00 crc kubenswrapper[5025]: I1007 08:45:00.290065 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a1583866-07df-4b61-a0ce-4c1e8a22a9d2-secret-volume\") pod \"collect-profiles-29330445-mdrrk\" (UID: \"a1583866-07df-4b61-a0ce-4c1e8a22a9d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330445-mdrrk"
Oct 07 08:45:00 crc kubenswrapper[5025]: I1007 08:45:00.391925 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cnph\" (UniqueName: \"kubernetes.io/projected/a1583866-07df-4b61-a0ce-4c1e8a22a9d2-kube-api-access-9cnph\") pod \"collect-profiles-29330445-mdrrk\" (UID: \"a1583866-07df-4b61-a0ce-4c1e8a22a9d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330445-mdrrk"
Oct 07 08:45:00 crc kubenswrapper[5025]: I1007 08:45:00.391984 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a1583866-07df-4b61-a0ce-4c1e8a22a9d2-config-volume\") pod \"collect-profiles-29330445-mdrrk\" (UID: \"a1583866-07df-4b61-a0ce-4c1e8a22a9d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330445-mdrrk"
Oct 07 08:45:00 crc kubenswrapper[5025]: I1007 08:45:00.392037 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a1583866-07df-4b61-a0ce-4c1e8a22a9d2-secret-volume\") pod \"collect-profiles-29330445-mdrrk\" (UID: \"a1583866-07df-4b61-a0ce-4c1e8a22a9d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330445-mdrrk"
Oct 07 08:45:00 crc kubenswrapper[5025]: I1007 08:45:00.392992 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a1583866-07df-4b61-a0ce-4c1e8a22a9d2-config-volume\") pod \"collect-profiles-29330445-mdrrk\" (UID: \"a1583866-07df-4b61-a0ce-4c1e8a22a9d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330445-mdrrk"
Oct 07 08:45:00 crc kubenswrapper[5025]: I1007 08:45:00.411569 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a1583866-07df-4b61-a0ce-4c1e8a22a9d2-secret-volume\") pod \"collect-profiles-29330445-mdrrk\" (UID: \"a1583866-07df-4b61-a0ce-4c1e8a22a9d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330445-mdrrk"
Oct 07 08:45:00 crc kubenswrapper[5025]: I1007 08:45:00.414660 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cnph\" (UniqueName: \"kubernetes.io/projected/a1583866-07df-4b61-a0ce-4c1e8a22a9d2-kube-api-access-9cnph\") pod \"collect-profiles-29330445-mdrrk\" (UID: \"a1583866-07df-4b61-a0ce-4c1e8a22a9d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330445-mdrrk"
Oct 07 08:45:00 crc kubenswrapper[5025]: I1007 08:45:00.471216 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330445-mdrrk"
Oct 07 08:45:00 crc kubenswrapper[5025]: I1007 08:45:00.873912 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330445-mdrrk"]
Oct 07 08:45:01 crc kubenswrapper[5025]: I1007 08:45:01.447489 5025 generic.go:334] "Generic (PLEG): container finished" podID="a1583866-07df-4b61-a0ce-4c1e8a22a9d2" containerID="83a9f01c523837701666b1ae300eb68228bb9f28f77925bcd47f369a31461bc0" exitCode=0
Oct 07 08:45:01 crc kubenswrapper[5025]: I1007 08:45:01.447566 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330445-mdrrk" event={"ID":"a1583866-07df-4b61-a0ce-4c1e8a22a9d2","Type":"ContainerDied","Data":"83a9f01c523837701666b1ae300eb68228bb9f28f77925bcd47f369a31461bc0"}
Oct 07 08:45:01 crc kubenswrapper[5025]: I1007 08:45:01.447874 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330445-mdrrk" event={"ID":"a1583866-07df-4b61-a0ce-4c1e8a22a9d2","Type":"ContainerStarted","Data":"af6f9a596ea899cc76cfb6c524368f89a19a521dce104c59b1dd4e23dfb73755"}
Oct 07 08:45:02 crc kubenswrapper[5025]: I1007 08:45:02.762307 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330445-mdrrk"
Oct 07 08:45:02 crc kubenswrapper[5025]: I1007 08:45:02.914406 5025 scope.go:117] "RemoveContainer" containerID="61f0741cda050d253e1d966c022f5b530b9f0b20bc1389a3192197981eacfef9"
Oct 07 08:45:02 crc kubenswrapper[5025]: E1007 08:45:02.914710 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2"
Oct 07 08:45:02 crc kubenswrapper[5025]: I1007 08:45:02.925906 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cnph\" (UniqueName: \"kubernetes.io/projected/a1583866-07df-4b61-a0ce-4c1e8a22a9d2-kube-api-access-9cnph\") pod \"a1583866-07df-4b61-a0ce-4c1e8a22a9d2\" (UID: \"a1583866-07df-4b61-a0ce-4c1e8a22a9d2\") "
Oct 07 08:45:02 crc kubenswrapper[5025]: I1007 08:45:02.925967 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a1583866-07df-4b61-a0ce-4c1e8a22a9d2-config-volume\") pod \"a1583866-07df-4b61-a0ce-4c1e8a22a9d2\" (UID: \"a1583866-07df-4b61-a0ce-4c1e8a22a9d2\") "
Oct 07 08:45:02 crc kubenswrapper[5025]: I1007 08:45:02.926027 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a1583866-07df-4b61-a0ce-4c1e8a22a9d2-secret-volume\") pod \"a1583866-07df-4b61-a0ce-4c1e8a22a9d2\" (UID: \"a1583866-07df-4b61-a0ce-4c1e8a22a9d2\") "
Oct 07 08:45:02 crc kubenswrapper[5025]: I1007 08:45:02.926805 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1583866-07df-4b61-a0ce-4c1e8a22a9d2-config-volume" (OuterVolumeSpecName: "config-volume") pod "a1583866-07df-4b61-a0ce-4c1e8a22a9d2" (UID: "a1583866-07df-4b61-a0ce-4c1e8a22a9d2"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 08:45:02 crc kubenswrapper[5025]: I1007 08:45:02.931438 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1583866-07df-4b61-a0ce-4c1e8a22a9d2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a1583866-07df-4b61-a0ce-4c1e8a22a9d2" (UID: "a1583866-07df-4b61-a0ce-4c1e8a22a9d2"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 08:45:02 crc kubenswrapper[5025]: I1007 08:45:02.932190 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1583866-07df-4b61-a0ce-4c1e8a22a9d2-kube-api-access-9cnph" (OuterVolumeSpecName: "kube-api-access-9cnph") pod "a1583866-07df-4b61-a0ce-4c1e8a22a9d2" (UID: "a1583866-07df-4b61-a0ce-4c1e8a22a9d2"). InnerVolumeSpecName "kube-api-access-9cnph". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 08:45:03 crc kubenswrapper[5025]: I1007 08:45:03.027921 5025 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a1583866-07df-4b61-a0ce-4c1e8a22a9d2-secret-volume\") on node \"crc\" DevicePath \"\""
Oct 07 08:45:03 crc kubenswrapper[5025]: I1007 08:45:03.027951 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9cnph\" (UniqueName: \"kubernetes.io/projected/a1583866-07df-4b61-a0ce-4c1e8a22a9d2-kube-api-access-9cnph\") on node \"crc\" DevicePath \"\""
Oct 07 08:45:03 crc kubenswrapper[5025]: I1007 08:45:03.027959 5025 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a1583866-07df-4b61-a0ce-4c1e8a22a9d2-config-volume\") on node \"crc\" DevicePath \"\""
Oct 07 08:45:03 crc kubenswrapper[5025]: I1007 08:45:03.463994 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330445-mdrrk" event={"ID":"a1583866-07df-4b61-a0ce-4c1e8a22a9d2","Type":"ContainerDied","Data":"af6f9a596ea899cc76cfb6c524368f89a19a521dce104c59b1dd4e23dfb73755"}
Oct 07 08:45:03 crc kubenswrapper[5025]: I1007 08:45:03.464035 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af6f9a596ea899cc76cfb6c524368f89a19a521dce104c59b1dd4e23dfb73755"
Oct 07 08:45:03 crc kubenswrapper[5025]: I1007 08:45:03.464075 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330445-mdrrk"
Oct 07 08:45:13 crc kubenswrapper[5025]: I1007 08:45:13.919694 5025 scope.go:117] "RemoveContainer" containerID="61f0741cda050d253e1d966c022f5b530b9f0b20bc1389a3192197981eacfef9"
Oct 07 08:45:13 crc kubenswrapper[5025]: E1007 08:45:13.920500 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2"
Oct 07 08:45:25 crc kubenswrapper[5025]: I1007 08:45:25.914995 5025 scope.go:117] "RemoveContainer" containerID="61f0741cda050d253e1d966c022f5b530b9f0b20bc1389a3192197981eacfef9"
Oct 07 08:45:25 crc kubenswrapper[5025]: E1007 08:45:25.916187 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2"
Oct 07 08:45:40 crc kubenswrapper[5025]: I1007 08:45:40.915717 5025 scope.go:117] "RemoveContainer" containerID="61f0741cda050d253e1d966c022f5b530b9f0b20bc1389a3192197981eacfef9"
Oct 07 08:45:40 crc kubenswrapper[5025]: E1007 08:45:40.916724 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2"
Oct 07 08:45:52 crc kubenswrapper[5025]: I1007 08:45:52.914374 5025 scope.go:117] "RemoveContainer" containerID="61f0741cda050d253e1d966c022f5b530b9f0b20bc1389a3192197981eacfef9"
Oct 07 08:45:52 crc kubenswrapper[5025]: E1007 08:45:52.915181 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2"
Oct 07 08:46:03 crc kubenswrapper[5025]: I1007 08:46:03.918473 5025 scope.go:117] "RemoveContainer" containerID="61f0741cda050d253e1d966c022f5b530b9f0b20bc1389a3192197981eacfef9"
Oct 07 08:46:03 crc kubenswrapper[5025]: E1007 08:46:03.919127 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2"
Oct 07 08:46:18 crc kubenswrapper[5025]: I1007 08:46:18.915628 5025 scope.go:117] "RemoveContainer" containerID="61f0741cda050d253e1d966c022f5b530b9f0b20bc1389a3192197981eacfef9"
Oct 07 08:46:18 crc kubenswrapper[5025]: E1007 08:46:18.916801 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2"
Oct 07 08:46:29 crc kubenswrapper[5025]: I1007 08:46:29.914188 5025 scope.go:117] "RemoveContainer" containerID="61f0741cda050d253e1d966c022f5b530b9f0b20bc1389a3192197981eacfef9"
Oct 07 08:46:29 crc kubenswrapper[5025]: E1007 08:46:29.914923 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2"
Oct 07 08:46:40 crc kubenswrapper[5025]: I1007 08:46:40.914790 5025 scope.go:117] "RemoveContainer" containerID="61f0741cda050d253e1d966c022f5b530b9f0b20bc1389a3192197981eacfef9"
Oct 07 08:46:40 crc kubenswrapper[5025]: E1007 08:46:40.915589 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2"
Oct 07 08:46:54 crc kubenswrapper[5025]: I1007 08:46:54.914758 5025 scope.go:117] "RemoveContainer" containerID="61f0741cda050d253e1d966c022f5b530b9f0b20bc1389a3192197981eacfef9"
Oct 07 08:46:54 crc kubenswrapper[5025]: E1007 08:46:54.915429 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2"
Oct 07 08:47:08 crc kubenswrapper[5025]: I1007 08:47:08.914785 5025 scope.go:117] "RemoveContainer" containerID="61f0741cda050d253e1d966c022f5b530b9f0b20bc1389a3192197981eacfef9"
Oct 07 08:47:08 crc kubenswrapper[5025]: E1007 08:47:08.915379 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2"
Oct 07 08:47:21 crc kubenswrapper[5025]: I1007 08:47:21.915574 5025 scope.go:117] "RemoveContainer" containerID="61f0741cda050d253e1d966c022f5b530b9f0b20bc1389a3192197981eacfef9"
Oct 07 08:47:21 crc kubenswrapper[5025]: E1007 08:47:21.916250 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2"
Oct 07 08:47:34 crc kubenswrapper[5025]: I1007 08:47:34.915833 5025 scope.go:117] "RemoveContainer" containerID="61f0741cda050d253e1d966c022f5b530b9f0b20bc1389a3192197981eacfef9"
Oct 07 08:47:34 crc kubenswrapper[5025]: E1007 08:47:34.917659 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 08:47:38 crc kubenswrapper[5025]: I1007 08:47:38.464026 5025 scope.go:117] "RemoveContainer" containerID="228821ec0681da14dae65968a3dadda92d5674bce838e0b0942a6dc0af85f343" Oct 07 08:47:38 crc kubenswrapper[5025]: I1007 08:47:38.489228 5025 scope.go:117] "RemoveContainer" containerID="bf5dc405b9eac171f0a8d9f5fa8eebd52d697bb48aeabca069d43ac12e465701" Oct 07 08:47:38 crc kubenswrapper[5025]: I1007 08:47:38.519595 5025 scope.go:117] "RemoveContainer" containerID="b3cebeadac08ee6f2f6c8d91201e86989b618610cf0af1e5aa7c7cf332b20124" Oct 07 08:47:49 crc kubenswrapper[5025]: I1007 08:47:49.915610 5025 scope.go:117] "RemoveContainer" containerID="61f0741cda050d253e1d966c022f5b530b9f0b20bc1389a3192197981eacfef9" Oct 07 08:47:49 crc kubenswrapper[5025]: E1007 08:47:49.917484 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 08:48:04 crc kubenswrapper[5025]: I1007 08:48:04.914494 5025 scope.go:117] "RemoveContainer" containerID="61f0741cda050d253e1d966c022f5b530b9f0b20bc1389a3192197981eacfef9" Oct 07 08:48:04 crc kubenswrapper[5025]: E1007 08:48:04.915499 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 08:48:18 crc kubenswrapper[5025]: I1007 08:48:18.915403 5025 scope.go:117] "RemoveContainer" containerID="61f0741cda050d253e1d966c022f5b530b9f0b20bc1389a3192197981eacfef9" Oct 07 08:48:18 crc kubenswrapper[5025]: E1007 08:48:18.916416 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 08:48:29 crc kubenswrapper[5025]: I1007 08:48:29.915656 5025 scope.go:117] "RemoveContainer" containerID="61f0741cda050d253e1d966c022f5b530b9f0b20bc1389a3192197981eacfef9" Oct 07 08:48:29 crc kubenswrapper[5025]: E1007 08:48:29.916662 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 08:48:40 crc kubenswrapper[5025]: I1007 08:48:40.915445 5025 scope.go:117] "RemoveContainer" containerID="61f0741cda050d253e1d966c022f5b530b9f0b20bc1389a3192197981eacfef9" Oct 07 08:48:40 crc kubenswrapper[5025]: E1007 08:48:40.916087 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 08:48:55 crc kubenswrapper[5025]: I1007 08:48:55.914602 5025 scope.go:117] "RemoveContainer" containerID="61f0741cda050d253e1d966c022f5b530b9f0b20bc1389a3192197981eacfef9" Oct 07 08:48:55 crc kubenswrapper[5025]: E1007 08:48:55.915523 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 08:49:08 crc kubenswrapper[5025]: I1007 08:49:08.914936 5025 scope.go:117] "RemoveContainer" containerID="61f0741cda050d253e1d966c022f5b530b9f0b20bc1389a3192197981eacfef9" Oct 07 08:49:08 crc kubenswrapper[5025]: E1007 08:49:08.916144 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 08:49:21 crc kubenswrapper[5025]: I1007 08:49:21.914951 5025 scope.go:117] "RemoveContainer" containerID="61f0741cda050d253e1d966c022f5b530b9f0b20bc1389a3192197981eacfef9" Oct 07 08:49:21 crc kubenswrapper[5025]: E1007 08:49:21.915935 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 08:49:34 crc kubenswrapper[5025]: I1007 08:49:34.914258 5025 scope.go:117] "RemoveContainer" containerID="61f0741cda050d253e1d966c022f5b530b9f0b20bc1389a3192197981eacfef9" Oct 07 08:49:35 crc kubenswrapper[5025]: I1007 08:49:35.549123 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" event={"ID":"b4849c41-22e1-400e-8e11-096da49ef1b2","Type":"ContainerStarted","Data":"013eb7ba27e6f88cb2743b9c7df2653632623e0741c302a71b20efaf02f2635f"} Oct 07 08:51:01 crc kubenswrapper[5025]: I1007 08:51:01.345926 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-s9d8s"] Oct 07 08:51:01 crc kubenswrapper[5025]: E1007 08:51:01.346808 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1583866-07df-4b61-a0ce-4c1e8a22a9d2" containerName="collect-profiles" Oct 07 08:51:01 crc kubenswrapper[5025]: I1007 08:51:01.346824 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1583866-07df-4b61-a0ce-4c1e8a22a9d2" containerName="collect-profiles" Oct 07 08:51:01 crc kubenswrapper[5025]: I1007 08:51:01.347006 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1583866-07df-4b61-a0ce-4c1e8a22a9d2" containerName="collect-profiles" Oct 07 08:51:01 crc kubenswrapper[5025]: I1007 08:51:01.348370 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s9d8s" Oct 07 08:51:01 crc kubenswrapper[5025]: I1007 08:51:01.358236 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s9d8s"] Oct 07 08:51:01 crc kubenswrapper[5025]: I1007 08:51:01.414442 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a946072-69be-4d55-bb87-97b79f234654-utilities\") pod \"certified-operators-s9d8s\" (UID: \"7a946072-69be-4d55-bb87-97b79f234654\") " pod="openshift-marketplace/certified-operators-s9d8s" Oct 07 08:51:01 crc kubenswrapper[5025]: I1007 08:51:01.414499 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a946072-69be-4d55-bb87-97b79f234654-catalog-content\") pod \"certified-operators-s9d8s\" (UID: \"7a946072-69be-4d55-bb87-97b79f234654\") " pod="openshift-marketplace/certified-operators-s9d8s" Oct 07 08:51:01 crc kubenswrapper[5025]: I1007 08:51:01.414533 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2527\" (UniqueName: \"kubernetes.io/projected/7a946072-69be-4d55-bb87-97b79f234654-kube-api-access-h2527\") pod \"certified-operators-s9d8s\" (UID: \"7a946072-69be-4d55-bb87-97b79f234654\") " pod="openshift-marketplace/certified-operators-s9d8s" Oct 07 08:51:01 crc kubenswrapper[5025]: I1007 08:51:01.516083 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a946072-69be-4d55-bb87-97b79f234654-utilities\") pod \"certified-operators-s9d8s\" (UID: \"7a946072-69be-4d55-bb87-97b79f234654\") " pod="openshift-marketplace/certified-operators-s9d8s" Oct 07 08:51:01 crc kubenswrapper[5025]: I1007 08:51:01.516152 5025 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a946072-69be-4d55-bb87-97b79f234654-catalog-content\") pod \"certified-operators-s9d8s\" (UID: \"7a946072-69be-4d55-bb87-97b79f234654\") " pod="openshift-marketplace/certified-operators-s9d8s" Oct 07 08:51:01 crc kubenswrapper[5025]: I1007 08:51:01.516192 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2527\" (UniqueName: \"kubernetes.io/projected/7a946072-69be-4d55-bb87-97b79f234654-kube-api-access-h2527\") pod \"certified-operators-s9d8s\" (UID: \"7a946072-69be-4d55-bb87-97b79f234654\") " pod="openshift-marketplace/certified-operators-s9d8s" Oct 07 08:51:01 crc kubenswrapper[5025]: I1007 08:51:01.516669 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a946072-69be-4d55-bb87-97b79f234654-utilities\") pod \"certified-operators-s9d8s\" (UID: \"7a946072-69be-4d55-bb87-97b79f234654\") " pod="openshift-marketplace/certified-operators-s9d8s" Oct 07 08:51:01 crc kubenswrapper[5025]: I1007 08:51:01.517522 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a946072-69be-4d55-bb87-97b79f234654-catalog-content\") pod \"certified-operators-s9d8s\" (UID: \"7a946072-69be-4d55-bb87-97b79f234654\") " pod="openshift-marketplace/certified-operators-s9d8s" Oct 07 08:51:01 crc kubenswrapper[5025]: I1007 08:51:01.537242 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2527\" (UniqueName: \"kubernetes.io/projected/7a946072-69be-4d55-bb87-97b79f234654-kube-api-access-h2527\") pod \"certified-operators-s9d8s\" (UID: \"7a946072-69be-4d55-bb87-97b79f234654\") " pod="openshift-marketplace/certified-operators-s9d8s" Oct 07 08:51:01 crc kubenswrapper[5025]: I1007 08:51:01.688484 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s9d8s" Oct 07 08:51:02 crc kubenswrapper[5025]: I1007 08:51:02.157792 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s9d8s"] Oct 07 08:51:02 crc kubenswrapper[5025]: I1007 08:51:02.219897 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s9d8s" event={"ID":"7a946072-69be-4d55-bb87-97b79f234654","Type":"ContainerStarted","Data":"ac33cb4e460cbbb5190bb5285cb0f9cca3e38368059f3d6235bd2003f6e8fe36"} Oct 07 08:51:03 crc kubenswrapper[5025]: I1007 08:51:03.227529 5025 generic.go:334] "Generic (PLEG): container finished" podID="7a946072-69be-4d55-bb87-97b79f234654" containerID="5dbdaa86e8eb049baabe41f36b22eddd14d69781cb5e03816fdfb24cf532466e" exitCode=0 Oct 07 08:51:03 crc kubenswrapper[5025]: I1007 08:51:03.227628 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s9d8s" event={"ID":"7a946072-69be-4d55-bb87-97b79f234654","Type":"ContainerDied","Data":"5dbdaa86e8eb049baabe41f36b22eddd14d69781cb5e03816fdfb24cf532466e"} Oct 07 08:51:03 crc kubenswrapper[5025]: I1007 08:51:03.229885 5025 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 08:51:04 crc kubenswrapper[5025]: I1007 08:51:04.237932 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s9d8s" event={"ID":"7a946072-69be-4d55-bb87-97b79f234654","Type":"ContainerStarted","Data":"a035ced7d2470a7160a6e1f5797e56676f1360493f785f274de2569d24d39e99"} Oct 07 08:51:05 crc kubenswrapper[5025]: I1007 08:51:05.253377 5025 generic.go:334] "Generic (PLEG): container finished" podID="7a946072-69be-4d55-bb87-97b79f234654" containerID="a035ced7d2470a7160a6e1f5797e56676f1360493f785f274de2569d24d39e99" exitCode=0 Oct 07 08:51:05 crc kubenswrapper[5025]: I1007 08:51:05.253479 5025 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-s9d8s" event={"ID":"7a946072-69be-4d55-bb87-97b79f234654","Type":"ContainerDied","Data":"a035ced7d2470a7160a6e1f5797e56676f1360493f785f274de2569d24d39e99"} Oct 07 08:51:06 crc kubenswrapper[5025]: I1007 08:51:06.263977 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s9d8s" event={"ID":"7a946072-69be-4d55-bb87-97b79f234654","Type":"ContainerStarted","Data":"073659a534236d95c434dadf33b21b495e267855cbf0522e227c7113f1bc82f1"} Oct 07 08:51:06 crc kubenswrapper[5025]: I1007 08:51:06.283314 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-s9d8s" podStartSLOduration=2.516100376 podStartE2EDuration="5.283293387s" podCreationTimestamp="2025-10-07 08:51:01 +0000 UTC" firstStartedPulling="2025-10-07 08:51:03.229624001 +0000 UTC m=+2070.038938145" lastFinishedPulling="2025-10-07 08:51:05.996817012 +0000 UTC m=+2072.806131156" observedRunningTime="2025-10-07 08:51:06.280154797 +0000 UTC m=+2073.089468961" watchObservedRunningTime="2025-10-07 08:51:06.283293387 +0000 UTC m=+2073.092607531" Oct 07 08:51:11 crc kubenswrapper[5025]: I1007 08:51:11.688979 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-s9d8s" Oct 07 08:51:11 crc kubenswrapper[5025]: I1007 08:51:11.689398 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-s9d8s" Oct 07 08:51:11 crc kubenswrapper[5025]: I1007 08:51:11.738698 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-s9d8s" Oct 07 08:51:12 crc kubenswrapper[5025]: I1007 08:51:12.363364 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-s9d8s" Oct 07 08:51:12 crc kubenswrapper[5025]: I1007 
08:51:12.408304 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s9d8s"] Oct 07 08:51:14 crc kubenswrapper[5025]: I1007 08:51:14.317480 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-s9d8s" podUID="7a946072-69be-4d55-bb87-97b79f234654" containerName="registry-server" containerID="cri-o://073659a534236d95c434dadf33b21b495e267855cbf0522e227c7113f1bc82f1" gracePeriod=2 Oct 07 08:51:14 crc kubenswrapper[5025]: I1007 08:51:14.725819 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s9d8s" Oct 07 08:51:14 crc kubenswrapper[5025]: I1007 08:51:14.814448 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a946072-69be-4d55-bb87-97b79f234654-utilities\") pod \"7a946072-69be-4d55-bb87-97b79f234654\" (UID: \"7a946072-69be-4d55-bb87-97b79f234654\") " Oct 07 08:51:14 crc kubenswrapper[5025]: I1007 08:51:14.814508 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2527\" (UniqueName: \"kubernetes.io/projected/7a946072-69be-4d55-bb87-97b79f234654-kube-api-access-h2527\") pod \"7a946072-69be-4d55-bb87-97b79f234654\" (UID: \"7a946072-69be-4d55-bb87-97b79f234654\") " Oct 07 08:51:14 crc kubenswrapper[5025]: I1007 08:51:14.814646 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a946072-69be-4d55-bb87-97b79f234654-catalog-content\") pod \"7a946072-69be-4d55-bb87-97b79f234654\" (UID: \"7a946072-69be-4d55-bb87-97b79f234654\") " Oct 07 08:51:14 crc kubenswrapper[5025]: I1007 08:51:14.815399 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a946072-69be-4d55-bb87-97b79f234654-utilities" (OuterVolumeSpecName: 
"utilities") pod "7a946072-69be-4d55-bb87-97b79f234654" (UID: "7a946072-69be-4d55-bb87-97b79f234654"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:51:14 crc kubenswrapper[5025]: I1007 08:51:14.822852 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a946072-69be-4d55-bb87-97b79f234654-kube-api-access-h2527" (OuterVolumeSpecName: "kube-api-access-h2527") pod "7a946072-69be-4d55-bb87-97b79f234654" (UID: "7a946072-69be-4d55-bb87-97b79f234654"). InnerVolumeSpecName "kube-api-access-h2527". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:51:14 crc kubenswrapper[5025]: I1007 08:51:14.916321 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2527\" (UniqueName: \"kubernetes.io/projected/7a946072-69be-4d55-bb87-97b79f234654-kube-api-access-h2527\") on node \"crc\" DevicePath \"\"" Oct 07 08:51:14 crc kubenswrapper[5025]: I1007 08:51:14.916365 5025 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a946072-69be-4d55-bb87-97b79f234654-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 08:51:15 crc kubenswrapper[5025]: I1007 08:51:15.327159 5025 generic.go:334] "Generic (PLEG): container finished" podID="7a946072-69be-4d55-bb87-97b79f234654" containerID="073659a534236d95c434dadf33b21b495e267855cbf0522e227c7113f1bc82f1" exitCode=0 Oct 07 08:51:15 crc kubenswrapper[5025]: I1007 08:51:15.327241 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s9d8s" event={"ID":"7a946072-69be-4d55-bb87-97b79f234654","Type":"ContainerDied","Data":"073659a534236d95c434dadf33b21b495e267855cbf0522e227c7113f1bc82f1"} Oct 07 08:51:15 crc kubenswrapper[5025]: I1007 08:51:15.327255 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s9d8s" Oct 07 08:51:15 crc kubenswrapper[5025]: I1007 08:51:15.327295 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s9d8s" event={"ID":"7a946072-69be-4d55-bb87-97b79f234654","Type":"ContainerDied","Data":"ac33cb4e460cbbb5190bb5285cb0f9cca3e38368059f3d6235bd2003f6e8fe36"} Oct 07 08:51:15 crc kubenswrapper[5025]: I1007 08:51:15.327322 5025 scope.go:117] "RemoveContainer" containerID="073659a534236d95c434dadf33b21b495e267855cbf0522e227c7113f1bc82f1" Oct 07 08:51:15 crc kubenswrapper[5025]: I1007 08:51:15.351254 5025 scope.go:117] "RemoveContainer" containerID="a035ced7d2470a7160a6e1f5797e56676f1360493f785f274de2569d24d39e99" Oct 07 08:51:15 crc kubenswrapper[5025]: I1007 08:51:15.375915 5025 scope.go:117] "RemoveContainer" containerID="5dbdaa86e8eb049baabe41f36b22eddd14d69781cb5e03816fdfb24cf532466e" Oct 07 08:51:15 crc kubenswrapper[5025]: I1007 08:51:15.400978 5025 scope.go:117] "RemoveContainer" containerID="073659a534236d95c434dadf33b21b495e267855cbf0522e227c7113f1bc82f1" Oct 07 08:51:15 crc kubenswrapper[5025]: E1007 08:51:15.401762 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"073659a534236d95c434dadf33b21b495e267855cbf0522e227c7113f1bc82f1\": container with ID starting with 073659a534236d95c434dadf33b21b495e267855cbf0522e227c7113f1bc82f1 not found: ID does not exist" containerID="073659a534236d95c434dadf33b21b495e267855cbf0522e227c7113f1bc82f1" Oct 07 08:51:15 crc kubenswrapper[5025]: I1007 08:51:15.401828 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"073659a534236d95c434dadf33b21b495e267855cbf0522e227c7113f1bc82f1"} err="failed to get container status \"073659a534236d95c434dadf33b21b495e267855cbf0522e227c7113f1bc82f1\": rpc error: code = NotFound desc = could not find container 
\"073659a534236d95c434dadf33b21b495e267855cbf0522e227c7113f1bc82f1\": container with ID starting with 073659a534236d95c434dadf33b21b495e267855cbf0522e227c7113f1bc82f1 not found: ID does not exist" Oct 07 08:51:15 crc kubenswrapper[5025]: I1007 08:51:15.401871 5025 scope.go:117] "RemoveContainer" containerID="a035ced7d2470a7160a6e1f5797e56676f1360493f785f274de2569d24d39e99" Oct 07 08:51:15 crc kubenswrapper[5025]: E1007 08:51:15.402351 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a035ced7d2470a7160a6e1f5797e56676f1360493f785f274de2569d24d39e99\": container with ID starting with a035ced7d2470a7160a6e1f5797e56676f1360493f785f274de2569d24d39e99 not found: ID does not exist" containerID="a035ced7d2470a7160a6e1f5797e56676f1360493f785f274de2569d24d39e99" Oct 07 08:51:15 crc kubenswrapper[5025]: I1007 08:51:15.402384 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a035ced7d2470a7160a6e1f5797e56676f1360493f785f274de2569d24d39e99"} err="failed to get container status \"a035ced7d2470a7160a6e1f5797e56676f1360493f785f274de2569d24d39e99\": rpc error: code = NotFound desc = could not find container \"a035ced7d2470a7160a6e1f5797e56676f1360493f785f274de2569d24d39e99\": container with ID starting with a035ced7d2470a7160a6e1f5797e56676f1360493f785f274de2569d24d39e99 not found: ID does not exist" Oct 07 08:51:15 crc kubenswrapper[5025]: I1007 08:51:15.402403 5025 scope.go:117] "RemoveContainer" containerID="5dbdaa86e8eb049baabe41f36b22eddd14d69781cb5e03816fdfb24cf532466e" Oct 07 08:51:15 crc kubenswrapper[5025]: E1007 08:51:15.402749 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5dbdaa86e8eb049baabe41f36b22eddd14d69781cb5e03816fdfb24cf532466e\": container with ID starting with 5dbdaa86e8eb049baabe41f36b22eddd14d69781cb5e03816fdfb24cf532466e not found: ID does not exist" 
containerID="5dbdaa86e8eb049baabe41f36b22eddd14d69781cb5e03816fdfb24cf532466e" Oct 07 08:51:15 crc kubenswrapper[5025]: I1007 08:51:15.402782 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dbdaa86e8eb049baabe41f36b22eddd14d69781cb5e03816fdfb24cf532466e"} err="failed to get container status \"5dbdaa86e8eb049baabe41f36b22eddd14d69781cb5e03816fdfb24cf532466e\": rpc error: code = NotFound desc = could not find container \"5dbdaa86e8eb049baabe41f36b22eddd14d69781cb5e03816fdfb24cf532466e\": container with ID starting with 5dbdaa86e8eb049baabe41f36b22eddd14d69781cb5e03816fdfb24cf532466e not found: ID does not exist" Oct 07 08:51:15 crc kubenswrapper[5025]: I1007 08:51:15.502804 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a946072-69be-4d55-bb87-97b79f234654-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7a946072-69be-4d55-bb87-97b79f234654" (UID: "7a946072-69be-4d55-bb87-97b79f234654"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:51:15 crc kubenswrapper[5025]: I1007 08:51:15.527736 5025 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a946072-69be-4d55-bb87-97b79f234654-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 08:51:15 crc kubenswrapper[5025]: I1007 08:51:15.665482 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s9d8s"] Oct 07 08:51:15 crc kubenswrapper[5025]: I1007 08:51:15.671071 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-s9d8s"] Oct 07 08:51:15 crc kubenswrapper[5025]: I1007 08:51:15.926981 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a946072-69be-4d55-bb87-97b79f234654" path="/var/lib/kubelet/pods/7a946072-69be-4d55-bb87-97b79f234654/volumes" Oct 07 08:51:55 crc kubenswrapper[5025]: I1007 08:51:55.935120 5025 patch_prober.go:28] interesting pod/machine-config-daemon-2dj2t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 08:51:55 crc kubenswrapper[5025]: I1007 08:51:55.935785 5025 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 08:51:57 crc kubenswrapper[5025]: I1007 08:51:57.580767 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-l5jcs"] Oct 07 08:51:57 crc kubenswrapper[5025]: E1007 08:51:57.581115 5025 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7a946072-69be-4d55-bb87-97b79f234654" containerName="extract-content" Oct 07 08:51:57 crc kubenswrapper[5025]: I1007 08:51:57.581130 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a946072-69be-4d55-bb87-97b79f234654" containerName="extract-content" Oct 07 08:51:57 crc kubenswrapper[5025]: E1007 08:51:57.581144 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a946072-69be-4d55-bb87-97b79f234654" containerName="extract-utilities" Oct 07 08:51:57 crc kubenswrapper[5025]: I1007 08:51:57.581151 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a946072-69be-4d55-bb87-97b79f234654" containerName="extract-utilities" Oct 07 08:51:57 crc kubenswrapper[5025]: E1007 08:51:57.581182 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a946072-69be-4d55-bb87-97b79f234654" containerName="registry-server" Oct 07 08:51:57 crc kubenswrapper[5025]: I1007 08:51:57.581188 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a946072-69be-4d55-bb87-97b79f234654" containerName="registry-server" Oct 07 08:51:57 crc kubenswrapper[5025]: I1007 08:51:57.581349 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a946072-69be-4d55-bb87-97b79f234654" containerName="registry-server" Oct 07 08:51:57 crc kubenswrapper[5025]: I1007 08:51:57.582526 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l5jcs" Oct 07 08:51:57 crc kubenswrapper[5025]: I1007 08:51:57.605359 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l5jcs"] Oct 07 08:51:57 crc kubenswrapper[5025]: I1007 08:51:57.645370 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8bb29f5-c5f9-4068-95a3-2723e439c8b6-catalog-content\") pod \"community-operators-l5jcs\" (UID: \"a8bb29f5-c5f9-4068-95a3-2723e439c8b6\") " pod="openshift-marketplace/community-operators-l5jcs" Oct 07 08:51:57 crc kubenswrapper[5025]: I1007 08:51:57.645456 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm8kj\" (UniqueName: \"kubernetes.io/projected/a8bb29f5-c5f9-4068-95a3-2723e439c8b6-kube-api-access-xm8kj\") pod \"community-operators-l5jcs\" (UID: \"a8bb29f5-c5f9-4068-95a3-2723e439c8b6\") " pod="openshift-marketplace/community-operators-l5jcs" Oct 07 08:51:57 crc kubenswrapper[5025]: I1007 08:51:57.645720 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8bb29f5-c5f9-4068-95a3-2723e439c8b6-utilities\") pod \"community-operators-l5jcs\" (UID: \"a8bb29f5-c5f9-4068-95a3-2723e439c8b6\") " pod="openshift-marketplace/community-operators-l5jcs" Oct 07 08:51:57 crc kubenswrapper[5025]: I1007 08:51:57.747213 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8bb29f5-c5f9-4068-95a3-2723e439c8b6-utilities\") pod \"community-operators-l5jcs\" (UID: \"a8bb29f5-c5f9-4068-95a3-2723e439c8b6\") " pod="openshift-marketplace/community-operators-l5jcs" Oct 07 08:51:57 crc kubenswrapper[5025]: I1007 08:51:57.747301 5025 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8bb29f5-c5f9-4068-95a3-2723e439c8b6-catalog-content\") pod \"community-operators-l5jcs\" (UID: \"a8bb29f5-c5f9-4068-95a3-2723e439c8b6\") " pod="openshift-marketplace/community-operators-l5jcs" Oct 07 08:51:57 crc kubenswrapper[5025]: I1007 08:51:57.747354 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xm8kj\" (UniqueName: \"kubernetes.io/projected/a8bb29f5-c5f9-4068-95a3-2723e439c8b6-kube-api-access-xm8kj\") pod \"community-operators-l5jcs\" (UID: \"a8bb29f5-c5f9-4068-95a3-2723e439c8b6\") " pod="openshift-marketplace/community-operators-l5jcs" Oct 07 08:51:57 crc kubenswrapper[5025]: I1007 08:51:57.747764 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8bb29f5-c5f9-4068-95a3-2723e439c8b6-utilities\") pod \"community-operators-l5jcs\" (UID: \"a8bb29f5-c5f9-4068-95a3-2723e439c8b6\") " pod="openshift-marketplace/community-operators-l5jcs" Oct 07 08:51:57 crc kubenswrapper[5025]: I1007 08:51:57.747788 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8bb29f5-c5f9-4068-95a3-2723e439c8b6-catalog-content\") pod \"community-operators-l5jcs\" (UID: \"a8bb29f5-c5f9-4068-95a3-2723e439c8b6\") " pod="openshift-marketplace/community-operators-l5jcs" Oct 07 08:51:57 crc kubenswrapper[5025]: I1007 08:51:57.778627 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm8kj\" (UniqueName: \"kubernetes.io/projected/a8bb29f5-c5f9-4068-95a3-2723e439c8b6-kube-api-access-xm8kj\") pod \"community-operators-l5jcs\" (UID: \"a8bb29f5-c5f9-4068-95a3-2723e439c8b6\") " pod="openshift-marketplace/community-operators-l5jcs" Oct 07 08:51:57 crc kubenswrapper[5025]: I1007 08:51:57.911379 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l5jcs" Oct 07 08:51:58 crc kubenswrapper[5025]: I1007 08:51:58.211825 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l5jcs"] Oct 07 08:51:58 crc kubenswrapper[5025]: I1007 08:51:58.694751 5025 generic.go:334] "Generic (PLEG): container finished" podID="a8bb29f5-c5f9-4068-95a3-2723e439c8b6" containerID="6f75c388cda9a484e5da9d8b6eb548d4dc113f5d2cdda2a49a1001d918d567ed" exitCode=0 Oct 07 08:51:58 crc kubenswrapper[5025]: I1007 08:51:58.694797 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l5jcs" event={"ID":"a8bb29f5-c5f9-4068-95a3-2723e439c8b6","Type":"ContainerDied","Data":"6f75c388cda9a484e5da9d8b6eb548d4dc113f5d2cdda2a49a1001d918d567ed"} Oct 07 08:51:58 crc kubenswrapper[5025]: I1007 08:51:58.694826 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l5jcs" event={"ID":"a8bb29f5-c5f9-4068-95a3-2723e439c8b6","Type":"ContainerStarted","Data":"ff00c648f191d6a537113ca83f7e8c9610bd15ba228d250e42b05c42f37415bf"} Oct 07 08:52:00 crc kubenswrapper[5025]: I1007 08:52:00.711860 5025 generic.go:334] "Generic (PLEG): container finished" podID="a8bb29f5-c5f9-4068-95a3-2723e439c8b6" containerID="65396420d153499dd940bd648ba6c9cb290faaba460d878711fbd8cbab4c762d" exitCode=0 Oct 07 08:52:00 crc kubenswrapper[5025]: I1007 08:52:00.712269 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l5jcs" event={"ID":"a8bb29f5-c5f9-4068-95a3-2723e439c8b6","Type":"ContainerDied","Data":"65396420d153499dd940bd648ba6c9cb290faaba460d878711fbd8cbab4c762d"} Oct 07 08:52:01 crc kubenswrapper[5025]: I1007 08:52:01.720866 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l5jcs" 
event={"ID":"a8bb29f5-c5f9-4068-95a3-2723e439c8b6","Type":"ContainerStarted","Data":"4420e79d3a334049a443f4f6a5c1a4f01d77e6577c84291515644e9ae53c6cb2"} Oct 07 08:52:01 crc kubenswrapper[5025]: I1007 08:52:01.743496 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-l5jcs" podStartSLOduration=2.212661686 podStartE2EDuration="4.74347743s" podCreationTimestamp="2025-10-07 08:51:57 +0000 UTC" firstStartedPulling="2025-10-07 08:51:58.697150937 +0000 UTC m=+2125.506465081" lastFinishedPulling="2025-10-07 08:52:01.227966681 +0000 UTC m=+2128.037280825" observedRunningTime="2025-10-07 08:52:01.741017822 +0000 UTC m=+2128.550331976" watchObservedRunningTime="2025-10-07 08:52:01.74347743 +0000 UTC m=+2128.552791574" Oct 07 08:52:07 crc kubenswrapper[5025]: I1007 08:52:07.911661 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-l5jcs" Oct 07 08:52:07 crc kubenswrapper[5025]: I1007 08:52:07.912252 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-l5jcs" Oct 07 08:52:07 crc kubenswrapper[5025]: I1007 08:52:07.958618 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-l5jcs" Oct 07 08:52:08 crc kubenswrapper[5025]: I1007 08:52:08.833900 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-l5jcs" Oct 07 08:52:08 crc kubenswrapper[5025]: I1007 08:52:08.880735 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l5jcs"] Oct 07 08:52:10 crc kubenswrapper[5025]: I1007 08:52:10.788331 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-l5jcs" podUID="a8bb29f5-c5f9-4068-95a3-2723e439c8b6" containerName="registry-server" 
containerID="cri-o://4420e79d3a334049a443f4f6a5c1a4f01d77e6577c84291515644e9ae53c6cb2" gracePeriod=2 Oct 07 08:52:11 crc kubenswrapper[5025]: I1007 08:52:11.715407 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l5jcs" Oct 07 08:52:11 crc kubenswrapper[5025]: I1007 08:52:11.800212 5025 generic.go:334] "Generic (PLEG): container finished" podID="a8bb29f5-c5f9-4068-95a3-2723e439c8b6" containerID="4420e79d3a334049a443f4f6a5c1a4f01d77e6577c84291515644e9ae53c6cb2" exitCode=0 Oct 07 08:52:11 crc kubenswrapper[5025]: I1007 08:52:11.800259 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l5jcs" event={"ID":"a8bb29f5-c5f9-4068-95a3-2723e439c8b6","Type":"ContainerDied","Data":"4420e79d3a334049a443f4f6a5c1a4f01d77e6577c84291515644e9ae53c6cb2"} Oct 07 08:52:11 crc kubenswrapper[5025]: I1007 08:52:11.800265 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l5jcs" Oct 07 08:52:11 crc kubenswrapper[5025]: I1007 08:52:11.800289 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l5jcs" event={"ID":"a8bb29f5-c5f9-4068-95a3-2723e439c8b6","Type":"ContainerDied","Data":"ff00c648f191d6a537113ca83f7e8c9610bd15ba228d250e42b05c42f37415bf"} Oct 07 08:52:11 crc kubenswrapper[5025]: I1007 08:52:11.800312 5025 scope.go:117] "RemoveContainer" containerID="4420e79d3a334049a443f4f6a5c1a4f01d77e6577c84291515644e9ae53c6cb2" Oct 07 08:52:11 crc kubenswrapper[5025]: I1007 08:52:11.828038 5025 scope.go:117] "RemoveContainer" containerID="65396420d153499dd940bd648ba6c9cb290faaba460d878711fbd8cbab4c762d" Oct 07 08:52:11 crc kubenswrapper[5025]: I1007 08:52:11.854207 5025 scope.go:117] "RemoveContainer" containerID="6f75c388cda9a484e5da9d8b6eb548d4dc113f5d2cdda2a49a1001d918d567ed" Oct 07 08:52:11 crc kubenswrapper[5025]: I1007 08:52:11.857461 
5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8bb29f5-c5f9-4068-95a3-2723e439c8b6-utilities\") pod \"a8bb29f5-c5f9-4068-95a3-2723e439c8b6\" (UID: \"a8bb29f5-c5f9-4068-95a3-2723e439c8b6\") " Oct 07 08:52:11 crc kubenswrapper[5025]: I1007 08:52:11.857624 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xm8kj\" (UniqueName: \"kubernetes.io/projected/a8bb29f5-c5f9-4068-95a3-2723e439c8b6-kube-api-access-xm8kj\") pod \"a8bb29f5-c5f9-4068-95a3-2723e439c8b6\" (UID: \"a8bb29f5-c5f9-4068-95a3-2723e439c8b6\") " Oct 07 08:52:11 crc kubenswrapper[5025]: I1007 08:52:11.857703 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8bb29f5-c5f9-4068-95a3-2723e439c8b6-catalog-content\") pod \"a8bb29f5-c5f9-4068-95a3-2723e439c8b6\" (UID: \"a8bb29f5-c5f9-4068-95a3-2723e439c8b6\") " Oct 07 08:52:11 crc kubenswrapper[5025]: I1007 08:52:11.858528 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8bb29f5-c5f9-4068-95a3-2723e439c8b6-utilities" (OuterVolumeSpecName: "utilities") pod "a8bb29f5-c5f9-4068-95a3-2723e439c8b6" (UID: "a8bb29f5-c5f9-4068-95a3-2723e439c8b6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:52:11 crc kubenswrapper[5025]: I1007 08:52:11.864845 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8bb29f5-c5f9-4068-95a3-2723e439c8b6-kube-api-access-xm8kj" (OuterVolumeSpecName: "kube-api-access-xm8kj") pod "a8bb29f5-c5f9-4068-95a3-2723e439c8b6" (UID: "a8bb29f5-c5f9-4068-95a3-2723e439c8b6"). InnerVolumeSpecName "kube-api-access-xm8kj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:52:11 crc kubenswrapper[5025]: I1007 08:52:11.877198 5025 scope.go:117] "RemoveContainer" containerID="4420e79d3a334049a443f4f6a5c1a4f01d77e6577c84291515644e9ae53c6cb2" Oct 07 08:52:11 crc kubenswrapper[5025]: E1007 08:52:11.878200 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4420e79d3a334049a443f4f6a5c1a4f01d77e6577c84291515644e9ae53c6cb2\": container with ID starting with 4420e79d3a334049a443f4f6a5c1a4f01d77e6577c84291515644e9ae53c6cb2 not found: ID does not exist" containerID="4420e79d3a334049a443f4f6a5c1a4f01d77e6577c84291515644e9ae53c6cb2" Oct 07 08:52:11 crc kubenswrapper[5025]: I1007 08:52:11.878251 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4420e79d3a334049a443f4f6a5c1a4f01d77e6577c84291515644e9ae53c6cb2"} err="failed to get container status \"4420e79d3a334049a443f4f6a5c1a4f01d77e6577c84291515644e9ae53c6cb2\": rpc error: code = NotFound desc = could not find container \"4420e79d3a334049a443f4f6a5c1a4f01d77e6577c84291515644e9ae53c6cb2\": container with ID starting with 4420e79d3a334049a443f4f6a5c1a4f01d77e6577c84291515644e9ae53c6cb2 not found: ID does not exist" Oct 07 08:52:11 crc kubenswrapper[5025]: I1007 08:52:11.878280 5025 scope.go:117] "RemoveContainer" containerID="65396420d153499dd940bd648ba6c9cb290faaba460d878711fbd8cbab4c762d" Oct 07 08:52:11 crc kubenswrapper[5025]: E1007 08:52:11.879117 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65396420d153499dd940bd648ba6c9cb290faaba460d878711fbd8cbab4c762d\": container with ID starting with 65396420d153499dd940bd648ba6c9cb290faaba460d878711fbd8cbab4c762d not found: ID does not exist" containerID="65396420d153499dd940bd648ba6c9cb290faaba460d878711fbd8cbab4c762d" Oct 07 08:52:11 crc kubenswrapper[5025]: I1007 08:52:11.879169 
5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65396420d153499dd940bd648ba6c9cb290faaba460d878711fbd8cbab4c762d"} err="failed to get container status \"65396420d153499dd940bd648ba6c9cb290faaba460d878711fbd8cbab4c762d\": rpc error: code = NotFound desc = could not find container \"65396420d153499dd940bd648ba6c9cb290faaba460d878711fbd8cbab4c762d\": container with ID starting with 65396420d153499dd940bd648ba6c9cb290faaba460d878711fbd8cbab4c762d not found: ID does not exist" Oct 07 08:52:11 crc kubenswrapper[5025]: I1007 08:52:11.879200 5025 scope.go:117] "RemoveContainer" containerID="6f75c388cda9a484e5da9d8b6eb548d4dc113f5d2cdda2a49a1001d918d567ed" Oct 07 08:52:11 crc kubenswrapper[5025]: E1007 08:52:11.879578 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f75c388cda9a484e5da9d8b6eb548d4dc113f5d2cdda2a49a1001d918d567ed\": container with ID starting with 6f75c388cda9a484e5da9d8b6eb548d4dc113f5d2cdda2a49a1001d918d567ed not found: ID does not exist" containerID="6f75c388cda9a484e5da9d8b6eb548d4dc113f5d2cdda2a49a1001d918d567ed" Oct 07 08:52:11 crc kubenswrapper[5025]: I1007 08:52:11.879616 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f75c388cda9a484e5da9d8b6eb548d4dc113f5d2cdda2a49a1001d918d567ed"} err="failed to get container status \"6f75c388cda9a484e5da9d8b6eb548d4dc113f5d2cdda2a49a1001d918d567ed\": rpc error: code = NotFound desc = could not find container \"6f75c388cda9a484e5da9d8b6eb548d4dc113f5d2cdda2a49a1001d918d567ed\": container with ID starting with 6f75c388cda9a484e5da9d8b6eb548d4dc113f5d2cdda2a49a1001d918d567ed not found: ID does not exist" Oct 07 08:52:11 crc kubenswrapper[5025]: I1007 08:52:11.906009 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8bb29f5-c5f9-4068-95a3-2723e439c8b6-catalog-content" 
(OuterVolumeSpecName: "catalog-content") pod "a8bb29f5-c5f9-4068-95a3-2723e439c8b6" (UID: "a8bb29f5-c5f9-4068-95a3-2723e439c8b6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:52:11 crc kubenswrapper[5025]: I1007 08:52:11.959318 5025 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8bb29f5-c5f9-4068-95a3-2723e439c8b6-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 08:52:11 crc kubenswrapper[5025]: I1007 08:52:11.959361 5025 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8bb29f5-c5f9-4068-95a3-2723e439c8b6-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 08:52:11 crc kubenswrapper[5025]: I1007 08:52:11.959376 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xm8kj\" (UniqueName: \"kubernetes.io/projected/a8bb29f5-c5f9-4068-95a3-2723e439c8b6-kube-api-access-xm8kj\") on node \"crc\" DevicePath \"\"" Oct 07 08:52:12 crc kubenswrapper[5025]: I1007 08:52:12.127128 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l5jcs"] Oct 07 08:52:12 crc kubenswrapper[5025]: I1007 08:52:12.135058 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-l5jcs"] Oct 07 08:52:13 crc kubenswrapper[5025]: I1007 08:52:13.928521 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8bb29f5-c5f9-4068-95a3-2723e439c8b6" path="/var/lib/kubelet/pods/a8bb29f5-c5f9-4068-95a3-2723e439c8b6/volumes" Oct 07 08:52:25 crc kubenswrapper[5025]: I1007 08:52:25.934300 5025 patch_prober.go:28] interesting pod/machine-config-daemon-2dj2t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 
08:52:25 crc kubenswrapper[5025]: I1007 08:52:25.935252 5025 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 08:52:35 crc kubenswrapper[5025]: I1007 08:52:35.085573 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rw2nf"] Oct 07 08:52:35 crc kubenswrapper[5025]: E1007 08:52:35.087701 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8bb29f5-c5f9-4068-95a3-2723e439c8b6" containerName="registry-server" Oct 07 08:52:35 crc kubenswrapper[5025]: I1007 08:52:35.087748 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8bb29f5-c5f9-4068-95a3-2723e439c8b6" containerName="registry-server" Oct 07 08:52:35 crc kubenswrapper[5025]: E1007 08:52:35.087785 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8bb29f5-c5f9-4068-95a3-2723e439c8b6" containerName="extract-utilities" Oct 07 08:52:35 crc kubenswrapper[5025]: I1007 08:52:35.087839 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8bb29f5-c5f9-4068-95a3-2723e439c8b6" containerName="extract-utilities" Oct 07 08:52:35 crc kubenswrapper[5025]: E1007 08:52:35.087860 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8bb29f5-c5f9-4068-95a3-2723e439c8b6" containerName="extract-content" Oct 07 08:52:35 crc kubenswrapper[5025]: I1007 08:52:35.087869 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8bb29f5-c5f9-4068-95a3-2723e439c8b6" containerName="extract-content" Oct 07 08:52:35 crc kubenswrapper[5025]: I1007 08:52:35.088126 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8bb29f5-c5f9-4068-95a3-2723e439c8b6" containerName="registry-server" Oct 07 08:52:35 crc kubenswrapper[5025]: I1007 
08:52:35.092244 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rw2nf" Oct 07 08:52:35 crc kubenswrapper[5025]: I1007 08:52:35.096457 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rw2nf"] Oct 07 08:52:35 crc kubenswrapper[5025]: I1007 08:52:35.156554 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f79kk\" (UniqueName: \"kubernetes.io/projected/55417f78-be4d-4d4a-b65d-5b8e223aaedd-kube-api-access-f79kk\") pod \"redhat-marketplace-rw2nf\" (UID: \"55417f78-be4d-4d4a-b65d-5b8e223aaedd\") " pod="openshift-marketplace/redhat-marketplace-rw2nf" Oct 07 08:52:35 crc kubenswrapper[5025]: I1007 08:52:35.156639 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55417f78-be4d-4d4a-b65d-5b8e223aaedd-utilities\") pod \"redhat-marketplace-rw2nf\" (UID: \"55417f78-be4d-4d4a-b65d-5b8e223aaedd\") " pod="openshift-marketplace/redhat-marketplace-rw2nf" Oct 07 08:52:35 crc kubenswrapper[5025]: I1007 08:52:35.156688 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55417f78-be4d-4d4a-b65d-5b8e223aaedd-catalog-content\") pod \"redhat-marketplace-rw2nf\" (UID: \"55417f78-be4d-4d4a-b65d-5b8e223aaedd\") " pod="openshift-marketplace/redhat-marketplace-rw2nf" Oct 07 08:52:35 crc kubenswrapper[5025]: I1007 08:52:35.258286 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f79kk\" (UniqueName: \"kubernetes.io/projected/55417f78-be4d-4d4a-b65d-5b8e223aaedd-kube-api-access-f79kk\") pod \"redhat-marketplace-rw2nf\" (UID: \"55417f78-be4d-4d4a-b65d-5b8e223aaedd\") " pod="openshift-marketplace/redhat-marketplace-rw2nf" Oct 07 08:52:35 crc 
kubenswrapper[5025]: I1007 08:52:35.258358 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55417f78-be4d-4d4a-b65d-5b8e223aaedd-utilities\") pod \"redhat-marketplace-rw2nf\" (UID: \"55417f78-be4d-4d4a-b65d-5b8e223aaedd\") " pod="openshift-marketplace/redhat-marketplace-rw2nf" Oct 07 08:52:35 crc kubenswrapper[5025]: I1007 08:52:35.258387 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55417f78-be4d-4d4a-b65d-5b8e223aaedd-catalog-content\") pod \"redhat-marketplace-rw2nf\" (UID: \"55417f78-be4d-4d4a-b65d-5b8e223aaedd\") " pod="openshift-marketplace/redhat-marketplace-rw2nf" Oct 07 08:52:35 crc kubenswrapper[5025]: I1007 08:52:35.258934 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55417f78-be4d-4d4a-b65d-5b8e223aaedd-catalog-content\") pod \"redhat-marketplace-rw2nf\" (UID: \"55417f78-be4d-4d4a-b65d-5b8e223aaedd\") " pod="openshift-marketplace/redhat-marketplace-rw2nf" Oct 07 08:52:35 crc kubenswrapper[5025]: I1007 08:52:35.259151 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55417f78-be4d-4d4a-b65d-5b8e223aaedd-utilities\") pod \"redhat-marketplace-rw2nf\" (UID: \"55417f78-be4d-4d4a-b65d-5b8e223aaedd\") " pod="openshift-marketplace/redhat-marketplace-rw2nf" Oct 07 08:52:35 crc kubenswrapper[5025]: I1007 08:52:35.282415 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f79kk\" (UniqueName: \"kubernetes.io/projected/55417f78-be4d-4d4a-b65d-5b8e223aaedd-kube-api-access-f79kk\") pod \"redhat-marketplace-rw2nf\" (UID: \"55417f78-be4d-4d4a-b65d-5b8e223aaedd\") " pod="openshift-marketplace/redhat-marketplace-rw2nf" Oct 07 08:52:35 crc kubenswrapper[5025]: I1007 08:52:35.416641 5025 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rw2nf" Oct 07 08:52:35 crc kubenswrapper[5025]: I1007 08:52:35.891880 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rw2nf"] Oct 07 08:52:35 crc kubenswrapper[5025]: I1007 08:52:35.999022 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rw2nf" event={"ID":"55417f78-be4d-4d4a-b65d-5b8e223aaedd","Type":"ContainerStarted","Data":"23cae73ff11fa580e8ba71d4cd8762382b88d5a4934cc2dfb6adefb6808a6eb7"} Oct 07 08:52:37 crc kubenswrapper[5025]: I1007 08:52:37.007279 5025 generic.go:334] "Generic (PLEG): container finished" podID="55417f78-be4d-4d4a-b65d-5b8e223aaedd" containerID="bed18dd90c807801417c2843ba83e2777577f1db4885fa4248fa7f52dcfe1127" exitCode=0 Oct 07 08:52:37 crc kubenswrapper[5025]: I1007 08:52:37.007345 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rw2nf" event={"ID":"55417f78-be4d-4d4a-b65d-5b8e223aaedd","Type":"ContainerDied","Data":"bed18dd90c807801417c2843ba83e2777577f1db4885fa4248fa7f52dcfe1127"} Oct 07 08:52:38 crc kubenswrapper[5025]: I1007 08:52:38.018328 5025 generic.go:334] "Generic (PLEG): container finished" podID="55417f78-be4d-4d4a-b65d-5b8e223aaedd" containerID="8a838da3f8f9b56c274ff0b6285d72b5f8f81f37dff797cc8d27d3a223e5a4e2" exitCode=0 Oct 07 08:52:38 crc kubenswrapper[5025]: I1007 08:52:38.018422 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rw2nf" event={"ID":"55417f78-be4d-4d4a-b65d-5b8e223aaedd","Type":"ContainerDied","Data":"8a838da3f8f9b56c274ff0b6285d72b5f8f81f37dff797cc8d27d3a223e5a4e2"} Oct 07 08:52:39 crc kubenswrapper[5025]: I1007 08:52:39.028986 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rw2nf" 
event={"ID":"55417f78-be4d-4d4a-b65d-5b8e223aaedd","Type":"ContainerStarted","Data":"b09ac4073e2df680875f09fcb71cd45fdab3522b4f9846be799e55d8be48c42a"} Oct 07 08:52:39 crc kubenswrapper[5025]: I1007 08:52:39.056053 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rw2nf" podStartSLOduration=2.469156464 podStartE2EDuration="4.056013047s" podCreationTimestamp="2025-10-07 08:52:35 +0000 UTC" firstStartedPulling="2025-10-07 08:52:37.01015335 +0000 UTC m=+2163.819467494" lastFinishedPulling="2025-10-07 08:52:38.597009933 +0000 UTC m=+2165.406324077" observedRunningTime="2025-10-07 08:52:39.055289425 +0000 UTC m=+2165.864603569" watchObservedRunningTime="2025-10-07 08:52:39.056013047 +0000 UTC m=+2165.865327191" Oct 07 08:52:45 crc kubenswrapper[5025]: I1007 08:52:45.417447 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rw2nf" Oct 07 08:52:45 crc kubenswrapper[5025]: I1007 08:52:45.417901 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rw2nf" Oct 07 08:52:45 crc kubenswrapper[5025]: I1007 08:52:45.460630 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rw2nf" Oct 07 08:52:46 crc kubenswrapper[5025]: I1007 08:52:46.122515 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rw2nf" Oct 07 08:52:46 crc kubenswrapper[5025]: I1007 08:52:46.170935 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rw2nf"] Oct 07 08:52:48 crc kubenswrapper[5025]: I1007 08:52:48.094570 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rw2nf" podUID="55417f78-be4d-4d4a-b65d-5b8e223aaedd" containerName="registry-server" 
containerID="cri-o://b09ac4073e2df680875f09fcb71cd45fdab3522b4f9846be799e55d8be48c42a" gracePeriod=2 Oct 07 08:52:48 crc kubenswrapper[5025]: I1007 08:52:48.460118 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rw2nf" Oct 07 08:52:48 crc kubenswrapper[5025]: I1007 08:52:48.577071 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55417f78-be4d-4d4a-b65d-5b8e223aaedd-utilities\") pod \"55417f78-be4d-4d4a-b65d-5b8e223aaedd\" (UID: \"55417f78-be4d-4d4a-b65d-5b8e223aaedd\") " Oct 07 08:52:48 crc kubenswrapper[5025]: I1007 08:52:48.577174 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55417f78-be4d-4d4a-b65d-5b8e223aaedd-catalog-content\") pod \"55417f78-be4d-4d4a-b65d-5b8e223aaedd\" (UID: \"55417f78-be4d-4d4a-b65d-5b8e223aaedd\") " Oct 07 08:52:48 crc kubenswrapper[5025]: I1007 08:52:48.577204 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f79kk\" (UniqueName: \"kubernetes.io/projected/55417f78-be4d-4d4a-b65d-5b8e223aaedd-kube-api-access-f79kk\") pod \"55417f78-be4d-4d4a-b65d-5b8e223aaedd\" (UID: \"55417f78-be4d-4d4a-b65d-5b8e223aaedd\") " Oct 07 08:52:48 crc kubenswrapper[5025]: I1007 08:52:48.578359 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55417f78-be4d-4d4a-b65d-5b8e223aaedd-utilities" (OuterVolumeSpecName: "utilities") pod "55417f78-be4d-4d4a-b65d-5b8e223aaedd" (UID: "55417f78-be4d-4d4a-b65d-5b8e223aaedd"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:52:48 crc kubenswrapper[5025]: I1007 08:52:48.583299 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55417f78-be4d-4d4a-b65d-5b8e223aaedd-kube-api-access-f79kk" (OuterVolumeSpecName: "kube-api-access-f79kk") pod "55417f78-be4d-4d4a-b65d-5b8e223aaedd" (UID: "55417f78-be4d-4d4a-b65d-5b8e223aaedd"). InnerVolumeSpecName "kube-api-access-f79kk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:52:48 crc kubenswrapper[5025]: I1007 08:52:48.590852 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55417f78-be4d-4d4a-b65d-5b8e223aaedd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "55417f78-be4d-4d4a-b65d-5b8e223aaedd" (UID: "55417f78-be4d-4d4a-b65d-5b8e223aaedd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:52:48 crc kubenswrapper[5025]: I1007 08:52:48.678516 5025 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55417f78-be4d-4d4a-b65d-5b8e223aaedd-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 08:52:48 crc kubenswrapper[5025]: I1007 08:52:48.678850 5025 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55417f78-be4d-4d4a-b65d-5b8e223aaedd-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 08:52:48 crc kubenswrapper[5025]: I1007 08:52:48.678915 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f79kk\" (UniqueName: \"kubernetes.io/projected/55417f78-be4d-4d4a-b65d-5b8e223aaedd-kube-api-access-f79kk\") on node \"crc\" DevicePath \"\"" Oct 07 08:52:49 crc kubenswrapper[5025]: I1007 08:52:49.103117 5025 generic.go:334] "Generic (PLEG): container finished" podID="55417f78-be4d-4d4a-b65d-5b8e223aaedd" 
containerID="b09ac4073e2df680875f09fcb71cd45fdab3522b4f9846be799e55d8be48c42a" exitCode=0 Oct 07 08:52:49 crc kubenswrapper[5025]: I1007 08:52:49.103172 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rw2nf" Oct 07 08:52:49 crc kubenswrapper[5025]: I1007 08:52:49.103171 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rw2nf" event={"ID":"55417f78-be4d-4d4a-b65d-5b8e223aaedd","Type":"ContainerDied","Data":"b09ac4073e2df680875f09fcb71cd45fdab3522b4f9846be799e55d8be48c42a"} Oct 07 08:52:49 crc kubenswrapper[5025]: I1007 08:52:49.103236 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rw2nf" event={"ID":"55417f78-be4d-4d4a-b65d-5b8e223aaedd","Type":"ContainerDied","Data":"23cae73ff11fa580e8ba71d4cd8762382b88d5a4934cc2dfb6adefb6808a6eb7"} Oct 07 08:52:49 crc kubenswrapper[5025]: I1007 08:52:49.103259 5025 scope.go:117] "RemoveContainer" containerID="b09ac4073e2df680875f09fcb71cd45fdab3522b4f9846be799e55d8be48c42a" Oct 07 08:52:49 crc kubenswrapper[5025]: I1007 08:52:49.136151 5025 scope.go:117] "RemoveContainer" containerID="8a838da3f8f9b56c274ff0b6285d72b5f8f81f37dff797cc8d27d3a223e5a4e2" Oct 07 08:52:49 crc kubenswrapper[5025]: I1007 08:52:49.138683 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rw2nf"] Oct 07 08:52:49 crc kubenswrapper[5025]: I1007 08:52:49.145802 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rw2nf"] Oct 07 08:52:49 crc kubenswrapper[5025]: I1007 08:52:49.153206 5025 scope.go:117] "RemoveContainer" containerID="bed18dd90c807801417c2843ba83e2777577f1db4885fa4248fa7f52dcfe1127" Oct 07 08:52:49 crc kubenswrapper[5025]: I1007 08:52:49.180799 5025 scope.go:117] "RemoveContainer" containerID="b09ac4073e2df680875f09fcb71cd45fdab3522b4f9846be799e55d8be48c42a" Oct 07 
08:52:49 crc kubenswrapper[5025]: E1007 08:52:49.181366 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b09ac4073e2df680875f09fcb71cd45fdab3522b4f9846be799e55d8be48c42a\": container with ID starting with b09ac4073e2df680875f09fcb71cd45fdab3522b4f9846be799e55d8be48c42a not found: ID does not exist" containerID="b09ac4073e2df680875f09fcb71cd45fdab3522b4f9846be799e55d8be48c42a" Oct 07 08:52:49 crc kubenswrapper[5025]: I1007 08:52:49.181411 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b09ac4073e2df680875f09fcb71cd45fdab3522b4f9846be799e55d8be48c42a"} err="failed to get container status \"b09ac4073e2df680875f09fcb71cd45fdab3522b4f9846be799e55d8be48c42a\": rpc error: code = NotFound desc = could not find container \"b09ac4073e2df680875f09fcb71cd45fdab3522b4f9846be799e55d8be48c42a\": container with ID starting with b09ac4073e2df680875f09fcb71cd45fdab3522b4f9846be799e55d8be48c42a not found: ID does not exist" Oct 07 08:52:49 crc kubenswrapper[5025]: I1007 08:52:49.181451 5025 scope.go:117] "RemoveContainer" containerID="8a838da3f8f9b56c274ff0b6285d72b5f8f81f37dff797cc8d27d3a223e5a4e2" Oct 07 08:52:49 crc kubenswrapper[5025]: E1007 08:52:49.182030 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a838da3f8f9b56c274ff0b6285d72b5f8f81f37dff797cc8d27d3a223e5a4e2\": container with ID starting with 8a838da3f8f9b56c274ff0b6285d72b5f8f81f37dff797cc8d27d3a223e5a4e2 not found: ID does not exist" containerID="8a838da3f8f9b56c274ff0b6285d72b5f8f81f37dff797cc8d27d3a223e5a4e2" Oct 07 08:52:49 crc kubenswrapper[5025]: I1007 08:52:49.182065 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a838da3f8f9b56c274ff0b6285d72b5f8f81f37dff797cc8d27d3a223e5a4e2"} err="failed to get container status 
\"8a838da3f8f9b56c274ff0b6285d72b5f8f81f37dff797cc8d27d3a223e5a4e2\": rpc error: code = NotFound desc = could not find container \"8a838da3f8f9b56c274ff0b6285d72b5f8f81f37dff797cc8d27d3a223e5a4e2\": container with ID starting with 8a838da3f8f9b56c274ff0b6285d72b5f8f81f37dff797cc8d27d3a223e5a4e2 not found: ID does not exist" Oct 07 08:52:49 crc kubenswrapper[5025]: I1007 08:52:49.182086 5025 scope.go:117] "RemoveContainer" containerID="bed18dd90c807801417c2843ba83e2777577f1db4885fa4248fa7f52dcfe1127" Oct 07 08:52:49 crc kubenswrapper[5025]: E1007 08:52:49.182518 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bed18dd90c807801417c2843ba83e2777577f1db4885fa4248fa7f52dcfe1127\": container with ID starting with bed18dd90c807801417c2843ba83e2777577f1db4885fa4248fa7f52dcfe1127 not found: ID does not exist" containerID="bed18dd90c807801417c2843ba83e2777577f1db4885fa4248fa7f52dcfe1127" Oct 07 08:52:49 crc kubenswrapper[5025]: I1007 08:52:49.182653 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bed18dd90c807801417c2843ba83e2777577f1db4885fa4248fa7f52dcfe1127"} err="failed to get container status \"bed18dd90c807801417c2843ba83e2777577f1db4885fa4248fa7f52dcfe1127\": rpc error: code = NotFound desc = could not find container \"bed18dd90c807801417c2843ba83e2777577f1db4885fa4248fa7f52dcfe1127\": container with ID starting with bed18dd90c807801417c2843ba83e2777577f1db4885fa4248fa7f52dcfe1127 not found: ID does not exist" Oct 07 08:52:49 crc kubenswrapper[5025]: I1007 08:52:49.926559 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55417f78-be4d-4d4a-b65d-5b8e223aaedd" path="/var/lib/kubelet/pods/55417f78-be4d-4d4a-b65d-5b8e223aaedd/volumes" Oct 07 08:52:55 crc kubenswrapper[5025]: I1007 08:52:55.934076 5025 patch_prober.go:28] interesting pod/machine-config-daemon-2dj2t container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 08:52:55 crc kubenswrapper[5025]: I1007 08:52:55.934666 5025 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 08:52:55 crc kubenswrapper[5025]: I1007 08:52:55.934717 5025 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" Oct 07 08:52:55 crc kubenswrapper[5025]: I1007 08:52:55.935347 5025 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"013eb7ba27e6f88cb2743b9c7df2653632623e0741c302a71b20efaf02f2635f"} pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 08:52:55 crc kubenswrapper[5025]: I1007 08:52:55.935397 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" containerID="cri-o://013eb7ba27e6f88cb2743b9c7df2653632623e0741c302a71b20efaf02f2635f" gracePeriod=600 Oct 07 08:52:56 crc kubenswrapper[5025]: I1007 08:52:56.184251 5025 generic.go:334] "Generic (PLEG): container finished" podID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerID="013eb7ba27e6f88cb2743b9c7df2653632623e0741c302a71b20efaf02f2635f" exitCode=0 Oct 07 08:52:56 crc kubenswrapper[5025]: I1007 08:52:56.184326 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" event={"ID":"b4849c41-22e1-400e-8e11-096da49ef1b2","Type":"ContainerDied","Data":"013eb7ba27e6f88cb2743b9c7df2653632623e0741c302a71b20efaf02f2635f"} Oct 07 08:52:56 crc kubenswrapper[5025]: I1007 08:52:56.184728 5025 scope.go:117] "RemoveContainer" containerID="61f0741cda050d253e1d966c022f5b530b9f0b20bc1389a3192197981eacfef9" Oct 07 08:52:57 crc kubenswrapper[5025]: I1007 08:52:57.195417 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" event={"ID":"b4849c41-22e1-400e-8e11-096da49ef1b2","Type":"ContainerStarted","Data":"967d33b665b57c7aac6745863f1d7aca2edc1314d7134434954c9109b2a90f0a"} Oct 07 08:53:45 crc kubenswrapper[5025]: I1007 08:53:45.755419 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ng87c"] Oct 07 08:53:45 crc kubenswrapper[5025]: E1007 08:53:45.756431 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55417f78-be4d-4d4a-b65d-5b8e223aaedd" containerName="extract-content" Oct 07 08:53:45 crc kubenswrapper[5025]: I1007 08:53:45.756446 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="55417f78-be4d-4d4a-b65d-5b8e223aaedd" containerName="extract-content" Oct 07 08:53:45 crc kubenswrapper[5025]: E1007 08:53:45.756463 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55417f78-be4d-4d4a-b65d-5b8e223aaedd" containerName="registry-server" Oct 07 08:53:45 crc kubenswrapper[5025]: I1007 08:53:45.756469 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="55417f78-be4d-4d4a-b65d-5b8e223aaedd" containerName="registry-server" Oct 07 08:53:45 crc kubenswrapper[5025]: E1007 08:53:45.756477 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55417f78-be4d-4d4a-b65d-5b8e223aaedd" containerName="extract-utilities" Oct 07 08:53:45 crc kubenswrapper[5025]: I1007 08:53:45.756483 5025 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="55417f78-be4d-4d4a-b65d-5b8e223aaedd" containerName="extract-utilities" Oct 07 08:53:45 crc kubenswrapper[5025]: I1007 08:53:45.756678 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="55417f78-be4d-4d4a-b65d-5b8e223aaedd" containerName="registry-server" Oct 07 08:53:45 crc kubenswrapper[5025]: I1007 08:53:45.757928 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ng87c" Oct 07 08:53:45 crc kubenswrapper[5025]: I1007 08:53:45.767404 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ng87c"] Oct 07 08:53:45 crc kubenswrapper[5025]: I1007 08:53:45.881785 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54-utilities\") pod \"redhat-operators-ng87c\" (UID: \"591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54\") " pod="openshift-marketplace/redhat-operators-ng87c" Oct 07 08:53:45 crc kubenswrapper[5025]: I1007 08:53:45.881848 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54-catalog-content\") pod \"redhat-operators-ng87c\" (UID: \"591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54\") " pod="openshift-marketplace/redhat-operators-ng87c" Oct 07 08:53:45 crc kubenswrapper[5025]: I1007 08:53:45.882224 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hkl8\" (UniqueName: \"kubernetes.io/projected/591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54-kube-api-access-9hkl8\") pod \"redhat-operators-ng87c\" (UID: \"591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54\") " pod="openshift-marketplace/redhat-operators-ng87c" Oct 07 08:53:45 crc kubenswrapper[5025]: I1007 08:53:45.984095 5025 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54-utilities\") pod \"redhat-operators-ng87c\" (UID: \"591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54\") " pod="openshift-marketplace/redhat-operators-ng87c" Oct 07 08:53:45 crc kubenswrapper[5025]: I1007 08:53:45.984167 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54-catalog-content\") pod \"redhat-operators-ng87c\" (UID: \"591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54\") " pod="openshift-marketplace/redhat-operators-ng87c" Oct 07 08:53:45 crc kubenswrapper[5025]: I1007 08:53:45.984210 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hkl8\" (UniqueName: \"kubernetes.io/projected/591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54-kube-api-access-9hkl8\") pod \"redhat-operators-ng87c\" (UID: \"591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54\") " pod="openshift-marketplace/redhat-operators-ng87c" Oct 07 08:53:45 crc kubenswrapper[5025]: I1007 08:53:45.984670 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54-utilities\") pod \"redhat-operators-ng87c\" (UID: \"591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54\") " pod="openshift-marketplace/redhat-operators-ng87c" Oct 07 08:53:45 crc kubenswrapper[5025]: I1007 08:53:45.984962 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54-catalog-content\") pod \"redhat-operators-ng87c\" (UID: \"591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54\") " pod="openshift-marketplace/redhat-operators-ng87c" Oct 07 08:53:46 crc kubenswrapper[5025]: I1007 08:53:46.010637 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hkl8\" 
(UniqueName: \"kubernetes.io/projected/591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54-kube-api-access-9hkl8\") pod \"redhat-operators-ng87c\" (UID: \"591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54\") " pod="openshift-marketplace/redhat-operators-ng87c" Oct 07 08:53:46 crc kubenswrapper[5025]: I1007 08:53:46.089158 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ng87c" Oct 07 08:53:46 crc kubenswrapper[5025]: I1007 08:53:46.517381 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ng87c"] Oct 07 08:53:46 crc kubenswrapper[5025]: W1007 08:53:46.524473 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod591e97f7_f7ba_4c7c_9b8d_ce7c315a1f54.slice/crio-7f19d332ed0d3f22f34f240b5200d5ab14107de45cca0d4c4c2371c9c291a464 WatchSource:0}: Error finding container 7f19d332ed0d3f22f34f240b5200d5ab14107de45cca0d4c4c2371c9c291a464: Status 404 returned error can't find the container with id 7f19d332ed0d3f22f34f240b5200d5ab14107de45cca0d4c4c2371c9c291a464 Oct 07 08:53:46 crc kubenswrapper[5025]: I1007 08:53:46.577241 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ng87c" event={"ID":"591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54","Type":"ContainerStarted","Data":"7f19d332ed0d3f22f34f240b5200d5ab14107de45cca0d4c4c2371c9c291a464"} Oct 07 08:53:47 crc kubenswrapper[5025]: I1007 08:53:47.585856 5025 generic.go:334] "Generic (PLEG): container finished" podID="591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54" containerID="fc2b81f78a7371914005a1f32077998967154df30c435e665bb3233bb027e28c" exitCode=0 Oct 07 08:53:47 crc kubenswrapper[5025]: I1007 08:53:47.585943 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ng87c" 
event={"ID":"591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54","Type":"ContainerDied","Data":"fc2b81f78a7371914005a1f32077998967154df30c435e665bb3233bb027e28c"} Oct 07 08:53:54 crc kubenswrapper[5025]: I1007 08:53:54.644581 5025 generic.go:334] "Generic (PLEG): container finished" podID="591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54" containerID="6732473498fb506b3400b1d03a9558bfc813a268ebbdb3d50f0002d389831a5a" exitCode=0 Oct 07 08:53:54 crc kubenswrapper[5025]: I1007 08:53:54.644613 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ng87c" event={"ID":"591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54","Type":"ContainerDied","Data":"6732473498fb506b3400b1d03a9558bfc813a268ebbdb3d50f0002d389831a5a"} Oct 07 08:53:55 crc kubenswrapper[5025]: I1007 08:53:55.656595 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ng87c" event={"ID":"591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54","Type":"ContainerStarted","Data":"ccaeec9b9484abdc4b19621312fa623e7672c2ac2513ce1aff01bbd14d63ce93"} Oct 07 08:53:55 crc kubenswrapper[5025]: I1007 08:53:55.685560 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ng87c" podStartSLOduration=2.9799174600000002 podStartE2EDuration="10.685522464s" podCreationTimestamp="2025-10-07 08:53:45 +0000 UTC" firstStartedPulling="2025-10-07 08:53:47.588067177 +0000 UTC m=+2234.397381321" lastFinishedPulling="2025-10-07 08:53:55.293672181 +0000 UTC m=+2242.102986325" observedRunningTime="2025-10-07 08:53:55.684919135 +0000 UTC m=+2242.494233279" watchObservedRunningTime="2025-10-07 08:53:55.685522464 +0000 UTC m=+2242.494836608" Oct 07 08:53:56 crc kubenswrapper[5025]: I1007 08:53:56.089988 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ng87c" Oct 07 08:53:56 crc kubenswrapper[5025]: I1007 08:53:56.090353 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-operators-ng87c" Oct 07 08:53:57 crc kubenswrapper[5025]: I1007 08:53:57.129004 5025 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ng87c" podUID="591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54" containerName="registry-server" probeResult="failure" output=< Oct 07 08:53:57 crc kubenswrapper[5025]: timeout: failed to connect service ":50051" within 1s Oct 07 08:53:57 crc kubenswrapper[5025]: > Oct 07 08:54:06 crc kubenswrapper[5025]: I1007 08:54:06.138777 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ng87c" Oct 07 08:54:06 crc kubenswrapper[5025]: I1007 08:54:06.204585 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ng87c" Oct 07 08:54:06 crc kubenswrapper[5025]: I1007 08:54:06.275569 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ng87c"] Oct 07 08:54:06 crc kubenswrapper[5025]: I1007 08:54:06.379223 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dnq7n"] Oct 07 08:54:06 crc kubenswrapper[5025]: I1007 08:54:06.379643 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dnq7n" podUID="c01dd09b-1dca-427d-a45f-9a5870172dd0" containerName="registry-server" containerID="cri-o://fd9b6ed71a60034073e39040b84d7c684481493f9f21c19db28b2d0878e76299" gracePeriod=2 Oct 07 08:54:06 crc kubenswrapper[5025]: I1007 08:54:06.743057 5025 generic.go:334] "Generic (PLEG): container finished" podID="c01dd09b-1dca-427d-a45f-9a5870172dd0" containerID="fd9b6ed71a60034073e39040b84d7c684481493f9f21c19db28b2d0878e76299" exitCode=0 Oct 07 08:54:06 crc kubenswrapper[5025]: I1007 08:54:06.743349 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dnq7n" 
event={"ID":"c01dd09b-1dca-427d-a45f-9a5870172dd0","Type":"ContainerDied","Data":"fd9b6ed71a60034073e39040b84d7c684481493f9f21c19db28b2d0878e76299"} Oct 07 08:54:06 crc kubenswrapper[5025]: I1007 08:54:06.743647 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dnq7n" event={"ID":"c01dd09b-1dca-427d-a45f-9a5870172dd0","Type":"ContainerDied","Data":"034000796bf6523ddbaa238be1984af7e975e0e34ed948b5dfdd6339a2c9c192"} Oct 07 08:54:06 crc kubenswrapper[5025]: I1007 08:54:06.743666 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="034000796bf6523ddbaa238be1984af7e975e0e34ed948b5dfdd6339a2c9c192" Oct 07 08:54:06 crc kubenswrapper[5025]: I1007 08:54:06.805596 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dnq7n" Oct 07 08:54:06 crc kubenswrapper[5025]: I1007 08:54:06.829158 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c01dd09b-1dca-427d-a45f-9a5870172dd0-catalog-content\") pod \"c01dd09b-1dca-427d-a45f-9a5870172dd0\" (UID: \"c01dd09b-1dca-427d-a45f-9a5870172dd0\") " Oct 07 08:54:06 crc kubenswrapper[5025]: I1007 08:54:06.829377 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c01dd09b-1dca-427d-a45f-9a5870172dd0-utilities\") pod \"c01dd09b-1dca-427d-a45f-9a5870172dd0\" (UID: \"c01dd09b-1dca-427d-a45f-9a5870172dd0\") " Oct 07 08:54:06 crc kubenswrapper[5025]: I1007 08:54:06.829560 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dq87\" (UniqueName: \"kubernetes.io/projected/c01dd09b-1dca-427d-a45f-9a5870172dd0-kube-api-access-4dq87\") pod \"c01dd09b-1dca-427d-a45f-9a5870172dd0\" (UID: \"c01dd09b-1dca-427d-a45f-9a5870172dd0\") " Oct 07 08:54:06 crc kubenswrapper[5025]: I1007 
08:54:06.829984 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c01dd09b-1dca-427d-a45f-9a5870172dd0-utilities" (OuterVolumeSpecName: "utilities") pod "c01dd09b-1dca-427d-a45f-9a5870172dd0" (UID: "c01dd09b-1dca-427d-a45f-9a5870172dd0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:54:06 crc kubenswrapper[5025]: I1007 08:54:06.848392 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c01dd09b-1dca-427d-a45f-9a5870172dd0-kube-api-access-4dq87" (OuterVolumeSpecName: "kube-api-access-4dq87") pod "c01dd09b-1dca-427d-a45f-9a5870172dd0" (UID: "c01dd09b-1dca-427d-a45f-9a5870172dd0"). InnerVolumeSpecName "kube-api-access-4dq87". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 08:54:06 crc kubenswrapper[5025]: I1007 08:54:06.910986 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c01dd09b-1dca-427d-a45f-9a5870172dd0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c01dd09b-1dca-427d-a45f-9a5870172dd0" (UID: "c01dd09b-1dca-427d-a45f-9a5870172dd0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 08:54:06 crc kubenswrapper[5025]: I1007 08:54:06.931813 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dq87\" (UniqueName: \"kubernetes.io/projected/c01dd09b-1dca-427d-a45f-9a5870172dd0-kube-api-access-4dq87\") on node \"crc\" DevicePath \"\"" Oct 07 08:54:06 crc kubenswrapper[5025]: I1007 08:54:06.931851 5025 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c01dd09b-1dca-427d-a45f-9a5870172dd0-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 08:54:06 crc kubenswrapper[5025]: I1007 08:54:06.931864 5025 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c01dd09b-1dca-427d-a45f-9a5870172dd0-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 08:54:07 crc kubenswrapper[5025]: I1007 08:54:07.752499 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dnq7n" Oct 07 08:54:07 crc kubenswrapper[5025]: I1007 08:54:07.792072 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dnq7n"] Oct 07 08:54:07 crc kubenswrapper[5025]: I1007 08:54:07.796880 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dnq7n"] Oct 07 08:54:07 crc kubenswrapper[5025]: I1007 08:54:07.925258 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c01dd09b-1dca-427d-a45f-9a5870172dd0" path="/var/lib/kubelet/pods/c01dd09b-1dca-427d-a45f-9a5870172dd0/volumes" Oct 07 08:54:38 crc kubenswrapper[5025]: I1007 08:54:38.700025 5025 scope.go:117] "RemoveContainer" containerID="fd9b6ed71a60034073e39040b84d7c684481493f9f21c19db28b2d0878e76299" Oct 07 08:54:38 crc kubenswrapper[5025]: I1007 08:54:38.724339 5025 scope.go:117] "RemoveContainer" 
containerID="06157688eb78a403caa0c5d3cc333bf2c783b4e2c2e88e234a6b37e87a1d9d80" Oct 07 08:54:38 crc kubenswrapper[5025]: I1007 08:54:38.745096 5025 scope.go:117] "RemoveContainer" containerID="c4939ec0839b987c68365e4f55cd1d093fe6639dd97e580c0c2ac2995e9b65be" Oct 07 08:55:25 crc kubenswrapper[5025]: I1007 08:55:25.934834 5025 patch_prober.go:28] interesting pod/machine-config-daemon-2dj2t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 08:55:25 crc kubenswrapper[5025]: I1007 08:55:25.935705 5025 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 08:55:55 crc kubenswrapper[5025]: I1007 08:55:55.934877 5025 patch_prober.go:28] interesting pod/machine-config-daemon-2dj2t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 08:55:55 crc kubenswrapper[5025]: I1007 08:55:55.935830 5025 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 08:56:25 crc kubenswrapper[5025]: I1007 08:56:25.934431 5025 patch_prober.go:28] interesting pod/machine-config-daemon-2dj2t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 08:56:25 crc kubenswrapper[5025]: I1007 08:56:25.935252 5025 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 08:56:25 crc kubenswrapper[5025]: I1007 08:56:25.935378 5025 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" Oct 07 08:56:25 crc kubenswrapper[5025]: I1007 08:56:25.936390 5025 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"967d33b665b57c7aac6745863f1d7aca2edc1314d7134434954c9109b2a90f0a"} pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 08:56:25 crc kubenswrapper[5025]: I1007 08:56:25.936473 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" containerID="cri-o://967d33b665b57c7aac6745863f1d7aca2edc1314d7134434954c9109b2a90f0a" gracePeriod=600 Oct 07 08:56:26 crc kubenswrapper[5025]: E1007 08:56:26.088789 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" 
podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 08:56:26 crc kubenswrapper[5025]: I1007 08:56:26.864978 5025 generic.go:334] "Generic (PLEG): container finished" podID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerID="967d33b665b57c7aac6745863f1d7aca2edc1314d7134434954c9109b2a90f0a" exitCode=0 Oct 07 08:56:26 crc kubenswrapper[5025]: I1007 08:56:26.865120 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" event={"ID":"b4849c41-22e1-400e-8e11-096da49ef1b2","Type":"ContainerDied","Data":"967d33b665b57c7aac6745863f1d7aca2edc1314d7134434954c9109b2a90f0a"} Oct 07 08:56:26 crc kubenswrapper[5025]: I1007 08:56:26.865610 5025 scope.go:117] "RemoveContainer" containerID="013eb7ba27e6f88cb2743b9c7df2653632623e0741c302a71b20efaf02f2635f" Oct 07 08:56:26 crc kubenswrapper[5025]: I1007 08:56:26.867776 5025 scope.go:117] "RemoveContainer" containerID="967d33b665b57c7aac6745863f1d7aca2edc1314d7134434954c9109b2a90f0a" Oct 07 08:56:26 crc kubenswrapper[5025]: E1007 08:56:26.869293 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 08:56:38 crc kubenswrapper[5025]: I1007 08:56:38.914789 5025 scope.go:117] "RemoveContainer" containerID="967d33b665b57c7aac6745863f1d7aca2edc1314d7134434954c9109b2a90f0a" Oct 07 08:56:38 crc kubenswrapper[5025]: E1007 08:56:38.915663 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 08:56:52 crc kubenswrapper[5025]: I1007 08:56:52.915054 5025 scope.go:117] "RemoveContainer" containerID="967d33b665b57c7aac6745863f1d7aca2edc1314d7134434954c9109b2a90f0a" Oct 07 08:56:52 crc kubenswrapper[5025]: E1007 08:56:52.916772 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 08:57:06 crc kubenswrapper[5025]: I1007 08:57:06.915421 5025 scope.go:117] "RemoveContainer" containerID="967d33b665b57c7aac6745863f1d7aca2edc1314d7134434954c9109b2a90f0a" Oct 07 08:57:06 crc kubenswrapper[5025]: E1007 08:57:06.918199 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 08:57:20 crc kubenswrapper[5025]: I1007 08:57:20.915079 5025 scope.go:117] "RemoveContainer" containerID="967d33b665b57c7aac6745863f1d7aca2edc1314d7134434954c9109b2a90f0a" Oct 07 08:57:20 crc kubenswrapper[5025]: E1007 08:57:20.916246 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 08:57:35 crc kubenswrapper[5025]: I1007 08:57:35.915977 5025 scope.go:117] "RemoveContainer" containerID="967d33b665b57c7aac6745863f1d7aca2edc1314d7134434954c9109b2a90f0a" Oct 07 08:57:35 crc kubenswrapper[5025]: E1007 08:57:35.917324 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 08:57:49 crc kubenswrapper[5025]: I1007 08:57:49.915327 5025 scope.go:117] "RemoveContainer" containerID="967d33b665b57c7aac6745863f1d7aca2edc1314d7134434954c9109b2a90f0a" Oct 07 08:57:49 crc kubenswrapper[5025]: E1007 08:57:49.916582 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 08:58:01 crc kubenswrapper[5025]: I1007 08:58:01.915616 5025 scope.go:117] "RemoveContainer" containerID="967d33b665b57c7aac6745863f1d7aca2edc1314d7134434954c9109b2a90f0a" Oct 07 08:58:01 crc kubenswrapper[5025]: E1007 08:58:01.916406 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 08:58:16 crc kubenswrapper[5025]: I1007 08:58:16.915274 5025 scope.go:117] "RemoveContainer" containerID="967d33b665b57c7aac6745863f1d7aca2edc1314d7134434954c9109b2a90f0a" Oct 07 08:58:16 crc kubenswrapper[5025]: E1007 08:58:16.916088 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 08:58:28 crc kubenswrapper[5025]: I1007 08:58:28.914617 5025 scope.go:117] "RemoveContainer" containerID="967d33b665b57c7aac6745863f1d7aca2edc1314d7134434954c9109b2a90f0a" Oct 07 08:58:28 crc kubenswrapper[5025]: E1007 08:58:28.915293 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 08:58:41 crc kubenswrapper[5025]: I1007 08:58:41.914022 5025 scope.go:117] "RemoveContainer" containerID="967d33b665b57c7aac6745863f1d7aca2edc1314d7134434954c9109b2a90f0a" Oct 07 08:58:41 crc kubenswrapper[5025]: E1007 08:58:41.915899 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 08:58:54 crc kubenswrapper[5025]: I1007 08:58:54.915165 5025 scope.go:117] "RemoveContainer" containerID="967d33b665b57c7aac6745863f1d7aca2edc1314d7134434954c9109b2a90f0a" Oct 07 08:58:54 crc kubenswrapper[5025]: E1007 08:58:54.916141 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 08:59:09 crc kubenswrapper[5025]: I1007 08:59:09.918358 5025 scope.go:117] "RemoveContainer" containerID="967d33b665b57c7aac6745863f1d7aca2edc1314d7134434954c9109b2a90f0a" Oct 07 08:59:09 crc kubenswrapper[5025]: E1007 08:59:09.919498 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 08:59:24 crc kubenswrapper[5025]: I1007 08:59:24.914859 5025 scope.go:117] "RemoveContainer" containerID="967d33b665b57c7aac6745863f1d7aca2edc1314d7134434954c9109b2a90f0a" Oct 07 08:59:24 crc kubenswrapper[5025]: E1007 08:59:24.915795 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 08:59:36 crc kubenswrapper[5025]: I1007 08:59:36.915088 5025 scope.go:117] "RemoveContainer" containerID="967d33b665b57c7aac6745863f1d7aca2edc1314d7134434954c9109b2a90f0a" Oct 07 08:59:36 crc kubenswrapper[5025]: E1007 08:59:36.916261 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 08:59:48 crc kubenswrapper[5025]: I1007 08:59:48.915749 5025 scope.go:117] "RemoveContainer" containerID="967d33b665b57c7aac6745863f1d7aca2edc1314d7134434954c9109b2a90f0a" Oct 07 08:59:48 crc kubenswrapper[5025]: E1007 08:59:48.916635 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 08:59:59 crc kubenswrapper[5025]: I1007 08:59:59.914341 5025 scope.go:117] "RemoveContainer" containerID="967d33b665b57c7aac6745863f1d7aca2edc1314d7134434954c9109b2a90f0a" Oct 07 08:59:59 crc kubenswrapper[5025]: E1007 08:59:59.915110 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:00:00 crc kubenswrapper[5025]: I1007 09:00:00.149128 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330460-v6nc4"] Oct 07 09:00:00 crc kubenswrapper[5025]: E1007 09:00:00.149853 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c01dd09b-1dca-427d-a45f-9a5870172dd0" containerName="registry-server" Oct 07 09:00:00 crc kubenswrapper[5025]: I1007 09:00:00.149871 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="c01dd09b-1dca-427d-a45f-9a5870172dd0" containerName="registry-server" Oct 07 09:00:00 crc kubenswrapper[5025]: E1007 09:00:00.149883 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c01dd09b-1dca-427d-a45f-9a5870172dd0" containerName="extract-utilities" Oct 07 09:00:00 crc kubenswrapper[5025]: I1007 09:00:00.149890 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="c01dd09b-1dca-427d-a45f-9a5870172dd0" containerName="extract-utilities" Oct 07 09:00:00 crc kubenswrapper[5025]: E1007 09:00:00.149913 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c01dd09b-1dca-427d-a45f-9a5870172dd0" containerName="extract-content" Oct 07 09:00:00 crc kubenswrapper[5025]: I1007 09:00:00.149920 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="c01dd09b-1dca-427d-a45f-9a5870172dd0" containerName="extract-content" Oct 07 09:00:00 crc kubenswrapper[5025]: I1007 09:00:00.150064 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="c01dd09b-1dca-427d-a45f-9a5870172dd0" containerName="registry-server" Oct 07 09:00:00 crc kubenswrapper[5025]: I1007 09:00:00.150798 5025 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330460-v6nc4" Oct 07 09:00:00 crc kubenswrapper[5025]: I1007 09:00:00.153299 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 07 09:00:00 crc kubenswrapper[5025]: I1007 09:00:00.153877 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 07 09:00:00 crc kubenswrapper[5025]: I1007 09:00:00.156895 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330460-v6nc4"] Oct 07 09:00:00 crc kubenswrapper[5025]: I1007 09:00:00.280579 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/95a9558a-7fca-4f45-a15e-4da0737a2a0e-secret-volume\") pod \"collect-profiles-29330460-v6nc4\" (UID: \"95a9558a-7fca-4f45-a15e-4da0737a2a0e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330460-v6nc4" Oct 07 09:00:00 crc kubenswrapper[5025]: I1007 09:00:00.280667 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rqms\" (UniqueName: \"kubernetes.io/projected/95a9558a-7fca-4f45-a15e-4da0737a2a0e-kube-api-access-4rqms\") pod \"collect-profiles-29330460-v6nc4\" (UID: \"95a9558a-7fca-4f45-a15e-4da0737a2a0e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330460-v6nc4" Oct 07 09:00:00 crc kubenswrapper[5025]: I1007 09:00:00.280766 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/95a9558a-7fca-4f45-a15e-4da0737a2a0e-config-volume\") pod \"collect-profiles-29330460-v6nc4\" (UID: \"95a9558a-7fca-4f45-a15e-4da0737a2a0e\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29330460-v6nc4" Oct 07 09:00:00 crc kubenswrapper[5025]: I1007 09:00:00.381573 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rqms\" (UniqueName: \"kubernetes.io/projected/95a9558a-7fca-4f45-a15e-4da0737a2a0e-kube-api-access-4rqms\") pod \"collect-profiles-29330460-v6nc4\" (UID: \"95a9558a-7fca-4f45-a15e-4da0737a2a0e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330460-v6nc4" Oct 07 09:00:00 crc kubenswrapper[5025]: I1007 09:00:00.381633 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/95a9558a-7fca-4f45-a15e-4da0737a2a0e-config-volume\") pod \"collect-profiles-29330460-v6nc4\" (UID: \"95a9558a-7fca-4f45-a15e-4da0737a2a0e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330460-v6nc4" Oct 07 09:00:00 crc kubenswrapper[5025]: I1007 09:00:00.381683 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/95a9558a-7fca-4f45-a15e-4da0737a2a0e-secret-volume\") pod \"collect-profiles-29330460-v6nc4\" (UID: \"95a9558a-7fca-4f45-a15e-4da0737a2a0e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330460-v6nc4" Oct 07 09:00:00 crc kubenswrapper[5025]: I1007 09:00:00.383030 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/95a9558a-7fca-4f45-a15e-4da0737a2a0e-config-volume\") pod \"collect-profiles-29330460-v6nc4\" (UID: \"95a9558a-7fca-4f45-a15e-4da0737a2a0e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330460-v6nc4" Oct 07 09:00:00 crc kubenswrapper[5025]: I1007 09:00:00.389578 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/95a9558a-7fca-4f45-a15e-4da0737a2a0e-secret-volume\") pod \"collect-profiles-29330460-v6nc4\" (UID: \"95a9558a-7fca-4f45-a15e-4da0737a2a0e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330460-v6nc4" Oct 07 09:00:00 crc kubenswrapper[5025]: I1007 09:00:00.400234 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rqms\" (UniqueName: \"kubernetes.io/projected/95a9558a-7fca-4f45-a15e-4da0737a2a0e-kube-api-access-4rqms\") pod \"collect-profiles-29330460-v6nc4\" (UID: \"95a9558a-7fca-4f45-a15e-4da0737a2a0e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330460-v6nc4" Oct 07 09:00:00 crc kubenswrapper[5025]: I1007 09:00:00.474318 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330460-v6nc4" Oct 07 09:00:00 crc kubenswrapper[5025]: I1007 09:00:00.702822 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330460-v6nc4"] Oct 07 09:00:01 crc kubenswrapper[5025]: I1007 09:00:01.619243 5025 generic.go:334] "Generic (PLEG): container finished" podID="95a9558a-7fca-4f45-a15e-4da0737a2a0e" containerID="85cd37c653447285ca4a9b4f26959960b5e282a9410dc534ad907be0eb71dbb6" exitCode=0 Oct 07 09:00:01 crc kubenswrapper[5025]: I1007 09:00:01.619383 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330460-v6nc4" event={"ID":"95a9558a-7fca-4f45-a15e-4da0737a2a0e","Type":"ContainerDied","Data":"85cd37c653447285ca4a9b4f26959960b5e282a9410dc534ad907be0eb71dbb6"} Oct 07 09:00:01 crc kubenswrapper[5025]: I1007 09:00:01.619656 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330460-v6nc4" 
event={"ID":"95a9558a-7fca-4f45-a15e-4da0737a2a0e","Type":"ContainerStarted","Data":"31f73255e2661c00afe548cc27dfffe73092365a779a564573aba47b4a377545"} Oct 07 09:00:02 crc kubenswrapper[5025]: I1007 09:00:02.894507 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330460-v6nc4" Oct 07 09:00:02 crc kubenswrapper[5025]: I1007 09:00:02.926204 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/95a9558a-7fca-4f45-a15e-4da0737a2a0e-config-volume\") pod \"95a9558a-7fca-4f45-a15e-4da0737a2a0e\" (UID: \"95a9558a-7fca-4f45-a15e-4da0737a2a0e\") " Oct 07 09:00:02 crc kubenswrapper[5025]: I1007 09:00:02.926262 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/95a9558a-7fca-4f45-a15e-4da0737a2a0e-secret-volume\") pod \"95a9558a-7fca-4f45-a15e-4da0737a2a0e\" (UID: \"95a9558a-7fca-4f45-a15e-4da0737a2a0e\") " Oct 07 09:00:02 crc kubenswrapper[5025]: I1007 09:00:02.926337 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rqms\" (UniqueName: \"kubernetes.io/projected/95a9558a-7fca-4f45-a15e-4da0737a2a0e-kube-api-access-4rqms\") pod \"95a9558a-7fca-4f45-a15e-4da0737a2a0e\" (UID: \"95a9558a-7fca-4f45-a15e-4da0737a2a0e\") " Oct 07 09:00:02 crc kubenswrapper[5025]: I1007 09:00:02.927191 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95a9558a-7fca-4f45-a15e-4da0737a2a0e-config-volume" (OuterVolumeSpecName: "config-volume") pod "95a9558a-7fca-4f45-a15e-4da0737a2a0e" (UID: "95a9558a-7fca-4f45-a15e-4da0737a2a0e"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 09:00:02 crc kubenswrapper[5025]: I1007 09:00:02.931901 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95a9558a-7fca-4f45-a15e-4da0737a2a0e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "95a9558a-7fca-4f45-a15e-4da0737a2a0e" (UID: "95a9558a-7fca-4f45-a15e-4da0737a2a0e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 09:00:02 crc kubenswrapper[5025]: I1007 09:00:02.931930 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95a9558a-7fca-4f45-a15e-4da0737a2a0e-kube-api-access-4rqms" (OuterVolumeSpecName: "kube-api-access-4rqms") pod "95a9558a-7fca-4f45-a15e-4da0737a2a0e" (UID: "95a9558a-7fca-4f45-a15e-4da0737a2a0e"). InnerVolumeSpecName "kube-api-access-4rqms". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 09:00:03 crc kubenswrapper[5025]: I1007 09:00:03.027851 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rqms\" (UniqueName: \"kubernetes.io/projected/95a9558a-7fca-4f45-a15e-4da0737a2a0e-kube-api-access-4rqms\") on node \"crc\" DevicePath \"\"" Oct 07 09:00:03 crc kubenswrapper[5025]: I1007 09:00:03.027886 5025 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/95a9558a-7fca-4f45-a15e-4da0737a2a0e-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 09:00:03 crc kubenswrapper[5025]: I1007 09:00:03.027904 5025 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/95a9558a-7fca-4f45-a15e-4da0737a2a0e-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 07 09:00:03 crc kubenswrapper[5025]: I1007 09:00:03.634813 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330460-v6nc4" 
event={"ID":"95a9558a-7fca-4f45-a15e-4da0737a2a0e","Type":"ContainerDied","Data":"31f73255e2661c00afe548cc27dfffe73092365a779a564573aba47b4a377545"} Oct 07 09:00:03 crc kubenswrapper[5025]: I1007 09:00:03.634854 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330460-v6nc4" Oct 07 09:00:03 crc kubenswrapper[5025]: I1007 09:00:03.634858 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31f73255e2661c00afe548cc27dfffe73092365a779a564573aba47b4a377545" Oct 07 09:00:03 crc kubenswrapper[5025]: I1007 09:00:03.964075 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330415-w9ctz"] Oct 07 09:00:03 crc kubenswrapper[5025]: I1007 09:00:03.968604 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330415-w9ctz"] Oct 07 09:00:05 crc kubenswrapper[5025]: I1007 09:00:05.924794 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="385983c4-874d-4095-980c-c2d3763ce8e1" path="/var/lib/kubelet/pods/385983c4-874d-4095-980c-c2d3763ce8e1/volumes" Oct 07 09:00:12 crc kubenswrapper[5025]: I1007 09:00:12.914765 5025 scope.go:117] "RemoveContainer" containerID="967d33b665b57c7aac6745863f1d7aca2edc1314d7134434954c9109b2a90f0a" Oct 07 09:00:12 crc kubenswrapper[5025]: E1007 09:00:12.915495 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:00:25 crc kubenswrapper[5025]: I1007 09:00:25.914929 5025 scope.go:117] "RemoveContainer" 
containerID="967d33b665b57c7aac6745863f1d7aca2edc1314d7134434954c9109b2a90f0a" Oct 07 09:00:25 crc kubenswrapper[5025]: E1007 09:00:25.915712 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:00:38 crc kubenswrapper[5025]: I1007 09:00:38.859331 5025 scope.go:117] "RemoveContainer" containerID="7c67b4e23b0192671f5392e63c59ff39f04a074f5d8d975a0c03f4e5f8144751" Oct 07 09:00:40 crc kubenswrapper[5025]: I1007 09:00:40.915119 5025 scope.go:117] "RemoveContainer" containerID="967d33b665b57c7aac6745863f1d7aca2edc1314d7134434954c9109b2a90f0a" Oct 07 09:00:40 crc kubenswrapper[5025]: E1007 09:00:40.915603 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:00:53 crc kubenswrapper[5025]: I1007 09:00:53.924502 5025 scope.go:117] "RemoveContainer" containerID="967d33b665b57c7aac6745863f1d7aca2edc1314d7134434954c9109b2a90f0a" Oct 07 09:00:53 crc kubenswrapper[5025]: E1007 09:00:53.925417 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:01:08 crc kubenswrapper[5025]: I1007 09:01:08.267987 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-t6hft"] Oct 07 09:01:08 crc kubenswrapper[5025]: E1007 09:01:08.269343 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95a9558a-7fca-4f45-a15e-4da0737a2a0e" containerName="collect-profiles" Oct 07 09:01:08 crc kubenswrapper[5025]: I1007 09:01:08.269373 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="95a9558a-7fca-4f45-a15e-4da0737a2a0e" containerName="collect-profiles" Oct 07 09:01:08 crc kubenswrapper[5025]: I1007 09:01:08.269790 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="95a9558a-7fca-4f45-a15e-4da0737a2a0e" containerName="collect-profiles" Oct 07 09:01:08 crc kubenswrapper[5025]: I1007 09:01:08.272171 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t6hft" Oct 07 09:01:08 crc kubenswrapper[5025]: I1007 09:01:08.274767 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t6hft"] Oct 07 09:01:08 crc kubenswrapper[5025]: I1007 09:01:08.359487 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqdd5\" (UniqueName: \"kubernetes.io/projected/b1aac0fd-7a5d-4781-818f-a51e3d688c5c-kube-api-access-jqdd5\") pod \"certified-operators-t6hft\" (UID: \"b1aac0fd-7a5d-4781-818f-a51e3d688c5c\") " pod="openshift-marketplace/certified-operators-t6hft" Oct 07 09:01:08 crc kubenswrapper[5025]: I1007 09:01:08.359526 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1aac0fd-7a5d-4781-818f-a51e3d688c5c-utilities\") pod \"certified-operators-t6hft\" (UID: 
\"b1aac0fd-7a5d-4781-818f-a51e3d688c5c\") " pod="openshift-marketplace/certified-operators-t6hft" Oct 07 09:01:08 crc kubenswrapper[5025]: I1007 09:01:08.359634 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1aac0fd-7a5d-4781-818f-a51e3d688c5c-catalog-content\") pod \"certified-operators-t6hft\" (UID: \"b1aac0fd-7a5d-4781-818f-a51e3d688c5c\") " pod="openshift-marketplace/certified-operators-t6hft" Oct 07 09:01:08 crc kubenswrapper[5025]: I1007 09:01:08.460847 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1aac0fd-7a5d-4781-818f-a51e3d688c5c-catalog-content\") pod \"certified-operators-t6hft\" (UID: \"b1aac0fd-7a5d-4781-818f-a51e3d688c5c\") " pod="openshift-marketplace/certified-operators-t6hft" Oct 07 09:01:08 crc kubenswrapper[5025]: I1007 09:01:08.460978 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqdd5\" (UniqueName: \"kubernetes.io/projected/b1aac0fd-7a5d-4781-818f-a51e3d688c5c-kube-api-access-jqdd5\") pod \"certified-operators-t6hft\" (UID: \"b1aac0fd-7a5d-4781-818f-a51e3d688c5c\") " pod="openshift-marketplace/certified-operators-t6hft" Oct 07 09:01:08 crc kubenswrapper[5025]: I1007 09:01:08.461005 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1aac0fd-7a5d-4781-818f-a51e3d688c5c-utilities\") pod \"certified-operators-t6hft\" (UID: \"b1aac0fd-7a5d-4781-818f-a51e3d688c5c\") " pod="openshift-marketplace/certified-operators-t6hft" Oct 07 09:01:08 crc kubenswrapper[5025]: I1007 09:01:08.461767 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1aac0fd-7a5d-4781-818f-a51e3d688c5c-utilities\") pod \"certified-operators-t6hft\" (UID: 
\"b1aac0fd-7a5d-4781-818f-a51e3d688c5c\") " pod="openshift-marketplace/certified-operators-t6hft" Oct 07 09:01:08 crc kubenswrapper[5025]: I1007 09:01:08.461767 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1aac0fd-7a5d-4781-818f-a51e3d688c5c-catalog-content\") pod \"certified-operators-t6hft\" (UID: \"b1aac0fd-7a5d-4781-818f-a51e3d688c5c\") " pod="openshift-marketplace/certified-operators-t6hft" Oct 07 09:01:08 crc kubenswrapper[5025]: I1007 09:01:08.496321 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqdd5\" (UniqueName: \"kubernetes.io/projected/b1aac0fd-7a5d-4781-818f-a51e3d688c5c-kube-api-access-jqdd5\") pod \"certified-operators-t6hft\" (UID: \"b1aac0fd-7a5d-4781-818f-a51e3d688c5c\") " pod="openshift-marketplace/certified-operators-t6hft" Oct 07 09:01:08 crc kubenswrapper[5025]: I1007 09:01:08.631324 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t6hft" Oct 07 09:01:08 crc kubenswrapper[5025]: I1007 09:01:08.912807 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t6hft"] Oct 07 09:01:08 crc kubenswrapper[5025]: I1007 09:01:08.914982 5025 scope.go:117] "RemoveContainer" containerID="967d33b665b57c7aac6745863f1d7aca2edc1314d7134434954c9109b2a90f0a" Oct 07 09:01:08 crc kubenswrapper[5025]: E1007 09:01:08.915181 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:01:09 crc kubenswrapper[5025]: I1007 09:01:09.115650 5025 generic.go:334] "Generic (PLEG): container finished" podID="b1aac0fd-7a5d-4781-818f-a51e3d688c5c" containerID="cda8bb16768da822bdee3059a263770f10ffc8fb9e371cf9206eae584cf689c4" exitCode=0 Oct 07 09:01:09 crc kubenswrapper[5025]: I1007 09:01:09.115691 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t6hft" event={"ID":"b1aac0fd-7a5d-4781-818f-a51e3d688c5c","Type":"ContainerDied","Data":"cda8bb16768da822bdee3059a263770f10ffc8fb9e371cf9206eae584cf689c4"} Oct 07 09:01:09 crc kubenswrapper[5025]: I1007 09:01:09.115738 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t6hft" event={"ID":"b1aac0fd-7a5d-4781-818f-a51e3d688c5c","Type":"ContainerStarted","Data":"8a2f43bf7fcd632d3955e1b1902b8f975bc7ed6d56471bbd149ec807eb8acddb"} Oct 07 09:01:09 crc kubenswrapper[5025]: I1007 09:01:09.117157 5025 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 
09:01:10 crc kubenswrapper[5025]: I1007 09:01:10.125916 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t6hft" event={"ID":"b1aac0fd-7a5d-4781-818f-a51e3d688c5c","Type":"ContainerStarted","Data":"f8cb1e6a4414c2807010e0548a530857ed202fca20d5363a40a32e85b733df19"} Oct 07 09:01:11 crc kubenswrapper[5025]: I1007 09:01:11.134866 5025 generic.go:334] "Generic (PLEG): container finished" podID="b1aac0fd-7a5d-4781-818f-a51e3d688c5c" containerID="f8cb1e6a4414c2807010e0548a530857ed202fca20d5363a40a32e85b733df19" exitCode=0 Oct 07 09:01:11 crc kubenswrapper[5025]: I1007 09:01:11.134913 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t6hft" event={"ID":"b1aac0fd-7a5d-4781-818f-a51e3d688c5c","Type":"ContainerDied","Data":"f8cb1e6a4414c2807010e0548a530857ed202fca20d5363a40a32e85b733df19"} Oct 07 09:01:12 crc kubenswrapper[5025]: I1007 09:01:12.143948 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t6hft" event={"ID":"b1aac0fd-7a5d-4781-818f-a51e3d688c5c","Type":"ContainerStarted","Data":"e76ac4a94a425d10824516e39c7a50ddef8b463643addf470301e51e5fb59503"} Oct 07 09:01:18 crc kubenswrapper[5025]: I1007 09:01:18.632718 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-t6hft" Oct 07 09:01:18 crc kubenswrapper[5025]: I1007 09:01:18.633718 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-t6hft" Oct 07 09:01:18 crc kubenswrapper[5025]: I1007 09:01:18.692566 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-t6hft" Oct 07 09:01:18 crc kubenswrapper[5025]: I1007 09:01:18.719861 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-t6hft" 
podStartSLOduration=8.153220445 podStartE2EDuration="10.719841195s" podCreationTimestamp="2025-10-07 09:01:08 +0000 UTC" firstStartedPulling="2025-10-07 09:01:09.116892707 +0000 UTC m=+2675.926206841" lastFinishedPulling="2025-10-07 09:01:11.683513447 +0000 UTC m=+2678.492827591" observedRunningTime="2025-10-07 09:01:12.163970344 +0000 UTC m=+2678.973284528" watchObservedRunningTime="2025-10-07 09:01:18.719841195 +0000 UTC m=+2685.529155349" Oct 07 09:01:19 crc kubenswrapper[5025]: I1007 09:01:19.240690 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-t6hft" Oct 07 09:01:19 crc kubenswrapper[5025]: I1007 09:01:19.300150 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t6hft"] Oct 07 09:01:19 crc kubenswrapper[5025]: I1007 09:01:19.915128 5025 scope.go:117] "RemoveContainer" containerID="967d33b665b57c7aac6745863f1d7aca2edc1314d7134434954c9109b2a90f0a" Oct 07 09:01:19 crc kubenswrapper[5025]: E1007 09:01:19.915365 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:01:21 crc kubenswrapper[5025]: I1007 09:01:21.210979 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-t6hft" podUID="b1aac0fd-7a5d-4781-818f-a51e3d688c5c" containerName="registry-server" containerID="cri-o://e76ac4a94a425d10824516e39c7a50ddef8b463643addf470301e51e5fb59503" gracePeriod=2 Oct 07 09:01:21 crc kubenswrapper[5025]: I1007 09:01:21.597533 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t6hft" Oct 07 09:01:21 crc kubenswrapper[5025]: I1007 09:01:21.672872 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1aac0fd-7a5d-4781-818f-a51e3d688c5c-catalog-content\") pod \"b1aac0fd-7a5d-4781-818f-a51e3d688c5c\" (UID: \"b1aac0fd-7a5d-4781-818f-a51e3d688c5c\") " Oct 07 09:01:21 crc kubenswrapper[5025]: I1007 09:01:21.672952 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1aac0fd-7a5d-4781-818f-a51e3d688c5c-utilities\") pod \"b1aac0fd-7a5d-4781-818f-a51e3d688c5c\" (UID: \"b1aac0fd-7a5d-4781-818f-a51e3d688c5c\") " Oct 07 09:01:21 crc kubenswrapper[5025]: I1007 09:01:21.672990 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqdd5\" (UniqueName: \"kubernetes.io/projected/b1aac0fd-7a5d-4781-818f-a51e3d688c5c-kube-api-access-jqdd5\") pod \"b1aac0fd-7a5d-4781-818f-a51e3d688c5c\" (UID: \"b1aac0fd-7a5d-4781-818f-a51e3d688c5c\") " Oct 07 09:01:21 crc kubenswrapper[5025]: I1007 09:01:21.675977 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1aac0fd-7a5d-4781-818f-a51e3d688c5c-utilities" (OuterVolumeSpecName: "utilities") pod "b1aac0fd-7a5d-4781-818f-a51e3d688c5c" (UID: "b1aac0fd-7a5d-4781-818f-a51e3d688c5c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 09:01:21 crc kubenswrapper[5025]: I1007 09:01:21.686767 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1aac0fd-7a5d-4781-818f-a51e3d688c5c-kube-api-access-jqdd5" (OuterVolumeSpecName: "kube-api-access-jqdd5") pod "b1aac0fd-7a5d-4781-818f-a51e3d688c5c" (UID: "b1aac0fd-7a5d-4781-818f-a51e3d688c5c"). InnerVolumeSpecName "kube-api-access-jqdd5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 09:01:21 crc kubenswrapper[5025]: I1007 09:01:21.725328 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1aac0fd-7a5d-4781-818f-a51e3d688c5c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b1aac0fd-7a5d-4781-818f-a51e3d688c5c" (UID: "b1aac0fd-7a5d-4781-818f-a51e3d688c5c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 09:01:21 crc kubenswrapper[5025]: I1007 09:01:21.775461 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqdd5\" (UniqueName: \"kubernetes.io/projected/b1aac0fd-7a5d-4781-818f-a51e3d688c5c-kube-api-access-jqdd5\") on node \"crc\" DevicePath \"\"" Oct 07 09:01:21 crc kubenswrapper[5025]: I1007 09:01:21.775521 5025 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1aac0fd-7a5d-4781-818f-a51e3d688c5c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 09:01:21 crc kubenswrapper[5025]: I1007 09:01:21.775537 5025 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1aac0fd-7a5d-4781-818f-a51e3d688c5c-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 09:01:22 crc kubenswrapper[5025]: I1007 09:01:22.219421 5025 generic.go:334] "Generic (PLEG): container finished" podID="b1aac0fd-7a5d-4781-818f-a51e3d688c5c" containerID="e76ac4a94a425d10824516e39c7a50ddef8b463643addf470301e51e5fb59503" exitCode=0 Oct 07 09:01:22 crc kubenswrapper[5025]: I1007 09:01:22.219502 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t6hft" Oct 07 09:01:22 crc kubenswrapper[5025]: I1007 09:01:22.219523 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t6hft" event={"ID":"b1aac0fd-7a5d-4781-818f-a51e3d688c5c","Type":"ContainerDied","Data":"e76ac4a94a425d10824516e39c7a50ddef8b463643addf470301e51e5fb59503"} Oct 07 09:01:22 crc kubenswrapper[5025]: I1007 09:01:22.219973 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t6hft" event={"ID":"b1aac0fd-7a5d-4781-818f-a51e3d688c5c","Type":"ContainerDied","Data":"8a2f43bf7fcd632d3955e1b1902b8f975bc7ed6d56471bbd149ec807eb8acddb"} Oct 07 09:01:22 crc kubenswrapper[5025]: I1007 09:01:22.220013 5025 scope.go:117] "RemoveContainer" containerID="e76ac4a94a425d10824516e39c7a50ddef8b463643addf470301e51e5fb59503" Oct 07 09:01:22 crc kubenswrapper[5025]: I1007 09:01:22.243513 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t6hft"] Oct 07 09:01:22 crc kubenswrapper[5025]: I1007 09:01:22.251314 5025 scope.go:117] "RemoveContainer" containerID="f8cb1e6a4414c2807010e0548a530857ed202fca20d5363a40a32e85b733df19" Oct 07 09:01:22 crc kubenswrapper[5025]: I1007 09:01:22.255151 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-t6hft"] Oct 07 09:01:22 crc kubenswrapper[5025]: I1007 09:01:22.267193 5025 scope.go:117] "RemoveContainer" containerID="cda8bb16768da822bdee3059a263770f10ffc8fb9e371cf9206eae584cf689c4" Oct 07 09:01:22 crc kubenswrapper[5025]: I1007 09:01:22.293865 5025 scope.go:117] "RemoveContainer" containerID="e76ac4a94a425d10824516e39c7a50ddef8b463643addf470301e51e5fb59503" Oct 07 09:01:22 crc kubenswrapper[5025]: E1007 09:01:22.294268 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e76ac4a94a425d10824516e39c7a50ddef8b463643addf470301e51e5fb59503\": container with ID starting with e76ac4a94a425d10824516e39c7a50ddef8b463643addf470301e51e5fb59503 not found: ID does not exist" containerID="e76ac4a94a425d10824516e39c7a50ddef8b463643addf470301e51e5fb59503" Oct 07 09:01:22 crc kubenswrapper[5025]: I1007 09:01:22.294299 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e76ac4a94a425d10824516e39c7a50ddef8b463643addf470301e51e5fb59503"} err="failed to get container status \"e76ac4a94a425d10824516e39c7a50ddef8b463643addf470301e51e5fb59503\": rpc error: code = NotFound desc = could not find container \"e76ac4a94a425d10824516e39c7a50ddef8b463643addf470301e51e5fb59503\": container with ID starting with e76ac4a94a425d10824516e39c7a50ddef8b463643addf470301e51e5fb59503 not found: ID does not exist" Oct 07 09:01:22 crc kubenswrapper[5025]: I1007 09:01:22.294318 5025 scope.go:117] "RemoveContainer" containerID="f8cb1e6a4414c2807010e0548a530857ed202fca20d5363a40a32e85b733df19" Oct 07 09:01:22 crc kubenswrapper[5025]: E1007 09:01:22.294632 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8cb1e6a4414c2807010e0548a530857ed202fca20d5363a40a32e85b733df19\": container with ID starting with f8cb1e6a4414c2807010e0548a530857ed202fca20d5363a40a32e85b733df19 not found: ID does not exist" containerID="f8cb1e6a4414c2807010e0548a530857ed202fca20d5363a40a32e85b733df19" Oct 07 09:01:22 crc kubenswrapper[5025]: I1007 09:01:22.294658 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8cb1e6a4414c2807010e0548a530857ed202fca20d5363a40a32e85b733df19"} err="failed to get container status \"f8cb1e6a4414c2807010e0548a530857ed202fca20d5363a40a32e85b733df19\": rpc error: code = NotFound desc = could not find container \"f8cb1e6a4414c2807010e0548a530857ed202fca20d5363a40a32e85b733df19\": container with ID 
starting with f8cb1e6a4414c2807010e0548a530857ed202fca20d5363a40a32e85b733df19 not found: ID does not exist" Oct 07 09:01:22 crc kubenswrapper[5025]: I1007 09:01:22.294675 5025 scope.go:117] "RemoveContainer" containerID="cda8bb16768da822bdee3059a263770f10ffc8fb9e371cf9206eae584cf689c4" Oct 07 09:01:22 crc kubenswrapper[5025]: E1007 09:01:22.295192 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cda8bb16768da822bdee3059a263770f10ffc8fb9e371cf9206eae584cf689c4\": container with ID starting with cda8bb16768da822bdee3059a263770f10ffc8fb9e371cf9206eae584cf689c4 not found: ID does not exist" containerID="cda8bb16768da822bdee3059a263770f10ffc8fb9e371cf9206eae584cf689c4" Oct 07 09:01:22 crc kubenswrapper[5025]: I1007 09:01:22.295215 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cda8bb16768da822bdee3059a263770f10ffc8fb9e371cf9206eae584cf689c4"} err="failed to get container status \"cda8bb16768da822bdee3059a263770f10ffc8fb9e371cf9206eae584cf689c4\": rpc error: code = NotFound desc = could not find container \"cda8bb16768da822bdee3059a263770f10ffc8fb9e371cf9206eae584cf689c4\": container with ID starting with cda8bb16768da822bdee3059a263770f10ffc8fb9e371cf9206eae584cf689c4 not found: ID does not exist" Oct 07 09:01:23 crc kubenswrapper[5025]: I1007 09:01:23.926950 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1aac0fd-7a5d-4781-818f-a51e3d688c5c" path="/var/lib/kubelet/pods/b1aac0fd-7a5d-4781-818f-a51e3d688c5c/volumes" Oct 07 09:01:33 crc kubenswrapper[5025]: I1007 09:01:33.918648 5025 scope.go:117] "RemoveContainer" containerID="967d33b665b57c7aac6745863f1d7aca2edc1314d7134434954c9109b2a90f0a" Oct 07 09:01:34 crc kubenswrapper[5025]: I1007 09:01:34.311963 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" 
event={"ID":"b4849c41-22e1-400e-8e11-096da49ef1b2","Type":"ContainerStarted","Data":"fe800c18a8d341f800042ac77b698eed48f45b43ae3d96ec5c15efb400a54864"} Oct 07 09:02:54 crc kubenswrapper[5025]: I1007 09:02:54.024404 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ssb8r"] Oct 07 09:02:54 crc kubenswrapper[5025]: E1007 09:02:54.025353 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1aac0fd-7a5d-4781-818f-a51e3d688c5c" containerName="extract-content" Oct 07 09:02:54 crc kubenswrapper[5025]: I1007 09:02:54.025366 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1aac0fd-7a5d-4781-818f-a51e3d688c5c" containerName="extract-content" Oct 07 09:02:54 crc kubenswrapper[5025]: E1007 09:02:54.025392 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1aac0fd-7a5d-4781-818f-a51e3d688c5c" containerName="registry-server" Oct 07 09:02:54 crc kubenswrapper[5025]: I1007 09:02:54.025398 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1aac0fd-7a5d-4781-818f-a51e3d688c5c" containerName="registry-server" Oct 07 09:02:54 crc kubenswrapper[5025]: E1007 09:02:54.025420 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1aac0fd-7a5d-4781-818f-a51e3d688c5c" containerName="extract-utilities" Oct 07 09:02:54 crc kubenswrapper[5025]: I1007 09:02:54.025433 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1aac0fd-7a5d-4781-818f-a51e3d688c5c" containerName="extract-utilities" Oct 07 09:02:54 crc kubenswrapper[5025]: I1007 09:02:54.025626 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1aac0fd-7a5d-4781-818f-a51e3d688c5c" containerName="registry-server" Oct 07 09:02:54 crc kubenswrapper[5025]: I1007 09:02:54.026627 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ssb8r" Oct 07 09:02:54 crc kubenswrapper[5025]: I1007 09:02:54.039955 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ssb8r"] Oct 07 09:02:54 crc kubenswrapper[5025]: I1007 09:02:54.172068 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzlzh\" (UniqueName: \"kubernetes.io/projected/8ceafb33-3ab7-4190-bd9d-d21cb3ac0755-kube-api-access-tzlzh\") pod \"redhat-marketplace-ssb8r\" (UID: \"8ceafb33-3ab7-4190-bd9d-d21cb3ac0755\") " pod="openshift-marketplace/redhat-marketplace-ssb8r" Oct 07 09:02:54 crc kubenswrapper[5025]: I1007 09:02:54.172110 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ceafb33-3ab7-4190-bd9d-d21cb3ac0755-catalog-content\") pod \"redhat-marketplace-ssb8r\" (UID: \"8ceafb33-3ab7-4190-bd9d-d21cb3ac0755\") " pod="openshift-marketplace/redhat-marketplace-ssb8r" Oct 07 09:02:54 crc kubenswrapper[5025]: I1007 09:02:54.172136 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ceafb33-3ab7-4190-bd9d-d21cb3ac0755-utilities\") pod \"redhat-marketplace-ssb8r\" (UID: \"8ceafb33-3ab7-4190-bd9d-d21cb3ac0755\") " pod="openshift-marketplace/redhat-marketplace-ssb8r" Oct 07 09:02:54 crc kubenswrapper[5025]: I1007 09:02:54.273287 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzlzh\" (UniqueName: \"kubernetes.io/projected/8ceafb33-3ab7-4190-bd9d-d21cb3ac0755-kube-api-access-tzlzh\") pod \"redhat-marketplace-ssb8r\" (UID: \"8ceafb33-3ab7-4190-bd9d-d21cb3ac0755\") " pod="openshift-marketplace/redhat-marketplace-ssb8r" Oct 07 09:02:54 crc kubenswrapper[5025]: I1007 09:02:54.273334 5025 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ceafb33-3ab7-4190-bd9d-d21cb3ac0755-catalog-content\") pod \"redhat-marketplace-ssb8r\" (UID: \"8ceafb33-3ab7-4190-bd9d-d21cb3ac0755\") " pod="openshift-marketplace/redhat-marketplace-ssb8r" Oct 07 09:02:54 crc kubenswrapper[5025]: I1007 09:02:54.273360 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ceafb33-3ab7-4190-bd9d-d21cb3ac0755-utilities\") pod \"redhat-marketplace-ssb8r\" (UID: \"8ceafb33-3ab7-4190-bd9d-d21cb3ac0755\") " pod="openshift-marketplace/redhat-marketplace-ssb8r" Oct 07 09:02:54 crc kubenswrapper[5025]: I1007 09:02:54.273912 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ceafb33-3ab7-4190-bd9d-d21cb3ac0755-utilities\") pod \"redhat-marketplace-ssb8r\" (UID: \"8ceafb33-3ab7-4190-bd9d-d21cb3ac0755\") " pod="openshift-marketplace/redhat-marketplace-ssb8r" Oct 07 09:02:54 crc kubenswrapper[5025]: I1007 09:02:54.274034 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ceafb33-3ab7-4190-bd9d-d21cb3ac0755-catalog-content\") pod \"redhat-marketplace-ssb8r\" (UID: \"8ceafb33-3ab7-4190-bd9d-d21cb3ac0755\") " pod="openshift-marketplace/redhat-marketplace-ssb8r" Oct 07 09:02:54 crc kubenswrapper[5025]: I1007 09:02:54.292824 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzlzh\" (UniqueName: \"kubernetes.io/projected/8ceafb33-3ab7-4190-bd9d-d21cb3ac0755-kube-api-access-tzlzh\") pod \"redhat-marketplace-ssb8r\" (UID: \"8ceafb33-3ab7-4190-bd9d-d21cb3ac0755\") " pod="openshift-marketplace/redhat-marketplace-ssb8r" Oct 07 09:02:54 crc kubenswrapper[5025]: I1007 09:02:54.352063 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ssb8r" Oct 07 09:02:54 crc kubenswrapper[5025]: I1007 09:02:54.598251 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ssb8r"] Oct 07 09:02:54 crc kubenswrapper[5025]: W1007 09:02:54.614558 5025 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ceafb33_3ab7_4190_bd9d_d21cb3ac0755.slice/crio-716c043756743af7a7ccc836e234edf471ff8069bc2d812bd426935ece368aa3 WatchSource:0}: Error finding container 716c043756743af7a7ccc836e234edf471ff8069bc2d812bd426935ece368aa3: Status 404 returned error can't find the container with id 716c043756743af7a7ccc836e234edf471ff8069bc2d812bd426935ece368aa3 Oct 07 09:02:54 crc kubenswrapper[5025]: I1007 09:02:54.979283 5025 generic.go:334] "Generic (PLEG): container finished" podID="8ceafb33-3ab7-4190-bd9d-d21cb3ac0755" containerID="c76f36972e93aec6a845c902d8f5eab7f780a19d14a6292e3799116fd1957a29" exitCode=0 Oct 07 09:02:54 crc kubenswrapper[5025]: I1007 09:02:54.979336 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ssb8r" event={"ID":"8ceafb33-3ab7-4190-bd9d-d21cb3ac0755","Type":"ContainerDied","Data":"c76f36972e93aec6a845c902d8f5eab7f780a19d14a6292e3799116fd1957a29"} Oct 07 09:02:54 crc kubenswrapper[5025]: I1007 09:02:54.979367 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ssb8r" event={"ID":"8ceafb33-3ab7-4190-bd9d-d21cb3ac0755","Type":"ContainerStarted","Data":"716c043756743af7a7ccc836e234edf471ff8069bc2d812bd426935ece368aa3"} Oct 07 09:02:55 crc kubenswrapper[5025]: I1007 09:02:55.990518 5025 generic.go:334] "Generic (PLEG): container finished" podID="8ceafb33-3ab7-4190-bd9d-d21cb3ac0755" containerID="d8e2753eb218dd4890300afc3986caa2e096fe0edae452761e8b60acbc326f54" exitCode=0 Oct 07 09:02:55 crc kubenswrapper[5025]: I1007 
09:02:55.990865 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ssb8r" event={"ID":"8ceafb33-3ab7-4190-bd9d-d21cb3ac0755","Type":"ContainerDied","Data":"d8e2753eb218dd4890300afc3986caa2e096fe0edae452761e8b60acbc326f54"} Oct 07 09:02:56 crc kubenswrapper[5025]: I1007 09:02:56.998392 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ssb8r" event={"ID":"8ceafb33-3ab7-4190-bd9d-d21cb3ac0755","Type":"ContainerStarted","Data":"cea2e58aa2fe4ee4071a894564a5a07f7f8d3844951b2ea722fd69ad0d8fba43"} Oct 07 09:02:57 crc kubenswrapper[5025]: I1007 09:02:57.018290 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ssb8r" podStartSLOduration=1.455254369 podStartE2EDuration="3.018270066s" podCreationTimestamp="2025-10-07 09:02:54 +0000 UTC" firstStartedPulling="2025-10-07 09:02:54.980859276 +0000 UTC m=+2781.790173430" lastFinishedPulling="2025-10-07 09:02:56.543874973 +0000 UTC m=+2783.353189127" observedRunningTime="2025-10-07 09:02:57.015027473 +0000 UTC m=+2783.824341617" watchObservedRunningTime="2025-10-07 09:02:57.018270066 +0000 UTC m=+2783.827584210" Oct 07 09:03:04 crc kubenswrapper[5025]: I1007 09:03:04.352964 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ssb8r" Oct 07 09:03:04 crc kubenswrapper[5025]: I1007 09:03:04.353473 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ssb8r" Oct 07 09:03:04 crc kubenswrapper[5025]: I1007 09:03:04.404002 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ssb8r" Oct 07 09:03:05 crc kubenswrapper[5025]: I1007 09:03:05.129518 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ssb8r" Oct 07 
09:03:05 crc kubenswrapper[5025]: I1007 09:03:05.193567 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ssb8r"] Oct 07 09:03:07 crc kubenswrapper[5025]: I1007 09:03:07.067482 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ssb8r" podUID="8ceafb33-3ab7-4190-bd9d-d21cb3ac0755" containerName="registry-server" containerID="cri-o://cea2e58aa2fe4ee4071a894564a5a07f7f8d3844951b2ea722fd69ad0d8fba43" gracePeriod=2 Oct 07 09:03:07 crc kubenswrapper[5025]: I1007 09:03:07.558079 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ssb8r" Oct 07 09:03:07 crc kubenswrapper[5025]: I1007 09:03:07.587737 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ceafb33-3ab7-4190-bd9d-d21cb3ac0755-utilities\") pod \"8ceafb33-3ab7-4190-bd9d-d21cb3ac0755\" (UID: \"8ceafb33-3ab7-4190-bd9d-d21cb3ac0755\") " Oct 07 09:03:07 crc kubenswrapper[5025]: I1007 09:03:07.588190 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ceafb33-3ab7-4190-bd9d-d21cb3ac0755-catalog-content\") pod \"8ceafb33-3ab7-4190-bd9d-d21cb3ac0755\" (UID: \"8ceafb33-3ab7-4190-bd9d-d21cb3ac0755\") " Oct 07 09:03:07 crc kubenswrapper[5025]: I1007 09:03:07.588360 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzlzh\" (UniqueName: \"kubernetes.io/projected/8ceafb33-3ab7-4190-bd9d-d21cb3ac0755-kube-api-access-tzlzh\") pod \"8ceafb33-3ab7-4190-bd9d-d21cb3ac0755\" (UID: \"8ceafb33-3ab7-4190-bd9d-d21cb3ac0755\") " Oct 07 09:03:07 crc kubenswrapper[5025]: I1007 09:03:07.589146 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/8ceafb33-3ab7-4190-bd9d-d21cb3ac0755-utilities" (OuterVolumeSpecName: "utilities") pod "8ceafb33-3ab7-4190-bd9d-d21cb3ac0755" (UID: "8ceafb33-3ab7-4190-bd9d-d21cb3ac0755"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 09:03:07 crc kubenswrapper[5025]: I1007 09:03:07.590327 5025 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ceafb33-3ab7-4190-bd9d-d21cb3ac0755-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 09:03:07 crc kubenswrapper[5025]: I1007 09:03:07.597859 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ceafb33-3ab7-4190-bd9d-d21cb3ac0755-kube-api-access-tzlzh" (OuterVolumeSpecName: "kube-api-access-tzlzh") pod "8ceafb33-3ab7-4190-bd9d-d21cb3ac0755" (UID: "8ceafb33-3ab7-4190-bd9d-d21cb3ac0755"). InnerVolumeSpecName "kube-api-access-tzlzh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 09:03:07 crc kubenswrapper[5025]: I1007 09:03:07.605457 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ceafb33-3ab7-4190-bd9d-d21cb3ac0755-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8ceafb33-3ab7-4190-bd9d-d21cb3ac0755" (UID: "8ceafb33-3ab7-4190-bd9d-d21cb3ac0755"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 09:03:07 crc kubenswrapper[5025]: I1007 09:03:07.692857 5025 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ceafb33-3ab7-4190-bd9d-d21cb3ac0755-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 09:03:07 crc kubenswrapper[5025]: I1007 09:03:07.692903 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzlzh\" (UniqueName: \"kubernetes.io/projected/8ceafb33-3ab7-4190-bd9d-d21cb3ac0755-kube-api-access-tzlzh\") on node \"crc\" DevicePath \"\"" Oct 07 09:03:08 crc kubenswrapper[5025]: I1007 09:03:08.078074 5025 generic.go:334] "Generic (PLEG): container finished" podID="8ceafb33-3ab7-4190-bd9d-d21cb3ac0755" containerID="cea2e58aa2fe4ee4071a894564a5a07f7f8d3844951b2ea722fd69ad0d8fba43" exitCode=0 Oct 07 09:03:08 crc kubenswrapper[5025]: I1007 09:03:08.078122 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ssb8r" event={"ID":"8ceafb33-3ab7-4190-bd9d-d21cb3ac0755","Type":"ContainerDied","Data":"cea2e58aa2fe4ee4071a894564a5a07f7f8d3844951b2ea722fd69ad0d8fba43"} Oct 07 09:03:08 crc kubenswrapper[5025]: I1007 09:03:08.079228 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ssb8r" event={"ID":"8ceafb33-3ab7-4190-bd9d-d21cb3ac0755","Type":"ContainerDied","Data":"716c043756743af7a7ccc836e234edf471ff8069bc2d812bd426935ece368aa3"} Oct 07 09:03:08 crc kubenswrapper[5025]: I1007 09:03:08.078188 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ssb8r" Oct 07 09:03:08 crc kubenswrapper[5025]: I1007 09:03:08.079294 5025 scope.go:117] "RemoveContainer" containerID="cea2e58aa2fe4ee4071a894564a5a07f7f8d3844951b2ea722fd69ad0d8fba43" Oct 07 09:03:08 crc kubenswrapper[5025]: I1007 09:03:08.106231 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ssb8r"] Oct 07 09:03:08 crc kubenswrapper[5025]: I1007 09:03:08.109150 5025 scope.go:117] "RemoveContainer" containerID="d8e2753eb218dd4890300afc3986caa2e096fe0edae452761e8b60acbc326f54" Oct 07 09:03:08 crc kubenswrapper[5025]: I1007 09:03:08.113835 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ssb8r"] Oct 07 09:03:08 crc kubenswrapper[5025]: I1007 09:03:08.128221 5025 scope.go:117] "RemoveContainer" containerID="c76f36972e93aec6a845c902d8f5eab7f780a19d14a6292e3799116fd1957a29" Oct 07 09:03:08 crc kubenswrapper[5025]: I1007 09:03:08.152919 5025 scope.go:117] "RemoveContainer" containerID="cea2e58aa2fe4ee4071a894564a5a07f7f8d3844951b2ea722fd69ad0d8fba43" Oct 07 09:03:08 crc kubenswrapper[5025]: E1007 09:03:08.153268 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cea2e58aa2fe4ee4071a894564a5a07f7f8d3844951b2ea722fd69ad0d8fba43\": container with ID starting with cea2e58aa2fe4ee4071a894564a5a07f7f8d3844951b2ea722fd69ad0d8fba43 not found: ID does not exist" containerID="cea2e58aa2fe4ee4071a894564a5a07f7f8d3844951b2ea722fd69ad0d8fba43" Oct 07 09:03:08 crc kubenswrapper[5025]: I1007 09:03:08.153301 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cea2e58aa2fe4ee4071a894564a5a07f7f8d3844951b2ea722fd69ad0d8fba43"} err="failed to get container status \"cea2e58aa2fe4ee4071a894564a5a07f7f8d3844951b2ea722fd69ad0d8fba43\": rpc error: code = NotFound desc = could not find container 
\"cea2e58aa2fe4ee4071a894564a5a07f7f8d3844951b2ea722fd69ad0d8fba43\": container with ID starting with cea2e58aa2fe4ee4071a894564a5a07f7f8d3844951b2ea722fd69ad0d8fba43 not found: ID does not exist" Oct 07 09:03:08 crc kubenswrapper[5025]: I1007 09:03:08.153326 5025 scope.go:117] "RemoveContainer" containerID="d8e2753eb218dd4890300afc3986caa2e096fe0edae452761e8b60acbc326f54" Oct 07 09:03:08 crc kubenswrapper[5025]: E1007 09:03:08.153923 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8e2753eb218dd4890300afc3986caa2e096fe0edae452761e8b60acbc326f54\": container with ID starting with d8e2753eb218dd4890300afc3986caa2e096fe0edae452761e8b60acbc326f54 not found: ID does not exist" containerID="d8e2753eb218dd4890300afc3986caa2e096fe0edae452761e8b60acbc326f54" Oct 07 09:03:08 crc kubenswrapper[5025]: I1007 09:03:08.153978 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8e2753eb218dd4890300afc3986caa2e096fe0edae452761e8b60acbc326f54"} err="failed to get container status \"d8e2753eb218dd4890300afc3986caa2e096fe0edae452761e8b60acbc326f54\": rpc error: code = NotFound desc = could not find container \"d8e2753eb218dd4890300afc3986caa2e096fe0edae452761e8b60acbc326f54\": container with ID starting with d8e2753eb218dd4890300afc3986caa2e096fe0edae452761e8b60acbc326f54 not found: ID does not exist" Oct 07 09:03:08 crc kubenswrapper[5025]: I1007 09:03:08.154015 5025 scope.go:117] "RemoveContainer" containerID="c76f36972e93aec6a845c902d8f5eab7f780a19d14a6292e3799116fd1957a29" Oct 07 09:03:08 crc kubenswrapper[5025]: E1007 09:03:08.154296 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c76f36972e93aec6a845c902d8f5eab7f780a19d14a6292e3799116fd1957a29\": container with ID starting with c76f36972e93aec6a845c902d8f5eab7f780a19d14a6292e3799116fd1957a29 not found: ID does not exist" 
containerID="c76f36972e93aec6a845c902d8f5eab7f780a19d14a6292e3799116fd1957a29" Oct 07 09:03:08 crc kubenswrapper[5025]: I1007 09:03:08.154325 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c76f36972e93aec6a845c902d8f5eab7f780a19d14a6292e3799116fd1957a29"} err="failed to get container status \"c76f36972e93aec6a845c902d8f5eab7f780a19d14a6292e3799116fd1957a29\": rpc error: code = NotFound desc = could not find container \"c76f36972e93aec6a845c902d8f5eab7f780a19d14a6292e3799116fd1957a29\": container with ID starting with c76f36972e93aec6a845c902d8f5eab7f780a19d14a6292e3799116fd1957a29 not found: ID does not exist" Oct 07 09:03:09 crc kubenswrapper[5025]: I1007 09:03:09.926999 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ceafb33-3ab7-4190-bd9d-d21cb3ac0755" path="/var/lib/kubelet/pods/8ceafb33-3ab7-4190-bd9d-d21cb3ac0755/volumes" Oct 07 09:03:55 crc kubenswrapper[5025]: I1007 09:03:55.934691 5025 patch_prober.go:28] interesting pod/machine-config-daemon-2dj2t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 09:03:55 crc kubenswrapper[5025]: I1007 09:03:55.936042 5025 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 09:04:25 crc kubenswrapper[5025]: I1007 09:04:25.934677 5025 patch_prober.go:28] interesting pod/machine-config-daemon-2dj2t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Oct 07 09:04:25 crc kubenswrapper[5025]: I1007 09:04:25.936814 5025 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 09:04:51 crc kubenswrapper[5025]: I1007 09:04:51.085224 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8grrb"] Oct 07 09:04:51 crc kubenswrapper[5025]: E1007 09:04:51.086607 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ceafb33-3ab7-4190-bd9d-d21cb3ac0755" containerName="extract-content" Oct 07 09:04:51 crc kubenswrapper[5025]: I1007 09:04:51.086629 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ceafb33-3ab7-4190-bd9d-d21cb3ac0755" containerName="extract-content" Oct 07 09:04:51 crc kubenswrapper[5025]: E1007 09:04:51.086657 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ceafb33-3ab7-4190-bd9d-d21cb3ac0755" containerName="extract-utilities" Oct 07 09:04:51 crc kubenswrapper[5025]: I1007 09:04:51.086668 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ceafb33-3ab7-4190-bd9d-d21cb3ac0755" containerName="extract-utilities" Oct 07 09:04:51 crc kubenswrapper[5025]: E1007 09:04:51.086699 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ceafb33-3ab7-4190-bd9d-d21cb3ac0755" containerName="registry-server" Oct 07 09:04:51 crc kubenswrapper[5025]: I1007 09:04:51.086711 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ceafb33-3ab7-4190-bd9d-d21cb3ac0755" containerName="registry-server" Oct 07 09:04:51 crc kubenswrapper[5025]: I1007 09:04:51.086945 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ceafb33-3ab7-4190-bd9d-d21cb3ac0755" containerName="registry-server" Oct 07 09:04:51 
crc kubenswrapper[5025]: I1007 09:04:51.088864 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8grrb" Oct 07 09:04:51 crc kubenswrapper[5025]: I1007 09:04:51.094668 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5lcd\" (UniqueName: \"kubernetes.io/projected/72099819-ce9e-4b04-808e-53b8cd0005a7-kube-api-access-q5lcd\") pod \"redhat-operators-8grrb\" (UID: \"72099819-ce9e-4b04-808e-53b8cd0005a7\") " pod="openshift-marketplace/redhat-operators-8grrb" Oct 07 09:04:51 crc kubenswrapper[5025]: I1007 09:04:51.094808 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72099819-ce9e-4b04-808e-53b8cd0005a7-utilities\") pod \"redhat-operators-8grrb\" (UID: \"72099819-ce9e-4b04-808e-53b8cd0005a7\") " pod="openshift-marketplace/redhat-operators-8grrb" Oct 07 09:04:51 crc kubenswrapper[5025]: I1007 09:04:51.094898 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72099819-ce9e-4b04-808e-53b8cd0005a7-catalog-content\") pod \"redhat-operators-8grrb\" (UID: \"72099819-ce9e-4b04-808e-53b8cd0005a7\") " pod="openshift-marketplace/redhat-operators-8grrb" Oct 07 09:04:51 crc kubenswrapper[5025]: I1007 09:04:51.111285 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8grrb"] Oct 07 09:04:51 crc kubenswrapper[5025]: I1007 09:04:51.195411 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72099819-ce9e-4b04-808e-53b8cd0005a7-catalog-content\") pod \"redhat-operators-8grrb\" (UID: \"72099819-ce9e-4b04-808e-53b8cd0005a7\") " pod="openshift-marketplace/redhat-operators-8grrb" Oct 07 09:04:51 crc 
kubenswrapper[5025]: I1007 09:04:51.195488 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5lcd\" (UniqueName: \"kubernetes.io/projected/72099819-ce9e-4b04-808e-53b8cd0005a7-kube-api-access-q5lcd\") pod \"redhat-operators-8grrb\" (UID: \"72099819-ce9e-4b04-808e-53b8cd0005a7\") " pod="openshift-marketplace/redhat-operators-8grrb" Oct 07 09:04:51 crc kubenswrapper[5025]: I1007 09:04:51.195571 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72099819-ce9e-4b04-808e-53b8cd0005a7-utilities\") pod \"redhat-operators-8grrb\" (UID: \"72099819-ce9e-4b04-808e-53b8cd0005a7\") " pod="openshift-marketplace/redhat-operators-8grrb" Oct 07 09:04:51 crc kubenswrapper[5025]: I1007 09:04:51.196098 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72099819-ce9e-4b04-808e-53b8cd0005a7-utilities\") pod \"redhat-operators-8grrb\" (UID: \"72099819-ce9e-4b04-808e-53b8cd0005a7\") " pod="openshift-marketplace/redhat-operators-8grrb" Oct 07 09:04:51 crc kubenswrapper[5025]: I1007 09:04:51.197317 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72099819-ce9e-4b04-808e-53b8cd0005a7-catalog-content\") pod \"redhat-operators-8grrb\" (UID: \"72099819-ce9e-4b04-808e-53b8cd0005a7\") " pod="openshift-marketplace/redhat-operators-8grrb" Oct 07 09:04:51 crc kubenswrapper[5025]: I1007 09:04:51.220673 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5lcd\" (UniqueName: \"kubernetes.io/projected/72099819-ce9e-4b04-808e-53b8cd0005a7-kube-api-access-q5lcd\") pod \"redhat-operators-8grrb\" (UID: \"72099819-ce9e-4b04-808e-53b8cd0005a7\") " pod="openshift-marketplace/redhat-operators-8grrb" Oct 07 09:04:51 crc kubenswrapper[5025]: I1007 09:04:51.423399 5025 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8grrb" Oct 07 09:04:51 crc kubenswrapper[5025]: I1007 09:04:51.730900 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8grrb"] Oct 07 09:04:51 crc kubenswrapper[5025]: I1007 09:04:51.867788 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8grrb" event={"ID":"72099819-ce9e-4b04-808e-53b8cd0005a7","Type":"ContainerStarted","Data":"2c28c66ceb83ef8fabf2f17448a1e4504185d016086f89686ee2281ed50477c1"} Oct 07 09:04:52 crc kubenswrapper[5025]: I1007 09:04:52.876845 5025 generic.go:334] "Generic (PLEG): container finished" podID="72099819-ce9e-4b04-808e-53b8cd0005a7" containerID="cc585e1bc00c5d5284c720217023c8b243b0db068f7c46c5ab278ba100fc29cf" exitCode=0 Oct 07 09:04:52 crc kubenswrapper[5025]: I1007 09:04:52.876897 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8grrb" event={"ID":"72099819-ce9e-4b04-808e-53b8cd0005a7","Type":"ContainerDied","Data":"cc585e1bc00c5d5284c720217023c8b243b0db068f7c46c5ab278ba100fc29cf"} Oct 07 09:04:54 crc kubenswrapper[5025]: I1007 09:04:54.893008 5025 generic.go:334] "Generic (PLEG): container finished" podID="72099819-ce9e-4b04-808e-53b8cd0005a7" containerID="a450cb272d53e55d46b7f23655eeb43314064c6004610abbaa64b4ab1e631ea6" exitCode=0 Oct 07 09:04:54 crc kubenswrapper[5025]: I1007 09:04:54.893073 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8grrb" event={"ID":"72099819-ce9e-4b04-808e-53b8cd0005a7","Type":"ContainerDied","Data":"a450cb272d53e55d46b7f23655eeb43314064c6004610abbaa64b4ab1e631ea6"} Oct 07 09:04:55 crc kubenswrapper[5025]: I1007 09:04:55.903368 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8grrb" 
event={"ID":"72099819-ce9e-4b04-808e-53b8cd0005a7","Type":"ContainerStarted","Data":"f4f5f58914128f7c581c9a68c37578911829e1aba9486e18e72fe16a2a3f5ccb"} Oct 07 09:04:55 crc kubenswrapper[5025]: I1007 09:04:55.933815 5025 patch_prober.go:28] interesting pod/machine-config-daemon-2dj2t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 09:04:55 crc kubenswrapper[5025]: I1007 09:04:55.933859 5025 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 09:04:55 crc kubenswrapper[5025]: I1007 09:04:55.940899 5025 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" Oct 07 09:04:55 crc kubenswrapper[5025]: I1007 09:04:55.941676 5025 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fe800c18a8d341f800042ac77b698eed48f45b43ae3d96ec5c15efb400a54864"} pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 09:04:55 crc kubenswrapper[5025]: I1007 09:04:55.941735 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" containerID="cri-o://fe800c18a8d341f800042ac77b698eed48f45b43ae3d96ec5c15efb400a54864" gracePeriod=600 Oct 07 09:04:55 crc kubenswrapper[5025]: I1007 09:04:55.943713 
5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8grrb" podStartSLOduration=2.488624893 podStartE2EDuration="4.943701098s" podCreationTimestamp="2025-10-07 09:04:51 +0000 UTC" firstStartedPulling="2025-10-07 09:04:52.878446052 +0000 UTC m=+2899.687760206" lastFinishedPulling="2025-10-07 09:04:55.333522267 +0000 UTC m=+2902.142836411" observedRunningTime="2025-10-07 09:04:55.941114506 +0000 UTC m=+2902.750428650" watchObservedRunningTime="2025-10-07 09:04:55.943701098 +0000 UTC m=+2902.753015242" Oct 07 09:04:56 crc kubenswrapper[5025]: I1007 09:04:56.912951 5025 generic.go:334] "Generic (PLEG): container finished" podID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerID="fe800c18a8d341f800042ac77b698eed48f45b43ae3d96ec5c15efb400a54864" exitCode=0 Oct 07 09:04:56 crc kubenswrapper[5025]: I1007 09:04:56.913043 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" event={"ID":"b4849c41-22e1-400e-8e11-096da49ef1b2","Type":"ContainerDied","Data":"fe800c18a8d341f800042ac77b698eed48f45b43ae3d96ec5c15efb400a54864"} Oct 07 09:04:56 crc kubenswrapper[5025]: I1007 09:04:56.913621 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" event={"ID":"b4849c41-22e1-400e-8e11-096da49ef1b2","Type":"ContainerStarted","Data":"cd39ce4892b6706e193e216b5c562971ededa4bd53a3ddd3b48f8014398f9f87"} Oct 07 09:04:56 crc kubenswrapper[5025]: I1007 09:04:56.913678 5025 scope.go:117] "RemoveContainer" containerID="967d33b665b57c7aac6745863f1d7aca2edc1314d7134434954c9109b2a90f0a" Oct 07 09:05:01 crc kubenswrapper[5025]: I1007 09:05:01.424032 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8grrb" Oct 07 09:05:01 crc kubenswrapper[5025]: I1007 09:05:01.425885 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-8grrb" Oct 07 09:05:01 crc kubenswrapper[5025]: I1007 09:05:01.483572 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8grrb" Oct 07 09:05:01 crc kubenswrapper[5025]: I1007 09:05:01.991741 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8grrb" Oct 07 09:05:02 crc kubenswrapper[5025]: I1007 09:05:02.031169 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8grrb"] Oct 07 09:05:03 crc kubenswrapper[5025]: I1007 09:05:03.958011 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8grrb" podUID="72099819-ce9e-4b04-808e-53b8cd0005a7" containerName="registry-server" containerID="cri-o://f4f5f58914128f7c581c9a68c37578911829e1aba9486e18e72fe16a2a3f5ccb" gracePeriod=2 Oct 07 09:05:04 crc kubenswrapper[5025]: I1007 09:05:04.331413 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8grrb" Oct 07 09:05:04 crc kubenswrapper[5025]: I1007 09:05:04.483006 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5lcd\" (UniqueName: \"kubernetes.io/projected/72099819-ce9e-4b04-808e-53b8cd0005a7-kube-api-access-q5lcd\") pod \"72099819-ce9e-4b04-808e-53b8cd0005a7\" (UID: \"72099819-ce9e-4b04-808e-53b8cd0005a7\") " Oct 07 09:05:04 crc kubenswrapper[5025]: I1007 09:05:04.483236 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72099819-ce9e-4b04-808e-53b8cd0005a7-utilities\") pod \"72099819-ce9e-4b04-808e-53b8cd0005a7\" (UID: \"72099819-ce9e-4b04-808e-53b8cd0005a7\") " Oct 07 09:05:04 crc kubenswrapper[5025]: I1007 09:05:04.483679 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72099819-ce9e-4b04-808e-53b8cd0005a7-catalog-content\") pod \"72099819-ce9e-4b04-808e-53b8cd0005a7\" (UID: \"72099819-ce9e-4b04-808e-53b8cd0005a7\") " Oct 07 09:05:04 crc kubenswrapper[5025]: I1007 09:05:04.484022 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72099819-ce9e-4b04-808e-53b8cd0005a7-utilities" (OuterVolumeSpecName: "utilities") pod "72099819-ce9e-4b04-808e-53b8cd0005a7" (UID: "72099819-ce9e-4b04-808e-53b8cd0005a7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 09:05:04 crc kubenswrapper[5025]: I1007 09:05:04.489424 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72099819-ce9e-4b04-808e-53b8cd0005a7-kube-api-access-q5lcd" (OuterVolumeSpecName: "kube-api-access-q5lcd") pod "72099819-ce9e-4b04-808e-53b8cd0005a7" (UID: "72099819-ce9e-4b04-808e-53b8cd0005a7"). InnerVolumeSpecName "kube-api-access-q5lcd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 09:05:04 crc kubenswrapper[5025]: I1007 09:05:04.584429 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5lcd\" (UniqueName: \"kubernetes.io/projected/72099819-ce9e-4b04-808e-53b8cd0005a7-kube-api-access-q5lcd\") on node \"crc\" DevicePath \"\"" Oct 07 09:05:04 crc kubenswrapper[5025]: I1007 09:05:04.584468 5025 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72099819-ce9e-4b04-808e-53b8cd0005a7-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 09:05:04 crc kubenswrapper[5025]: I1007 09:05:04.970087 5025 generic.go:334] "Generic (PLEG): container finished" podID="72099819-ce9e-4b04-808e-53b8cd0005a7" containerID="f4f5f58914128f7c581c9a68c37578911829e1aba9486e18e72fe16a2a3f5ccb" exitCode=0 Oct 07 09:05:04 crc kubenswrapper[5025]: I1007 09:05:04.970145 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8grrb" event={"ID":"72099819-ce9e-4b04-808e-53b8cd0005a7","Type":"ContainerDied","Data":"f4f5f58914128f7c581c9a68c37578911829e1aba9486e18e72fe16a2a3f5ccb"} Oct 07 09:05:04 crc kubenswrapper[5025]: I1007 09:05:04.970183 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8grrb" event={"ID":"72099819-ce9e-4b04-808e-53b8cd0005a7","Type":"ContainerDied","Data":"2c28c66ceb83ef8fabf2f17448a1e4504185d016086f89686ee2281ed50477c1"} Oct 07 09:05:04 crc kubenswrapper[5025]: I1007 09:05:04.970206 5025 scope.go:117] "RemoveContainer" containerID="f4f5f58914128f7c581c9a68c37578911829e1aba9486e18e72fe16a2a3f5ccb" Oct 07 09:05:04 crc kubenswrapper[5025]: I1007 09:05:04.970265 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8grrb" Oct 07 09:05:04 crc kubenswrapper[5025]: I1007 09:05:04.999359 5025 scope.go:117] "RemoveContainer" containerID="a450cb272d53e55d46b7f23655eeb43314064c6004610abbaa64b4ab1e631ea6" Oct 07 09:05:05 crc kubenswrapper[5025]: I1007 09:05:05.018637 5025 scope.go:117] "RemoveContainer" containerID="cc585e1bc00c5d5284c720217023c8b243b0db068f7c46c5ab278ba100fc29cf" Oct 07 09:05:05 crc kubenswrapper[5025]: I1007 09:05:05.044758 5025 scope.go:117] "RemoveContainer" containerID="f4f5f58914128f7c581c9a68c37578911829e1aba9486e18e72fe16a2a3f5ccb" Oct 07 09:05:05 crc kubenswrapper[5025]: E1007 09:05:05.045412 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4f5f58914128f7c581c9a68c37578911829e1aba9486e18e72fe16a2a3f5ccb\": container with ID starting with f4f5f58914128f7c581c9a68c37578911829e1aba9486e18e72fe16a2a3f5ccb not found: ID does not exist" containerID="f4f5f58914128f7c581c9a68c37578911829e1aba9486e18e72fe16a2a3f5ccb" Oct 07 09:05:05 crc kubenswrapper[5025]: I1007 09:05:05.045449 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4f5f58914128f7c581c9a68c37578911829e1aba9486e18e72fe16a2a3f5ccb"} err="failed to get container status \"f4f5f58914128f7c581c9a68c37578911829e1aba9486e18e72fe16a2a3f5ccb\": rpc error: code = NotFound desc = could not find container \"f4f5f58914128f7c581c9a68c37578911829e1aba9486e18e72fe16a2a3f5ccb\": container with ID starting with f4f5f58914128f7c581c9a68c37578911829e1aba9486e18e72fe16a2a3f5ccb not found: ID does not exist" Oct 07 09:05:05 crc kubenswrapper[5025]: I1007 09:05:05.045475 5025 scope.go:117] "RemoveContainer" containerID="a450cb272d53e55d46b7f23655eeb43314064c6004610abbaa64b4ab1e631ea6" Oct 07 09:05:05 crc kubenswrapper[5025]: E1007 09:05:05.046806 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"a450cb272d53e55d46b7f23655eeb43314064c6004610abbaa64b4ab1e631ea6\": container with ID starting with a450cb272d53e55d46b7f23655eeb43314064c6004610abbaa64b4ab1e631ea6 not found: ID does not exist" containerID="a450cb272d53e55d46b7f23655eeb43314064c6004610abbaa64b4ab1e631ea6" Oct 07 09:05:05 crc kubenswrapper[5025]: I1007 09:05:05.046857 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a450cb272d53e55d46b7f23655eeb43314064c6004610abbaa64b4ab1e631ea6"} err="failed to get container status \"a450cb272d53e55d46b7f23655eeb43314064c6004610abbaa64b4ab1e631ea6\": rpc error: code = NotFound desc = could not find container \"a450cb272d53e55d46b7f23655eeb43314064c6004610abbaa64b4ab1e631ea6\": container with ID starting with a450cb272d53e55d46b7f23655eeb43314064c6004610abbaa64b4ab1e631ea6 not found: ID does not exist" Oct 07 09:05:05 crc kubenswrapper[5025]: I1007 09:05:05.046894 5025 scope.go:117] "RemoveContainer" containerID="cc585e1bc00c5d5284c720217023c8b243b0db068f7c46c5ab278ba100fc29cf" Oct 07 09:05:05 crc kubenswrapper[5025]: E1007 09:05:05.047314 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc585e1bc00c5d5284c720217023c8b243b0db068f7c46c5ab278ba100fc29cf\": container with ID starting with cc585e1bc00c5d5284c720217023c8b243b0db068f7c46c5ab278ba100fc29cf not found: ID does not exist" containerID="cc585e1bc00c5d5284c720217023c8b243b0db068f7c46c5ab278ba100fc29cf" Oct 07 09:05:05 crc kubenswrapper[5025]: I1007 09:05:05.047348 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc585e1bc00c5d5284c720217023c8b243b0db068f7c46c5ab278ba100fc29cf"} err="failed to get container status \"cc585e1bc00c5d5284c720217023c8b243b0db068f7c46c5ab278ba100fc29cf\": rpc error: code = NotFound desc = could not find container 
\"cc585e1bc00c5d5284c720217023c8b243b0db068f7c46c5ab278ba100fc29cf\": container with ID starting with cc585e1bc00c5d5284c720217023c8b243b0db068f7c46c5ab278ba100fc29cf not found: ID does not exist" Oct 07 09:05:05 crc kubenswrapper[5025]: I1007 09:05:05.148404 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72099819-ce9e-4b04-808e-53b8cd0005a7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "72099819-ce9e-4b04-808e-53b8cd0005a7" (UID: "72099819-ce9e-4b04-808e-53b8cd0005a7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 09:05:05 crc kubenswrapper[5025]: I1007 09:05:05.191745 5025 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72099819-ce9e-4b04-808e-53b8cd0005a7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 09:05:05 crc kubenswrapper[5025]: I1007 09:05:05.311525 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8grrb"] Oct 07 09:05:05 crc kubenswrapper[5025]: I1007 09:05:05.319358 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8grrb"] Oct 07 09:05:05 crc kubenswrapper[5025]: I1007 09:05:05.923937 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72099819-ce9e-4b04-808e-53b8cd0005a7" path="/var/lib/kubelet/pods/72099819-ce9e-4b04-808e-53b8cd0005a7/volumes" Oct 07 09:05:54 crc kubenswrapper[5025]: I1007 09:05:54.169790 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tvx7v"] Oct 07 09:05:54 crc kubenswrapper[5025]: E1007 09:05:54.171311 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72099819-ce9e-4b04-808e-53b8cd0005a7" containerName="extract-utilities" Oct 07 09:05:54 crc kubenswrapper[5025]: I1007 09:05:54.171333 5025 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="72099819-ce9e-4b04-808e-53b8cd0005a7" containerName="extract-utilities" Oct 07 09:05:54 crc kubenswrapper[5025]: E1007 09:05:54.171361 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72099819-ce9e-4b04-808e-53b8cd0005a7" containerName="extract-content" Oct 07 09:05:54 crc kubenswrapper[5025]: I1007 09:05:54.171370 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="72099819-ce9e-4b04-808e-53b8cd0005a7" containerName="extract-content" Oct 07 09:05:54 crc kubenswrapper[5025]: E1007 09:05:54.171401 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72099819-ce9e-4b04-808e-53b8cd0005a7" containerName="registry-server" Oct 07 09:05:54 crc kubenswrapper[5025]: I1007 09:05:54.171411 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="72099819-ce9e-4b04-808e-53b8cd0005a7" containerName="registry-server" Oct 07 09:05:54 crc kubenswrapper[5025]: I1007 09:05:54.171656 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="72099819-ce9e-4b04-808e-53b8cd0005a7" containerName="registry-server" Oct 07 09:05:54 crc kubenswrapper[5025]: I1007 09:05:54.173253 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tvx7v" Oct 07 09:05:54 crc kubenswrapper[5025]: I1007 09:05:54.223244 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tvx7v"] Oct 07 09:05:54 crc kubenswrapper[5025]: I1007 09:05:54.310147 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znk6t\" (UniqueName: \"kubernetes.io/projected/2ea012ed-a004-4f18-83a4-bd7bb82794a5-kube-api-access-znk6t\") pod \"community-operators-tvx7v\" (UID: \"2ea012ed-a004-4f18-83a4-bd7bb82794a5\") " pod="openshift-marketplace/community-operators-tvx7v" Oct 07 09:05:54 crc kubenswrapper[5025]: I1007 09:05:54.310238 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ea012ed-a004-4f18-83a4-bd7bb82794a5-utilities\") pod \"community-operators-tvx7v\" (UID: \"2ea012ed-a004-4f18-83a4-bd7bb82794a5\") " pod="openshift-marketplace/community-operators-tvx7v" Oct 07 09:05:54 crc kubenswrapper[5025]: I1007 09:05:54.310271 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ea012ed-a004-4f18-83a4-bd7bb82794a5-catalog-content\") pod \"community-operators-tvx7v\" (UID: \"2ea012ed-a004-4f18-83a4-bd7bb82794a5\") " pod="openshift-marketplace/community-operators-tvx7v" Oct 07 09:05:54 crc kubenswrapper[5025]: I1007 09:05:54.411473 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znk6t\" (UniqueName: \"kubernetes.io/projected/2ea012ed-a004-4f18-83a4-bd7bb82794a5-kube-api-access-znk6t\") pod \"community-operators-tvx7v\" (UID: \"2ea012ed-a004-4f18-83a4-bd7bb82794a5\") " pod="openshift-marketplace/community-operators-tvx7v" Oct 07 09:05:54 crc kubenswrapper[5025]: I1007 09:05:54.411561 5025 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ea012ed-a004-4f18-83a4-bd7bb82794a5-utilities\") pod \"community-operators-tvx7v\" (UID: \"2ea012ed-a004-4f18-83a4-bd7bb82794a5\") " pod="openshift-marketplace/community-operators-tvx7v" Oct 07 09:05:54 crc kubenswrapper[5025]: I1007 09:05:54.411583 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ea012ed-a004-4f18-83a4-bd7bb82794a5-catalog-content\") pod \"community-operators-tvx7v\" (UID: \"2ea012ed-a004-4f18-83a4-bd7bb82794a5\") " pod="openshift-marketplace/community-operators-tvx7v" Oct 07 09:05:54 crc kubenswrapper[5025]: I1007 09:05:54.412239 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ea012ed-a004-4f18-83a4-bd7bb82794a5-utilities\") pod \"community-operators-tvx7v\" (UID: \"2ea012ed-a004-4f18-83a4-bd7bb82794a5\") " pod="openshift-marketplace/community-operators-tvx7v" Oct 07 09:05:54 crc kubenswrapper[5025]: I1007 09:05:54.412295 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ea012ed-a004-4f18-83a4-bd7bb82794a5-catalog-content\") pod \"community-operators-tvx7v\" (UID: \"2ea012ed-a004-4f18-83a4-bd7bb82794a5\") " pod="openshift-marketplace/community-operators-tvx7v" Oct 07 09:05:54 crc kubenswrapper[5025]: I1007 09:05:54.434847 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znk6t\" (UniqueName: \"kubernetes.io/projected/2ea012ed-a004-4f18-83a4-bd7bb82794a5-kube-api-access-znk6t\") pod \"community-operators-tvx7v\" (UID: \"2ea012ed-a004-4f18-83a4-bd7bb82794a5\") " pod="openshift-marketplace/community-operators-tvx7v" Oct 07 09:05:54 crc kubenswrapper[5025]: I1007 09:05:54.545992 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tvx7v" Oct 07 09:05:54 crc kubenswrapper[5025]: I1007 09:05:54.920849 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tvx7v"] Oct 07 09:05:55 crc kubenswrapper[5025]: I1007 09:05:55.372065 5025 generic.go:334] "Generic (PLEG): container finished" podID="2ea012ed-a004-4f18-83a4-bd7bb82794a5" containerID="672240e9c4a54639f5bcb7c38820b8d14fc8903ec82ad67fb18c8ee0ed411141" exitCode=0 Oct 07 09:05:55 crc kubenswrapper[5025]: I1007 09:05:55.372137 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tvx7v" event={"ID":"2ea012ed-a004-4f18-83a4-bd7bb82794a5","Type":"ContainerDied","Data":"672240e9c4a54639f5bcb7c38820b8d14fc8903ec82ad67fb18c8ee0ed411141"} Oct 07 09:05:55 crc kubenswrapper[5025]: I1007 09:05:55.372177 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tvx7v" event={"ID":"2ea012ed-a004-4f18-83a4-bd7bb82794a5","Type":"ContainerStarted","Data":"a84727a3d0830ed5a013dff2b905d2ec00e8c75aad093d2b179fc2595981164e"} Oct 07 09:05:56 crc kubenswrapper[5025]: I1007 09:05:56.382731 5025 generic.go:334] "Generic (PLEG): container finished" podID="2ea012ed-a004-4f18-83a4-bd7bb82794a5" containerID="8d62ed3d66986777a948d8cf15b21a03d7a104154ea2b8420888185ae3e24df3" exitCode=0 Oct 07 09:05:56 crc kubenswrapper[5025]: I1007 09:05:56.382809 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tvx7v" event={"ID":"2ea012ed-a004-4f18-83a4-bd7bb82794a5","Type":"ContainerDied","Data":"8d62ed3d66986777a948d8cf15b21a03d7a104154ea2b8420888185ae3e24df3"} Oct 07 09:05:57 crc kubenswrapper[5025]: I1007 09:05:57.394381 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tvx7v" 
event={"ID":"2ea012ed-a004-4f18-83a4-bd7bb82794a5","Type":"ContainerStarted","Data":"5994acb3778064bc07097f3a2a310da5cd7951a7cab5fc5cb0b88ff4c6536d08"} Oct 07 09:05:57 crc kubenswrapper[5025]: I1007 09:05:57.416696 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tvx7v" podStartSLOduration=1.847482892 podStartE2EDuration="3.416661363s" podCreationTimestamp="2025-10-07 09:05:54 +0000 UTC" firstStartedPulling="2025-10-07 09:05:55.37370976 +0000 UTC m=+2962.183023904" lastFinishedPulling="2025-10-07 09:05:56.942888231 +0000 UTC m=+2963.752202375" observedRunningTime="2025-10-07 09:05:57.413843044 +0000 UTC m=+2964.223157188" watchObservedRunningTime="2025-10-07 09:05:57.416661363 +0000 UTC m=+2964.225975527" Oct 07 09:06:04 crc kubenswrapper[5025]: I1007 09:06:04.546934 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tvx7v" Oct 07 09:06:04 crc kubenswrapper[5025]: I1007 09:06:04.547669 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tvx7v" Oct 07 09:06:04 crc kubenswrapper[5025]: I1007 09:06:04.590760 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tvx7v" Oct 07 09:06:05 crc kubenswrapper[5025]: I1007 09:06:05.519691 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tvx7v" Oct 07 09:06:05 crc kubenswrapper[5025]: I1007 09:06:05.567599 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tvx7v"] Oct 07 09:06:07 crc kubenswrapper[5025]: I1007 09:06:07.484528 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tvx7v" podUID="2ea012ed-a004-4f18-83a4-bd7bb82794a5" containerName="registry-server" 
containerID="cri-o://5994acb3778064bc07097f3a2a310da5cd7951a7cab5fc5cb0b88ff4c6536d08" gracePeriod=2 Oct 07 09:06:08 crc kubenswrapper[5025]: I1007 09:06:08.492951 5025 generic.go:334] "Generic (PLEG): container finished" podID="2ea012ed-a004-4f18-83a4-bd7bb82794a5" containerID="5994acb3778064bc07097f3a2a310da5cd7951a7cab5fc5cb0b88ff4c6536d08" exitCode=0 Oct 07 09:06:08 crc kubenswrapper[5025]: I1007 09:06:08.493044 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tvx7v" event={"ID":"2ea012ed-a004-4f18-83a4-bd7bb82794a5","Type":"ContainerDied","Data":"5994acb3778064bc07097f3a2a310da5cd7951a7cab5fc5cb0b88ff4c6536d08"} Oct 07 09:06:08 crc kubenswrapper[5025]: I1007 09:06:08.599240 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tvx7v" Oct 07 09:06:08 crc kubenswrapper[5025]: I1007 09:06:08.661572 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znk6t\" (UniqueName: \"kubernetes.io/projected/2ea012ed-a004-4f18-83a4-bd7bb82794a5-kube-api-access-znk6t\") pod \"2ea012ed-a004-4f18-83a4-bd7bb82794a5\" (UID: \"2ea012ed-a004-4f18-83a4-bd7bb82794a5\") " Oct 07 09:06:08 crc kubenswrapper[5025]: I1007 09:06:08.661697 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ea012ed-a004-4f18-83a4-bd7bb82794a5-utilities\") pod \"2ea012ed-a004-4f18-83a4-bd7bb82794a5\" (UID: \"2ea012ed-a004-4f18-83a4-bd7bb82794a5\") " Oct 07 09:06:08 crc kubenswrapper[5025]: I1007 09:06:08.661945 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ea012ed-a004-4f18-83a4-bd7bb82794a5-catalog-content\") pod \"2ea012ed-a004-4f18-83a4-bd7bb82794a5\" (UID: \"2ea012ed-a004-4f18-83a4-bd7bb82794a5\") " Oct 07 09:06:08 crc kubenswrapper[5025]: I1007 
09:06:08.663322 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ea012ed-a004-4f18-83a4-bd7bb82794a5-utilities" (OuterVolumeSpecName: "utilities") pod "2ea012ed-a004-4f18-83a4-bd7bb82794a5" (UID: "2ea012ed-a004-4f18-83a4-bd7bb82794a5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 09:06:08 crc kubenswrapper[5025]: I1007 09:06:08.669844 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ea012ed-a004-4f18-83a4-bd7bb82794a5-kube-api-access-znk6t" (OuterVolumeSpecName: "kube-api-access-znk6t") pod "2ea012ed-a004-4f18-83a4-bd7bb82794a5" (UID: "2ea012ed-a004-4f18-83a4-bd7bb82794a5"). InnerVolumeSpecName "kube-api-access-znk6t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 09:06:08 crc kubenswrapper[5025]: I1007 09:06:08.715205 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ea012ed-a004-4f18-83a4-bd7bb82794a5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2ea012ed-a004-4f18-83a4-bd7bb82794a5" (UID: "2ea012ed-a004-4f18-83a4-bd7bb82794a5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 09:06:08 crc kubenswrapper[5025]: I1007 09:06:08.764489 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znk6t\" (UniqueName: \"kubernetes.io/projected/2ea012ed-a004-4f18-83a4-bd7bb82794a5-kube-api-access-znk6t\") on node \"crc\" DevicePath \"\"" Oct 07 09:06:08 crc kubenswrapper[5025]: I1007 09:06:08.764531 5025 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ea012ed-a004-4f18-83a4-bd7bb82794a5-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 09:06:08 crc kubenswrapper[5025]: I1007 09:06:08.764569 5025 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ea012ed-a004-4f18-83a4-bd7bb82794a5-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 09:06:09 crc kubenswrapper[5025]: I1007 09:06:09.506557 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tvx7v" event={"ID":"2ea012ed-a004-4f18-83a4-bd7bb82794a5","Type":"ContainerDied","Data":"a84727a3d0830ed5a013dff2b905d2ec00e8c75aad093d2b179fc2595981164e"} Oct 07 09:06:09 crc kubenswrapper[5025]: I1007 09:06:09.506617 5025 scope.go:117] "RemoveContainer" containerID="5994acb3778064bc07097f3a2a310da5cd7951a7cab5fc5cb0b88ff4c6536d08" Oct 07 09:06:09 crc kubenswrapper[5025]: I1007 09:06:09.506671 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tvx7v" Oct 07 09:06:09 crc kubenswrapper[5025]: I1007 09:06:09.532862 5025 scope.go:117] "RemoveContainer" containerID="8d62ed3d66986777a948d8cf15b21a03d7a104154ea2b8420888185ae3e24df3" Oct 07 09:06:09 crc kubenswrapper[5025]: I1007 09:06:09.555352 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tvx7v"] Oct 07 09:06:09 crc kubenswrapper[5025]: I1007 09:06:09.563302 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tvx7v"] Oct 07 09:06:09 crc kubenswrapper[5025]: I1007 09:06:09.567645 5025 scope.go:117] "RemoveContainer" containerID="672240e9c4a54639f5bcb7c38820b8d14fc8903ec82ad67fb18c8ee0ed411141" Oct 07 09:06:09 crc kubenswrapper[5025]: I1007 09:06:09.922756 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ea012ed-a004-4f18-83a4-bd7bb82794a5" path="/var/lib/kubelet/pods/2ea012ed-a004-4f18-83a4-bd7bb82794a5/volumes" Oct 07 09:07:25 crc kubenswrapper[5025]: I1007 09:07:25.933904 5025 patch_prober.go:28] interesting pod/machine-config-daemon-2dj2t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 09:07:25 crc kubenswrapper[5025]: I1007 09:07:25.934421 5025 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 09:07:55 crc kubenswrapper[5025]: I1007 09:07:55.934287 5025 patch_prober.go:28] interesting pod/machine-config-daemon-2dj2t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 09:07:55 crc kubenswrapper[5025]: I1007 09:07:55.934907 5025 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 09:08:25 crc kubenswrapper[5025]: I1007 09:08:25.936809 5025 patch_prober.go:28] interesting pod/machine-config-daemon-2dj2t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 09:08:25 crc kubenswrapper[5025]: I1007 09:08:25.937465 5025 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 09:08:25 crc kubenswrapper[5025]: I1007 09:08:25.937550 5025 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" Oct 07 09:08:25 crc kubenswrapper[5025]: I1007 09:08:25.939905 5025 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cd39ce4892b6706e193e216b5c562971ededa4bd53a3ddd3b48f8014398f9f87"} pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 09:08:25 crc kubenswrapper[5025]: I1007 09:08:25.940015 5025 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" containerID="cri-o://cd39ce4892b6706e193e216b5c562971ededa4bd53a3ddd3b48f8014398f9f87" gracePeriod=600 Oct 07 09:08:26 crc kubenswrapper[5025]: E1007 09:08:26.075283 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:08:26 crc kubenswrapper[5025]: I1007 09:08:26.523070 5025 generic.go:334] "Generic (PLEG): container finished" podID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerID="cd39ce4892b6706e193e216b5c562971ededa4bd53a3ddd3b48f8014398f9f87" exitCode=0 Oct 07 09:08:26 crc kubenswrapper[5025]: I1007 09:08:26.523155 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" event={"ID":"b4849c41-22e1-400e-8e11-096da49ef1b2","Type":"ContainerDied","Data":"cd39ce4892b6706e193e216b5c562971ededa4bd53a3ddd3b48f8014398f9f87"} Oct 07 09:08:26 crc kubenswrapper[5025]: I1007 09:08:26.523248 5025 scope.go:117] "RemoveContainer" containerID="fe800c18a8d341f800042ac77b698eed48f45b43ae3d96ec5c15efb400a54864" Oct 07 09:08:26 crc kubenswrapper[5025]: I1007 09:08:26.524012 5025 scope.go:117] "RemoveContainer" containerID="cd39ce4892b6706e193e216b5c562971ededa4bd53a3ddd3b48f8014398f9f87" Oct 07 09:08:26 crc kubenswrapper[5025]: E1007 09:08:26.524259 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:08:41 crc kubenswrapper[5025]: I1007 09:08:41.914794 5025 scope.go:117] "RemoveContainer" containerID="cd39ce4892b6706e193e216b5c562971ededa4bd53a3ddd3b48f8014398f9f87" Oct 07 09:08:41 crc kubenswrapper[5025]: E1007 09:08:41.915610 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:08:56 crc kubenswrapper[5025]: I1007 09:08:56.915322 5025 scope.go:117] "RemoveContainer" containerID="cd39ce4892b6706e193e216b5c562971ededa4bd53a3ddd3b48f8014398f9f87" Oct 07 09:08:56 crc kubenswrapper[5025]: E1007 09:08:56.916414 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:09:11 crc kubenswrapper[5025]: I1007 09:09:11.914976 5025 scope.go:117] "RemoveContainer" containerID="cd39ce4892b6706e193e216b5c562971ededa4bd53a3ddd3b48f8014398f9f87" Oct 07 09:09:11 crc kubenswrapper[5025]: E1007 09:09:11.915799 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:09:25 crc kubenswrapper[5025]: I1007 09:09:25.915048 5025 scope.go:117] "RemoveContainer" containerID="cd39ce4892b6706e193e216b5c562971ededa4bd53a3ddd3b48f8014398f9f87" Oct 07 09:09:25 crc kubenswrapper[5025]: E1007 09:09:25.916164 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:09:37 crc kubenswrapper[5025]: I1007 09:09:37.915234 5025 scope.go:117] "RemoveContainer" containerID="cd39ce4892b6706e193e216b5c562971ededa4bd53a3ddd3b48f8014398f9f87" Oct 07 09:09:37 crc kubenswrapper[5025]: E1007 09:09:37.916082 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:09:52 crc kubenswrapper[5025]: I1007 09:09:52.915226 5025 scope.go:117] "RemoveContainer" containerID="cd39ce4892b6706e193e216b5c562971ededa4bd53a3ddd3b48f8014398f9f87" Oct 07 09:09:52 crc kubenswrapper[5025]: E1007 09:09:52.916049 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:10:04 crc kubenswrapper[5025]: I1007 09:10:04.915283 5025 scope.go:117] "RemoveContainer" containerID="cd39ce4892b6706e193e216b5c562971ededa4bd53a3ddd3b48f8014398f9f87" Oct 07 09:10:04 crc kubenswrapper[5025]: E1007 09:10:04.916226 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:10:15 crc kubenswrapper[5025]: I1007 09:10:15.922277 5025 scope.go:117] "RemoveContainer" containerID="cd39ce4892b6706e193e216b5c562971ededa4bd53a3ddd3b48f8014398f9f87" Oct 07 09:10:15 crc kubenswrapper[5025]: E1007 09:10:15.923924 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:10:30 crc kubenswrapper[5025]: I1007 09:10:30.915142 5025 scope.go:117] "RemoveContainer" containerID="cd39ce4892b6706e193e216b5c562971ededa4bd53a3ddd3b48f8014398f9f87" Oct 07 09:10:30 crc kubenswrapper[5025]: E1007 09:10:30.916891 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:10:45 crc kubenswrapper[5025]: I1007 09:10:45.915410 5025 scope.go:117] "RemoveContainer" containerID="cd39ce4892b6706e193e216b5c562971ededa4bd53a3ddd3b48f8014398f9f87" Oct 07 09:10:45 crc kubenswrapper[5025]: E1007 09:10:45.917252 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:10:57 crc kubenswrapper[5025]: I1007 09:10:57.914711 5025 scope.go:117] "RemoveContainer" containerID="cd39ce4892b6706e193e216b5c562971ededa4bd53a3ddd3b48f8014398f9f87" Oct 07 09:10:57 crc kubenswrapper[5025]: E1007 09:10:57.915582 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:11:10 crc kubenswrapper[5025]: I1007 09:11:10.914790 5025 scope.go:117] "RemoveContainer" containerID="cd39ce4892b6706e193e216b5c562971ededa4bd53a3ddd3b48f8014398f9f87" Oct 07 09:11:10 crc kubenswrapper[5025]: E1007 09:11:10.916029 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:11:25 crc kubenswrapper[5025]: I1007 09:11:25.915361 5025 scope.go:117] "RemoveContainer" containerID="cd39ce4892b6706e193e216b5c562971ededa4bd53a3ddd3b48f8014398f9f87" Oct 07 09:11:25 crc kubenswrapper[5025]: E1007 09:11:25.916180 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:11:27 crc kubenswrapper[5025]: I1007 09:11:27.066001 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9lg2n"] Oct 07 09:11:27 crc kubenswrapper[5025]: E1007 09:11:27.066707 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ea012ed-a004-4f18-83a4-bd7bb82794a5" containerName="registry-server" Oct 07 09:11:27 crc kubenswrapper[5025]: I1007 09:11:27.066723 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ea012ed-a004-4f18-83a4-bd7bb82794a5" containerName="registry-server" Oct 07 09:11:27 crc kubenswrapper[5025]: E1007 09:11:27.066740 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ea012ed-a004-4f18-83a4-bd7bb82794a5" containerName="extract-content" Oct 07 09:11:27 crc kubenswrapper[5025]: I1007 09:11:27.066748 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ea012ed-a004-4f18-83a4-bd7bb82794a5" containerName="extract-content" Oct 
07 09:11:27 crc kubenswrapper[5025]: E1007 09:11:27.066770 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ea012ed-a004-4f18-83a4-bd7bb82794a5" containerName="extract-utilities" Oct 07 09:11:27 crc kubenswrapper[5025]: I1007 09:11:27.066778 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ea012ed-a004-4f18-83a4-bd7bb82794a5" containerName="extract-utilities" Oct 07 09:11:27 crc kubenswrapper[5025]: I1007 09:11:27.066939 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ea012ed-a004-4f18-83a4-bd7bb82794a5" containerName="registry-server" Oct 07 09:11:27 crc kubenswrapper[5025]: I1007 09:11:27.068146 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9lg2n" Oct 07 09:11:27 crc kubenswrapper[5025]: I1007 09:11:27.086109 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9lg2n"] Oct 07 09:11:27 crc kubenswrapper[5025]: I1007 09:11:27.206655 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89n8n\" (UniqueName: \"kubernetes.io/projected/8706223a-56af-47b4-b2e6-d42092918f29-kube-api-access-89n8n\") pod \"certified-operators-9lg2n\" (UID: \"8706223a-56af-47b4-b2e6-d42092918f29\") " pod="openshift-marketplace/certified-operators-9lg2n" Oct 07 09:11:27 crc kubenswrapper[5025]: I1007 09:11:27.206742 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8706223a-56af-47b4-b2e6-d42092918f29-catalog-content\") pod \"certified-operators-9lg2n\" (UID: \"8706223a-56af-47b4-b2e6-d42092918f29\") " pod="openshift-marketplace/certified-operators-9lg2n" Oct 07 09:11:27 crc kubenswrapper[5025]: I1007 09:11:27.206810 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/8706223a-56af-47b4-b2e6-d42092918f29-utilities\") pod \"certified-operators-9lg2n\" (UID: \"8706223a-56af-47b4-b2e6-d42092918f29\") " pod="openshift-marketplace/certified-operators-9lg2n" Oct 07 09:11:27 crc kubenswrapper[5025]: I1007 09:11:27.308008 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8706223a-56af-47b4-b2e6-d42092918f29-catalog-content\") pod \"certified-operators-9lg2n\" (UID: \"8706223a-56af-47b4-b2e6-d42092918f29\") " pod="openshift-marketplace/certified-operators-9lg2n" Oct 07 09:11:27 crc kubenswrapper[5025]: I1007 09:11:27.308092 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8706223a-56af-47b4-b2e6-d42092918f29-utilities\") pod \"certified-operators-9lg2n\" (UID: \"8706223a-56af-47b4-b2e6-d42092918f29\") " pod="openshift-marketplace/certified-operators-9lg2n" Oct 07 09:11:27 crc kubenswrapper[5025]: I1007 09:11:27.308150 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89n8n\" (UniqueName: \"kubernetes.io/projected/8706223a-56af-47b4-b2e6-d42092918f29-kube-api-access-89n8n\") pod \"certified-operators-9lg2n\" (UID: \"8706223a-56af-47b4-b2e6-d42092918f29\") " pod="openshift-marketplace/certified-operators-9lg2n" Oct 07 09:11:27 crc kubenswrapper[5025]: I1007 09:11:27.308573 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8706223a-56af-47b4-b2e6-d42092918f29-catalog-content\") pod \"certified-operators-9lg2n\" (UID: \"8706223a-56af-47b4-b2e6-d42092918f29\") " pod="openshift-marketplace/certified-operators-9lg2n" Oct 07 09:11:27 crc kubenswrapper[5025]: I1007 09:11:27.308666 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/8706223a-56af-47b4-b2e6-d42092918f29-utilities\") pod \"certified-operators-9lg2n\" (UID: \"8706223a-56af-47b4-b2e6-d42092918f29\") " pod="openshift-marketplace/certified-operators-9lg2n" Oct 07 09:11:27 crc kubenswrapper[5025]: I1007 09:11:27.341801 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89n8n\" (UniqueName: \"kubernetes.io/projected/8706223a-56af-47b4-b2e6-d42092918f29-kube-api-access-89n8n\") pod \"certified-operators-9lg2n\" (UID: \"8706223a-56af-47b4-b2e6-d42092918f29\") " pod="openshift-marketplace/certified-operators-9lg2n" Oct 07 09:11:27 crc kubenswrapper[5025]: I1007 09:11:27.391899 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9lg2n" Oct 07 09:11:27 crc kubenswrapper[5025]: I1007 09:11:27.952910 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9lg2n"] Oct 07 09:11:28 crc kubenswrapper[5025]: I1007 09:11:28.926775 5025 generic.go:334] "Generic (PLEG): container finished" podID="8706223a-56af-47b4-b2e6-d42092918f29" containerID="2840a24ca06d422d78330df3d4aed55fa2451e3d496ef8524bdb41cdaa121ae3" exitCode=0 Oct 07 09:11:28 crc kubenswrapper[5025]: I1007 09:11:28.926823 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9lg2n" event={"ID":"8706223a-56af-47b4-b2e6-d42092918f29","Type":"ContainerDied","Data":"2840a24ca06d422d78330df3d4aed55fa2451e3d496ef8524bdb41cdaa121ae3"} Oct 07 09:11:28 crc kubenswrapper[5025]: I1007 09:11:28.927213 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9lg2n" event={"ID":"8706223a-56af-47b4-b2e6-d42092918f29","Type":"ContainerStarted","Data":"6d369c4f4c9689a0cd2b44100fac9559f7aeb38f132c2ad0f1b64d0b1eda2146"} Oct 07 09:11:28 crc kubenswrapper[5025]: I1007 09:11:28.929205 5025 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Oct 07 09:11:30 crc kubenswrapper[5025]: I1007 09:11:30.946092 5025 generic.go:334] "Generic (PLEG): container finished" podID="8706223a-56af-47b4-b2e6-d42092918f29" containerID="cbb023f4c3a65560bc80ff3966e3338fc0258763e64952d45a80c04b91047a8a" exitCode=0 Oct 07 09:11:30 crc kubenswrapper[5025]: I1007 09:11:30.946188 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9lg2n" event={"ID":"8706223a-56af-47b4-b2e6-d42092918f29","Type":"ContainerDied","Data":"cbb023f4c3a65560bc80ff3966e3338fc0258763e64952d45a80c04b91047a8a"} Oct 07 09:11:31 crc kubenswrapper[5025]: I1007 09:11:31.959200 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9lg2n" event={"ID":"8706223a-56af-47b4-b2e6-d42092918f29","Type":"ContainerStarted","Data":"1552a97c785f25dafc4c25a063608332858614349b34afb5b32ed12e7909d26f"} Oct 07 09:11:31 crc kubenswrapper[5025]: I1007 09:11:31.985336 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9lg2n" podStartSLOduration=2.402942622 podStartE2EDuration="4.985301562s" podCreationTimestamp="2025-10-07 09:11:27 +0000 UTC" firstStartedPulling="2025-10-07 09:11:28.928821959 +0000 UTC m=+3295.738136103" lastFinishedPulling="2025-10-07 09:11:31.511180899 +0000 UTC m=+3298.320495043" observedRunningTime="2025-10-07 09:11:31.982968148 +0000 UTC m=+3298.792282302" watchObservedRunningTime="2025-10-07 09:11:31.985301562 +0000 UTC m=+3298.794615706" Oct 07 09:11:37 crc kubenswrapper[5025]: I1007 09:11:37.392802 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9lg2n" Oct 07 09:11:37 crc kubenswrapper[5025]: I1007 09:11:37.393183 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9lg2n" Oct 07 09:11:37 crc 
kubenswrapper[5025]: I1007 09:11:37.434162 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9lg2n" Oct 07 09:11:38 crc kubenswrapper[5025]: I1007 09:11:38.057965 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9lg2n" Oct 07 09:11:38 crc kubenswrapper[5025]: I1007 09:11:38.119572 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9lg2n"] Oct 07 09:11:40 crc kubenswrapper[5025]: I1007 09:11:40.031193 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9lg2n" podUID="8706223a-56af-47b4-b2e6-d42092918f29" containerName="registry-server" containerID="cri-o://1552a97c785f25dafc4c25a063608332858614349b34afb5b32ed12e7909d26f" gracePeriod=2 Oct 07 09:11:40 crc kubenswrapper[5025]: I1007 09:11:40.404492 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9lg2n" Oct 07 09:11:40 crc kubenswrapper[5025]: I1007 09:11:40.541408 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8706223a-56af-47b4-b2e6-d42092918f29-catalog-content\") pod \"8706223a-56af-47b4-b2e6-d42092918f29\" (UID: \"8706223a-56af-47b4-b2e6-d42092918f29\") " Oct 07 09:11:40 crc kubenswrapper[5025]: I1007 09:11:40.541485 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8706223a-56af-47b4-b2e6-d42092918f29-utilities\") pod \"8706223a-56af-47b4-b2e6-d42092918f29\" (UID: \"8706223a-56af-47b4-b2e6-d42092918f29\") " Oct 07 09:11:40 crc kubenswrapper[5025]: I1007 09:11:40.541524 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89n8n\" (UniqueName: \"kubernetes.io/projected/8706223a-56af-47b4-b2e6-d42092918f29-kube-api-access-89n8n\") pod \"8706223a-56af-47b4-b2e6-d42092918f29\" (UID: \"8706223a-56af-47b4-b2e6-d42092918f29\") " Oct 07 09:11:40 crc kubenswrapper[5025]: I1007 09:11:40.542773 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8706223a-56af-47b4-b2e6-d42092918f29-utilities" (OuterVolumeSpecName: "utilities") pod "8706223a-56af-47b4-b2e6-d42092918f29" (UID: "8706223a-56af-47b4-b2e6-d42092918f29"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 09:11:40 crc kubenswrapper[5025]: I1007 09:11:40.550055 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8706223a-56af-47b4-b2e6-d42092918f29-kube-api-access-89n8n" (OuterVolumeSpecName: "kube-api-access-89n8n") pod "8706223a-56af-47b4-b2e6-d42092918f29" (UID: "8706223a-56af-47b4-b2e6-d42092918f29"). InnerVolumeSpecName "kube-api-access-89n8n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 09:11:40 crc kubenswrapper[5025]: I1007 09:11:40.597757 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8706223a-56af-47b4-b2e6-d42092918f29-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8706223a-56af-47b4-b2e6-d42092918f29" (UID: "8706223a-56af-47b4-b2e6-d42092918f29"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 09:11:40 crc kubenswrapper[5025]: I1007 09:11:40.643390 5025 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8706223a-56af-47b4-b2e6-d42092918f29-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 09:11:40 crc kubenswrapper[5025]: I1007 09:11:40.643453 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89n8n\" (UniqueName: \"kubernetes.io/projected/8706223a-56af-47b4-b2e6-d42092918f29-kube-api-access-89n8n\") on node \"crc\" DevicePath \"\"" Oct 07 09:11:40 crc kubenswrapper[5025]: I1007 09:11:40.643465 5025 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8706223a-56af-47b4-b2e6-d42092918f29-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 09:11:40 crc kubenswrapper[5025]: I1007 09:11:40.915849 5025 scope.go:117] "RemoveContainer" containerID="cd39ce4892b6706e193e216b5c562971ededa4bd53a3ddd3b48f8014398f9f87" Oct 07 09:11:40 crc kubenswrapper[5025]: E1007 09:11:40.916261 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:11:41 
crc kubenswrapper[5025]: I1007 09:11:41.041090 5025 generic.go:334] "Generic (PLEG): container finished" podID="8706223a-56af-47b4-b2e6-d42092918f29" containerID="1552a97c785f25dafc4c25a063608332858614349b34afb5b32ed12e7909d26f" exitCode=0 Oct 07 09:11:41 crc kubenswrapper[5025]: I1007 09:11:41.041218 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9lg2n" event={"ID":"8706223a-56af-47b4-b2e6-d42092918f29","Type":"ContainerDied","Data":"1552a97c785f25dafc4c25a063608332858614349b34afb5b32ed12e7909d26f"} Oct 07 09:11:41 crc kubenswrapper[5025]: I1007 09:11:41.041265 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9lg2n" event={"ID":"8706223a-56af-47b4-b2e6-d42092918f29","Type":"ContainerDied","Data":"6d369c4f4c9689a0cd2b44100fac9559f7aeb38f132c2ad0f1b64d0b1eda2146"} Oct 07 09:11:41 crc kubenswrapper[5025]: I1007 09:11:41.041286 5025 scope.go:117] "RemoveContainer" containerID="1552a97c785f25dafc4c25a063608332858614349b34afb5b32ed12e7909d26f" Oct 07 09:11:41 crc kubenswrapper[5025]: I1007 09:11:41.041292 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9lg2n"
Oct 07 09:11:41 crc kubenswrapper[5025]: I1007 09:11:41.079813 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9lg2n"]
Oct 07 09:11:41 crc kubenswrapper[5025]: I1007 09:11:41.084803 5025 scope.go:117] "RemoveContainer" containerID="cbb023f4c3a65560bc80ff3966e3338fc0258763e64952d45a80c04b91047a8a"
Oct 07 09:11:41 crc kubenswrapper[5025]: I1007 09:11:41.087473 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9lg2n"]
Oct 07 09:11:41 crc kubenswrapper[5025]: I1007 09:11:41.109640 5025 scope.go:117] "RemoveContainer" containerID="2840a24ca06d422d78330df3d4aed55fa2451e3d496ef8524bdb41cdaa121ae3"
Oct 07 09:11:41 crc kubenswrapper[5025]: I1007 09:11:41.136113 5025 scope.go:117] "RemoveContainer" containerID="1552a97c785f25dafc4c25a063608332858614349b34afb5b32ed12e7909d26f"
Oct 07 09:11:41 crc kubenswrapper[5025]: E1007 09:11:41.136857 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1552a97c785f25dafc4c25a063608332858614349b34afb5b32ed12e7909d26f\": container with ID starting with 1552a97c785f25dafc4c25a063608332858614349b34afb5b32ed12e7909d26f not found: ID does not exist" containerID="1552a97c785f25dafc4c25a063608332858614349b34afb5b32ed12e7909d26f"
Oct 07 09:11:41 crc kubenswrapper[5025]: I1007 09:11:41.136940 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1552a97c785f25dafc4c25a063608332858614349b34afb5b32ed12e7909d26f"} err="failed to get container status \"1552a97c785f25dafc4c25a063608332858614349b34afb5b32ed12e7909d26f\": rpc error: code = NotFound desc = could not find container \"1552a97c785f25dafc4c25a063608332858614349b34afb5b32ed12e7909d26f\": container with ID starting with 1552a97c785f25dafc4c25a063608332858614349b34afb5b32ed12e7909d26f not found: ID does not exist"
Oct 07 09:11:41 crc kubenswrapper[5025]: I1007 09:11:41.136991 5025 scope.go:117] "RemoveContainer" containerID="cbb023f4c3a65560bc80ff3966e3338fc0258763e64952d45a80c04b91047a8a"
Oct 07 09:11:41 crc kubenswrapper[5025]: E1007 09:11:41.137676 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbb023f4c3a65560bc80ff3966e3338fc0258763e64952d45a80c04b91047a8a\": container with ID starting with cbb023f4c3a65560bc80ff3966e3338fc0258763e64952d45a80c04b91047a8a not found: ID does not exist" containerID="cbb023f4c3a65560bc80ff3966e3338fc0258763e64952d45a80c04b91047a8a"
Oct 07 09:11:41 crc kubenswrapper[5025]: I1007 09:11:41.137764 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbb023f4c3a65560bc80ff3966e3338fc0258763e64952d45a80c04b91047a8a"} err="failed to get container status \"cbb023f4c3a65560bc80ff3966e3338fc0258763e64952d45a80c04b91047a8a\": rpc error: code = NotFound desc = could not find container \"cbb023f4c3a65560bc80ff3966e3338fc0258763e64952d45a80c04b91047a8a\": container with ID starting with cbb023f4c3a65560bc80ff3966e3338fc0258763e64952d45a80c04b91047a8a not found: ID does not exist"
Oct 07 09:11:41 crc kubenswrapper[5025]: I1007 09:11:41.137803 5025 scope.go:117] "RemoveContainer" containerID="2840a24ca06d422d78330df3d4aed55fa2451e3d496ef8524bdb41cdaa121ae3"
Oct 07 09:11:41 crc kubenswrapper[5025]: E1007 09:11:41.138397 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2840a24ca06d422d78330df3d4aed55fa2451e3d496ef8524bdb41cdaa121ae3\": container with ID starting with 2840a24ca06d422d78330df3d4aed55fa2451e3d496ef8524bdb41cdaa121ae3 not found: ID does not exist" containerID="2840a24ca06d422d78330df3d4aed55fa2451e3d496ef8524bdb41cdaa121ae3"
Oct 07 09:11:41 crc kubenswrapper[5025]: I1007 09:11:41.138753 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2840a24ca06d422d78330df3d4aed55fa2451e3d496ef8524bdb41cdaa121ae3"} err="failed to get container status \"2840a24ca06d422d78330df3d4aed55fa2451e3d496ef8524bdb41cdaa121ae3\": rpc error: code = NotFound desc = could not find container \"2840a24ca06d422d78330df3d4aed55fa2451e3d496ef8524bdb41cdaa121ae3\": container with ID starting with 2840a24ca06d422d78330df3d4aed55fa2451e3d496ef8524bdb41cdaa121ae3 not found: ID does not exist"
Oct 07 09:11:41 crc kubenswrapper[5025]: I1007 09:11:41.924729 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8706223a-56af-47b4-b2e6-d42092918f29" path="/var/lib/kubelet/pods/8706223a-56af-47b4-b2e6-d42092918f29/volumes"
Oct 07 09:11:53 crc kubenswrapper[5025]: I1007 09:11:53.919852 5025 scope.go:117] "RemoveContainer" containerID="cd39ce4892b6706e193e216b5c562971ededa4bd53a3ddd3b48f8014398f9f87"
Oct 07 09:11:53 crc kubenswrapper[5025]: E1007 09:11:53.920921 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2"
Oct 07 09:12:05 crc kubenswrapper[5025]: I1007 09:12:05.915416 5025 scope.go:117] "RemoveContainer" containerID="cd39ce4892b6706e193e216b5c562971ededa4bd53a3ddd3b48f8014398f9f87"
Oct 07 09:12:05 crc kubenswrapper[5025]: E1007 09:12:05.916405 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2"
Oct 07 09:12:18 crc kubenswrapper[5025]: I1007 09:12:18.914531 5025 scope.go:117] "RemoveContainer" containerID="cd39ce4892b6706e193e216b5c562971ededa4bd53a3ddd3b48f8014398f9f87"
Oct 07 09:12:18 crc kubenswrapper[5025]: E1007 09:12:18.915462 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2"
Oct 07 09:12:29 crc kubenswrapper[5025]: I1007 09:12:29.915182 5025 scope.go:117] "RemoveContainer" containerID="cd39ce4892b6706e193e216b5c562971ededa4bd53a3ddd3b48f8014398f9f87"
Oct 07 09:12:29 crc kubenswrapper[5025]: E1007 09:12:29.916007 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2"
Oct 07 09:12:42 crc kubenswrapper[5025]: I1007 09:12:42.914645 5025 scope.go:117] "RemoveContainer" containerID="cd39ce4892b6706e193e216b5c562971ededa4bd53a3ddd3b48f8014398f9f87"
Oct 07 09:12:42 crc kubenswrapper[5025]: E1007 09:12:42.915409 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2"
Oct 07 09:12:56 crc kubenswrapper[5025]: I1007 09:12:56.914673 5025 scope.go:117] "RemoveContainer" containerID="cd39ce4892b6706e193e216b5c562971ededa4bd53a3ddd3b48f8014398f9f87"
Oct 07 09:12:56 crc kubenswrapper[5025]: E1007 09:12:56.916067 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2"
Oct 07 09:13:11 crc kubenswrapper[5025]: I1007 09:13:11.915338 5025 scope.go:117] "RemoveContainer" containerID="cd39ce4892b6706e193e216b5c562971ededa4bd53a3ddd3b48f8014398f9f87"
Oct 07 09:13:11 crc kubenswrapper[5025]: E1007 09:13:11.917085 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2"
Oct 07 09:13:22 crc kubenswrapper[5025]: I1007 09:13:22.915144 5025 scope.go:117] "RemoveContainer" containerID="cd39ce4892b6706e193e216b5c562971ededa4bd53a3ddd3b48f8014398f9f87"
Oct 07 09:13:22 crc kubenswrapper[5025]: E1007 09:13:22.915884 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2"
Oct 07 09:13:35 crc kubenswrapper[5025]: I1007 09:13:35.914081 5025 scope.go:117] "RemoveContainer" containerID="cd39ce4892b6706e193e216b5c562971ededa4bd53a3ddd3b48f8014398f9f87"
Oct 07 09:13:36 crc kubenswrapper[5025]: I1007 09:13:36.927450 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" event={"ID":"b4849c41-22e1-400e-8e11-096da49ef1b2","Type":"ContainerStarted","Data":"b5abd8f3ab0bf8a06bd24a57352f13563a3f5b1e22b0a7f93ba506beef5c9513"}
Oct 07 09:13:37 crc kubenswrapper[5025]: I1007 09:13:37.504684 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nk7nl"]
Oct 07 09:13:37 crc kubenswrapper[5025]: E1007 09:13:37.505295 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8706223a-56af-47b4-b2e6-d42092918f29" containerName="extract-content"
Oct 07 09:13:37 crc kubenswrapper[5025]: I1007 09:13:37.505314 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="8706223a-56af-47b4-b2e6-d42092918f29" containerName="extract-content"
Oct 07 09:13:37 crc kubenswrapper[5025]: E1007 09:13:37.505326 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8706223a-56af-47b4-b2e6-d42092918f29" containerName="registry-server"
Oct 07 09:13:37 crc kubenswrapper[5025]: I1007 09:13:37.505332 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="8706223a-56af-47b4-b2e6-d42092918f29" containerName="registry-server"
Oct 07 09:13:37 crc kubenswrapper[5025]: E1007 09:13:37.505343 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8706223a-56af-47b4-b2e6-d42092918f29" containerName="extract-utilities"
Oct 07 09:13:37 crc kubenswrapper[5025]: I1007 09:13:37.505349 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="8706223a-56af-47b4-b2e6-d42092918f29" containerName="extract-utilities"
Oct 07 09:13:37 crc kubenswrapper[5025]: I1007 09:13:37.505485 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="8706223a-56af-47b4-b2e6-d42092918f29" containerName="registry-server"
Oct 07 09:13:37 crc kubenswrapper[5025]: I1007 09:13:37.506530 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nk7nl"
Oct 07 09:13:37 crc kubenswrapper[5025]: I1007 09:13:37.515234 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nk7nl"]
Oct 07 09:13:37 crc kubenswrapper[5025]: I1007 09:13:37.598345 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3cfea13-cc95-4e23-a7ad-ef223b2d65c0-catalog-content\") pod \"redhat-marketplace-nk7nl\" (UID: \"e3cfea13-cc95-4e23-a7ad-ef223b2d65c0\") " pod="openshift-marketplace/redhat-marketplace-nk7nl"
Oct 07 09:13:37 crc kubenswrapper[5025]: I1007 09:13:37.598456 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btgj2\" (UniqueName: \"kubernetes.io/projected/e3cfea13-cc95-4e23-a7ad-ef223b2d65c0-kube-api-access-btgj2\") pod \"redhat-marketplace-nk7nl\" (UID: \"e3cfea13-cc95-4e23-a7ad-ef223b2d65c0\") " pod="openshift-marketplace/redhat-marketplace-nk7nl"
Oct 07 09:13:37 crc kubenswrapper[5025]: I1007 09:13:37.598533 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3cfea13-cc95-4e23-a7ad-ef223b2d65c0-utilities\") pod \"redhat-marketplace-nk7nl\" (UID: \"e3cfea13-cc95-4e23-a7ad-ef223b2d65c0\") " pod="openshift-marketplace/redhat-marketplace-nk7nl"
Oct 07 09:13:37 crc kubenswrapper[5025]: I1007 09:13:37.699327 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3cfea13-cc95-4e23-a7ad-ef223b2d65c0-catalog-content\") pod \"redhat-marketplace-nk7nl\" (UID: \"e3cfea13-cc95-4e23-a7ad-ef223b2d65c0\") " pod="openshift-marketplace/redhat-marketplace-nk7nl"
Oct 07 09:13:37 crc kubenswrapper[5025]: I1007 09:13:37.699623 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btgj2\" (UniqueName: \"kubernetes.io/projected/e3cfea13-cc95-4e23-a7ad-ef223b2d65c0-kube-api-access-btgj2\") pod \"redhat-marketplace-nk7nl\" (UID: \"e3cfea13-cc95-4e23-a7ad-ef223b2d65c0\") " pod="openshift-marketplace/redhat-marketplace-nk7nl"
Oct 07 09:13:37 crc kubenswrapper[5025]: I1007 09:13:37.699715 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3cfea13-cc95-4e23-a7ad-ef223b2d65c0-utilities\") pod \"redhat-marketplace-nk7nl\" (UID: \"e3cfea13-cc95-4e23-a7ad-ef223b2d65c0\") " pod="openshift-marketplace/redhat-marketplace-nk7nl"
Oct 07 09:13:37 crc kubenswrapper[5025]: I1007 09:13:37.699881 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3cfea13-cc95-4e23-a7ad-ef223b2d65c0-catalog-content\") pod \"redhat-marketplace-nk7nl\" (UID: \"e3cfea13-cc95-4e23-a7ad-ef223b2d65c0\") " pod="openshift-marketplace/redhat-marketplace-nk7nl"
Oct 07 09:13:37 crc kubenswrapper[5025]: I1007 09:13:37.700113 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3cfea13-cc95-4e23-a7ad-ef223b2d65c0-utilities\") pod \"redhat-marketplace-nk7nl\" (UID: \"e3cfea13-cc95-4e23-a7ad-ef223b2d65c0\") " pod="openshift-marketplace/redhat-marketplace-nk7nl"
Oct 07 09:13:37 crc kubenswrapper[5025]: I1007 09:13:37.718017 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btgj2\" (UniqueName: \"kubernetes.io/projected/e3cfea13-cc95-4e23-a7ad-ef223b2d65c0-kube-api-access-btgj2\") pod \"redhat-marketplace-nk7nl\" (UID: \"e3cfea13-cc95-4e23-a7ad-ef223b2d65c0\") " pod="openshift-marketplace/redhat-marketplace-nk7nl"
Oct 07 09:13:37 crc kubenswrapper[5025]: I1007 09:13:37.826846 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nk7nl"
Oct 07 09:13:38 crc kubenswrapper[5025]: I1007 09:13:38.260986 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nk7nl"]
Oct 07 09:13:38 crc kubenswrapper[5025]: I1007 09:13:38.953767 5025 generic.go:334] "Generic (PLEG): container finished" podID="e3cfea13-cc95-4e23-a7ad-ef223b2d65c0" containerID="cd21139f3a9a2c9253f32380294954eeec917c6e2717f9085a3c8869d277d3c8" exitCode=0
Oct 07 09:13:38 crc kubenswrapper[5025]: I1007 09:13:38.953808 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nk7nl" event={"ID":"e3cfea13-cc95-4e23-a7ad-ef223b2d65c0","Type":"ContainerDied","Data":"cd21139f3a9a2c9253f32380294954eeec917c6e2717f9085a3c8869d277d3c8"}
Oct 07 09:13:38 crc kubenswrapper[5025]: I1007 09:13:38.954263 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nk7nl" event={"ID":"e3cfea13-cc95-4e23-a7ad-ef223b2d65c0","Type":"ContainerStarted","Data":"d0723174b8d20040e4f59ce867f1886db6a35ddeeb924154701cb06849e8f579"}
Oct 07 09:13:39 crc kubenswrapper[5025]: I1007 09:13:39.962147 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nk7nl" event={"ID":"e3cfea13-cc95-4e23-a7ad-ef223b2d65c0","Type":"ContainerStarted","Data":"5e90c61a3971353226ae64853737f3b5fce28c37fe0a797a1b8a204085e743d9"}
Oct 07 09:13:40 crc kubenswrapper[5025]: I1007 09:13:40.971486 5025 generic.go:334] "Generic (PLEG): container finished" podID="e3cfea13-cc95-4e23-a7ad-ef223b2d65c0" containerID="5e90c61a3971353226ae64853737f3b5fce28c37fe0a797a1b8a204085e743d9" exitCode=0
Oct 07 09:13:40 crc kubenswrapper[5025]: I1007 09:13:40.971562 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nk7nl" event={"ID":"e3cfea13-cc95-4e23-a7ad-ef223b2d65c0","Type":"ContainerDied","Data":"5e90c61a3971353226ae64853737f3b5fce28c37fe0a797a1b8a204085e743d9"}
Oct 07 09:13:41 crc kubenswrapper[5025]: I1007 09:13:41.981529 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nk7nl" event={"ID":"e3cfea13-cc95-4e23-a7ad-ef223b2d65c0","Type":"ContainerStarted","Data":"7a14be7e9b3d60d4b1e431ab7826fc5037d9507993a31b9c414cdc3573a23def"}
Oct 07 09:13:42 crc kubenswrapper[5025]: I1007 09:13:42.004057 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nk7nl" podStartSLOduration=2.491345788 podStartE2EDuration="5.004034126s" podCreationTimestamp="2025-10-07 09:13:37 +0000 UTC" firstStartedPulling="2025-10-07 09:13:38.95627314 +0000 UTC m=+3425.765587284" lastFinishedPulling="2025-10-07 09:13:41.468961478 +0000 UTC m=+3428.278275622" observedRunningTime="2025-10-07 09:13:42.00004936 +0000 UTC m=+3428.809363514" watchObservedRunningTime="2025-10-07 09:13:42.004034126 +0000 UTC m=+3428.813348270"
Oct 07 09:13:47 crc kubenswrapper[5025]: I1007 09:13:47.827680 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nk7nl"
Oct 07 09:13:47 crc kubenswrapper[5025]: I1007 09:13:47.828272 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nk7nl"
Oct 07 09:13:47 crc kubenswrapper[5025]: I1007 09:13:47.878125 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nk7nl"
Oct 07 09:13:48 crc kubenswrapper[5025]: I1007 09:13:48.075993 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nk7nl"
Oct 07 09:13:48 crc kubenswrapper[5025]: I1007 09:13:48.124839 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nk7nl"]
Oct 07 09:13:50 crc kubenswrapper[5025]: I1007 09:13:50.049832 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nk7nl" podUID="e3cfea13-cc95-4e23-a7ad-ef223b2d65c0" containerName="registry-server" containerID="cri-o://7a14be7e9b3d60d4b1e431ab7826fc5037d9507993a31b9c414cdc3573a23def" gracePeriod=2
Oct 07 09:13:50 crc kubenswrapper[5025]: I1007 09:13:50.432152 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nk7nl"
Oct 07 09:13:50 crc kubenswrapper[5025]: I1007 09:13:50.485417 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3cfea13-cc95-4e23-a7ad-ef223b2d65c0-catalog-content\") pod \"e3cfea13-cc95-4e23-a7ad-ef223b2d65c0\" (UID: \"e3cfea13-cc95-4e23-a7ad-ef223b2d65c0\") "
Oct 07 09:13:50 crc kubenswrapper[5025]: I1007 09:13:50.485551 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btgj2\" (UniqueName: \"kubernetes.io/projected/e3cfea13-cc95-4e23-a7ad-ef223b2d65c0-kube-api-access-btgj2\") pod \"e3cfea13-cc95-4e23-a7ad-ef223b2d65c0\" (UID: \"e3cfea13-cc95-4e23-a7ad-ef223b2d65c0\") "
Oct 07 09:13:50 crc kubenswrapper[5025]: I1007 09:13:50.485657 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3cfea13-cc95-4e23-a7ad-ef223b2d65c0-utilities\") pod \"e3cfea13-cc95-4e23-a7ad-ef223b2d65c0\" (UID: \"e3cfea13-cc95-4e23-a7ad-ef223b2d65c0\") "
Oct 07 09:13:50 crc kubenswrapper[5025]: I1007 09:13:50.486633 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3cfea13-cc95-4e23-a7ad-ef223b2d65c0-utilities" (OuterVolumeSpecName: "utilities") pod "e3cfea13-cc95-4e23-a7ad-ef223b2d65c0" (UID: "e3cfea13-cc95-4e23-a7ad-ef223b2d65c0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 09:13:50 crc kubenswrapper[5025]: I1007 09:13:50.491426 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3cfea13-cc95-4e23-a7ad-ef223b2d65c0-kube-api-access-btgj2" (OuterVolumeSpecName: "kube-api-access-btgj2") pod "e3cfea13-cc95-4e23-a7ad-ef223b2d65c0" (UID: "e3cfea13-cc95-4e23-a7ad-ef223b2d65c0"). InnerVolumeSpecName "kube-api-access-btgj2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 09:13:50 crc kubenswrapper[5025]: I1007 09:13:50.502882 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3cfea13-cc95-4e23-a7ad-ef223b2d65c0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e3cfea13-cc95-4e23-a7ad-ef223b2d65c0" (UID: "e3cfea13-cc95-4e23-a7ad-ef223b2d65c0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 09:13:50 crc kubenswrapper[5025]: I1007 09:13:50.587211 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btgj2\" (UniqueName: \"kubernetes.io/projected/e3cfea13-cc95-4e23-a7ad-ef223b2d65c0-kube-api-access-btgj2\") on node \"crc\" DevicePath \"\""
Oct 07 09:13:50 crc kubenswrapper[5025]: I1007 09:13:50.587240 5025 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3cfea13-cc95-4e23-a7ad-ef223b2d65c0-utilities\") on node \"crc\" DevicePath \"\""
Oct 07 09:13:50 crc kubenswrapper[5025]: I1007 09:13:50.587249 5025 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3cfea13-cc95-4e23-a7ad-ef223b2d65c0-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 07 09:13:51 crc kubenswrapper[5025]: I1007 09:13:51.062723 5025 generic.go:334] "Generic (PLEG): container finished" podID="e3cfea13-cc95-4e23-a7ad-ef223b2d65c0" containerID="7a14be7e9b3d60d4b1e431ab7826fc5037d9507993a31b9c414cdc3573a23def" exitCode=0
Oct 07 09:13:51 crc kubenswrapper[5025]: I1007 09:13:51.063001 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nk7nl" event={"ID":"e3cfea13-cc95-4e23-a7ad-ef223b2d65c0","Type":"ContainerDied","Data":"7a14be7e9b3d60d4b1e431ab7826fc5037d9507993a31b9c414cdc3573a23def"}
Oct 07 09:13:51 crc kubenswrapper[5025]: I1007 09:13:51.063137 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nk7nl"
Oct 07 09:13:51 crc kubenswrapper[5025]: I1007 09:13:51.063982 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nk7nl" event={"ID":"e3cfea13-cc95-4e23-a7ad-ef223b2d65c0","Type":"ContainerDied","Data":"d0723174b8d20040e4f59ce867f1886db6a35ddeeb924154701cb06849e8f579"}
Oct 07 09:13:51 crc kubenswrapper[5025]: I1007 09:13:51.064068 5025 scope.go:117] "RemoveContainer" containerID="7a14be7e9b3d60d4b1e431ab7826fc5037d9507993a31b9c414cdc3573a23def"
Oct 07 09:13:51 crc kubenswrapper[5025]: I1007 09:13:51.105643 5025 scope.go:117] "RemoveContainer" containerID="5e90c61a3971353226ae64853737f3b5fce28c37fe0a797a1b8a204085e743d9"
Oct 07 09:13:51 crc kubenswrapper[5025]: I1007 09:13:51.115183 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nk7nl"]
Oct 07 09:13:51 crc kubenswrapper[5025]: I1007 09:13:51.125143 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nk7nl"]
Oct 07 09:13:51 crc kubenswrapper[5025]: I1007 09:13:51.139821 5025 scope.go:117] "RemoveContainer" containerID="cd21139f3a9a2c9253f32380294954eeec917c6e2717f9085a3c8869d277d3c8"
Oct 07 09:13:51 crc kubenswrapper[5025]: I1007 09:13:51.161586 5025 scope.go:117] "RemoveContainer" containerID="7a14be7e9b3d60d4b1e431ab7826fc5037d9507993a31b9c414cdc3573a23def"
Oct 07 09:13:51 crc kubenswrapper[5025]: E1007 09:13:51.162260 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a14be7e9b3d60d4b1e431ab7826fc5037d9507993a31b9c414cdc3573a23def\": container with ID starting with 7a14be7e9b3d60d4b1e431ab7826fc5037d9507993a31b9c414cdc3573a23def not found: ID does not exist" containerID="7a14be7e9b3d60d4b1e431ab7826fc5037d9507993a31b9c414cdc3573a23def"
Oct 07 09:13:51 crc kubenswrapper[5025]: I1007 09:13:51.162331 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a14be7e9b3d60d4b1e431ab7826fc5037d9507993a31b9c414cdc3573a23def"} err="failed to get container status \"7a14be7e9b3d60d4b1e431ab7826fc5037d9507993a31b9c414cdc3573a23def\": rpc error: code = NotFound desc = could not find container \"7a14be7e9b3d60d4b1e431ab7826fc5037d9507993a31b9c414cdc3573a23def\": container with ID starting with 7a14be7e9b3d60d4b1e431ab7826fc5037d9507993a31b9c414cdc3573a23def not found: ID does not exist"
Oct 07 09:13:51 crc kubenswrapper[5025]: I1007 09:13:51.162369 5025 scope.go:117] "RemoveContainer" containerID="5e90c61a3971353226ae64853737f3b5fce28c37fe0a797a1b8a204085e743d9"
Oct 07 09:13:51 crc kubenswrapper[5025]: E1007 09:13:51.162813 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e90c61a3971353226ae64853737f3b5fce28c37fe0a797a1b8a204085e743d9\": container with ID starting with 5e90c61a3971353226ae64853737f3b5fce28c37fe0a797a1b8a204085e743d9 not found: ID does not exist" containerID="5e90c61a3971353226ae64853737f3b5fce28c37fe0a797a1b8a204085e743d9"
Oct 07 09:13:51 crc kubenswrapper[5025]: I1007 09:13:51.162853 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e90c61a3971353226ae64853737f3b5fce28c37fe0a797a1b8a204085e743d9"} err="failed to get container status \"5e90c61a3971353226ae64853737f3b5fce28c37fe0a797a1b8a204085e743d9\": rpc error: code = NotFound desc = could not find container \"5e90c61a3971353226ae64853737f3b5fce28c37fe0a797a1b8a204085e743d9\": container with ID starting with 5e90c61a3971353226ae64853737f3b5fce28c37fe0a797a1b8a204085e743d9 not found: ID does not exist"
Oct 07 09:13:51 crc kubenswrapper[5025]: I1007 09:13:51.162878 5025 scope.go:117] "RemoveContainer" containerID="cd21139f3a9a2c9253f32380294954eeec917c6e2717f9085a3c8869d277d3c8"
Oct 07 09:13:51 crc kubenswrapper[5025]: E1007 09:13:51.163328 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd21139f3a9a2c9253f32380294954eeec917c6e2717f9085a3c8869d277d3c8\": container with ID starting with cd21139f3a9a2c9253f32380294954eeec917c6e2717f9085a3c8869d277d3c8 not found: ID does not exist" containerID="cd21139f3a9a2c9253f32380294954eeec917c6e2717f9085a3c8869d277d3c8"
Oct 07 09:13:51 crc kubenswrapper[5025]: I1007 09:13:51.163370 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd21139f3a9a2c9253f32380294954eeec917c6e2717f9085a3c8869d277d3c8"} err="failed to get container status \"cd21139f3a9a2c9253f32380294954eeec917c6e2717f9085a3c8869d277d3c8\": rpc error: code = NotFound desc = could not find container \"cd21139f3a9a2c9253f32380294954eeec917c6e2717f9085a3c8869d277d3c8\": container with ID starting with cd21139f3a9a2c9253f32380294954eeec917c6e2717f9085a3c8869d277d3c8 not found: ID does not exist"
Oct 07 09:13:51 crc kubenswrapper[5025]: I1007 09:13:51.926123 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3cfea13-cc95-4e23-a7ad-ef223b2d65c0" path="/var/lib/kubelet/pods/e3cfea13-cc95-4e23-a7ad-ef223b2d65c0/volumes"
Oct 07 09:15:00 crc kubenswrapper[5025]: I1007 09:15:00.166966 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330475-zrjws"]
Oct 07 09:15:00 crc kubenswrapper[5025]: E1007 09:15:00.167804 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3cfea13-cc95-4e23-a7ad-ef223b2d65c0" containerName="registry-server"
Oct 07 09:15:00 crc kubenswrapper[5025]: I1007 09:15:00.167823 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3cfea13-cc95-4e23-a7ad-ef223b2d65c0" containerName="registry-server"
Oct 07 09:15:00 crc kubenswrapper[5025]: E1007 09:15:00.167842 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3cfea13-cc95-4e23-a7ad-ef223b2d65c0" containerName="extract-utilities"
Oct 07 09:15:00 crc kubenswrapper[5025]: I1007 09:15:00.167850 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3cfea13-cc95-4e23-a7ad-ef223b2d65c0" containerName="extract-utilities"
Oct 07 09:15:00 crc kubenswrapper[5025]: E1007 09:15:00.167873 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3cfea13-cc95-4e23-a7ad-ef223b2d65c0" containerName="extract-content"
Oct 07 09:15:00 crc kubenswrapper[5025]: I1007 09:15:00.167884 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3cfea13-cc95-4e23-a7ad-ef223b2d65c0" containerName="extract-content"
Oct 07 09:15:00 crc kubenswrapper[5025]: I1007 09:15:00.168072 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3cfea13-cc95-4e23-a7ad-ef223b2d65c0" containerName="registry-server"
Oct 07 09:15:00 crc kubenswrapper[5025]: I1007 09:15:00.168865 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330475-zrjws"
Oct 07 09:15:00 crc kubenswrapper[5025]: I1007 09:15:00.171051 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 07 09:15:00 crc kubenswrapper[5025]: I1007 09:15:00.171219 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 07 09:15:00 crc kubenswrapper[5025]: I1007 09:15:00.178593 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330475-zrjws"]
Oct 07 09:15:00 crc kubenswrapper[5025]: I1007 09:15:00.191521 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d30ace14-d87f-428d-97c2-16f720c2e6de-secret-volume\") pod \"collect-profiles-29330475-zrjws\" (UID: \"d30ace14-d87f-428d-97c2-16f720c2e6de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330475-zrjws"
Oct 07 09:15:00 crc kubenswrapper[5025]: I1007 09:15:00.191666 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d30ace14-d87f-428d-97c2-16f720c2e6de-config-volume\") pod \"collect-profiles-29330475-zrjws\" (UID: \"d30ace14-d87f-428d-97c2-16f720c2e6de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330475-zrjws"
Oct 07 09:15:00 crc kubenswrapper[5025]: I1007 09:15:00.191732 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdkqq\" (UniqueName: \"kubernetes.io/projected/d30ace14-d87f-428d-97c2-16f720c2e6de-kube-api-access-mdkqq\") pod \"collect-profiles-29330475-zrjws\" (UID: \"d30ace14-d87f-428d-97c2-16f720c2e6de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330475-zrjws"
Oct 07 09:15:00 crc kubenswrapper[5025]: I1007 09:15:00.292521 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdkqq\" (UniqueName: \"kubernetes.io/projected/d30ace14-d87f-428d-97c2-16f720c2e6de-kube-api-access-mdkqq\") pod \"collect-profiles-29330475-zrjws\" (UID: \"d30ace14-d87f-428d-97c2-16f720c2e6de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330475-zrjws"
Oct 07 09:15:00 crc kubenswrapper[5025]: I1007 09:15:00.292725 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d30ace14-d87f-428d-97c2-16f720c2e6de-secret-volume\") pod \"collect-profiles-29330475-zrjws\" (UID: \"d30ace14-d87f-428d-97c2-16f720c2e6de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330475-zrjws"
Oct 07 09:15:00 crc kubenswrapper[5025]: I1007 09:15:00.292801 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d30ace14-d87f-428d-97c2-16f720c2e6de-config-volume\") pod \"collect-profiles-29330475-zrjws\" (UID: \"d30ace14-d87f-428d-97c2-16f720c2e6de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330475-zrjws"
Oct 07 09:15:00 crc kubenswrapper[5025]: I1007 09:15:00.293912 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d30ace14-d87f-428d-97c2-16f720c2e6de-config-volume\") pod \"collect-profiles-29330475-zrjws\" (UID: \"d30ace14-d87f-428d-97c2-16f720c2e6de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330475-zrjws"
Oct 07 09:15:00 crc kubenswrapper[5025]: I1007 09:15:00.302734 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d30ace14-d87f-428d-97c2-16f720c2e6de-secret-volume\") pod \"collect-profiles-29330475-zrjws\" (UID: \"d30ace14-d87f-428d-97c2-16f720c2e6de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330475-zrjws"
Oct 07 09:15:00 crc kubenswrapper[5025]: I1007 09:15:00.311562 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdkqq\" (UniqueName: \"kubernetes.io/projected/d30ace14-d87f-428d-97c2-16f720c2e6de-kube-api-access-mdkqq\") pod \"collect-profiles-29330475-zrjws\" (UID: \"d30ace14-d87f-428d-97c2-16f720c2e6de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330475-zrjws"
Oct 07 09:15:00 crc kubenswrapper[5025]: I1007 09:15:00.494711 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330475-zrjws"
Oct 07 09:15:00 crc kubenswrapper[5025]: I1007 09:15:00.935066 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330475-zrjws"]
Oct 07 09:15:01 crc kubenswrapper[5025]: I1007 09:15:01.656842 5025 generic.go:334] "Generic (PLEG): container finished" podID="d30ace14-d87f-428d-97c2-16f720c2e6de" containerID="af50bca7b747e34b260be78cf66593e4280a01e0597b103cc47601e5f96923f5" exitCode=0
Oct 07 09:15:01 crc kubenswrapper[5025]: I1007 09:15:01.656908 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330475-zrjws" event={"ID":"d30ace14-d87f-428d-97c2-16f720c2e6de","Type":"ContainerDied","Data":"af50bca7b747e34b260be78cf66593e4280a01e0597b103cc47601e5f96923f5"}
Oct 07 09:15:01 crc kubenswrapper[5025]: I1007 09:15:01.658266 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330475-zrjws" event={"ID":"d30ace14-d87f-428d-97c2-16f720c2e6de","Type":"ContainerStarted","Data":"2840ca6be39db1dd94d1e6678e29affcc53238fe016b16fc2f9336194ebf31f7"}
Oct 07 09:15:02 crc kubenswrapper[5025]: I1007 09:15:02.931908 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330475-zrjws"
Oct 07 09:15:03 crc kubenswrapper[5025]: I1007 09:15:03.037573 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdkqq\" (UniqueName: \"kubernetes.io/projected/d30ace14-d87f-428d-97c2-16f720c2e6de-kube-api-access-mdkqq\") pod \"d30ace14-d87f-428d-97c2-16f720c2e6de\" (UID: \"d30ace14-d87f-428d-97c2-16f720c2e6de\") "
Oct 07 09:15:03 crc kubenswrapper[5025]: I1007 09:15:03.038223 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d30ace14-d87f-428d-97c2-16f720c2e6de-secret-volume\") pod \"d30ace14-d87f-428d-97c2-16f720c2e6de\" (UID: \"d30ace14-d87f-428d-97c2-16f720c2e6de\") "
Oct 07 09:15:03 crc kubenswrapper[5025]: I1007 09:15:03.038422 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d30ace14-d87f-428d-97c2-16f720c2e6de-config-volume\") pod \"d30ace14-d87f-428d-97c2-16f720c2e6de\" (UID: \"d30ace14-d87f-428d-97c2-16f720c2e6de\") "
Oct 07 09:15:03 crc kubenswrapper[5025]: I1007 09:15:03.039417 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d30ace14-d87f-428d-97c2-16f720c2e6de-config-volume" (OuterVolumeSpecName: "config-volume") pod "d30ace14-d87f-428d-97c2-16f720c2e6de" (UID: "d30ace14-d87f-428d-97c2-16f720c2e6de"). InnerVolumeSpecName "config-volume".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 09:15:03 crc kubenswrapper[5025]: I1007 09:15:03.040150 5025 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d30ace14-d87f-428d-97c2-16f720c2e6de-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 09:15:03 crc kubenswrapper[5025]: I1007 09:15:03.044390 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d30ace14-d87f-428d-97c2-16f720c2e6de-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d30ace14-d87f-428d-97c2-16f720c2e6de" (UID: "d30ace14-d87f-428d-97c2-16f720c2e6de"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 09:15:03 crc kubenswrapper[5025]: I1007 09:15:03.044946 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d30ace14-d87f-428d-97c2-16f720c2e6de-kube-api-access-mdkqq" (OuterVolumeSpecName: "kube-api-access-mdkqq") pod "d30ace14-d87f-428d-97c2-16f720c2e6de" (UID: "d30ace14-d87f-428d-97c2-16f720c2e6de"). InnerVolumeSpecName "kube-api-access-mdkqq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 09:15:03 crc kubenswrapper[5025]: I1007 09:15:03.142379 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdkqq\" (UniqueName: \"kubernetes.io/projected/d30ace14-d87f-428d-97c2-16f720c2e6de-kube-api-access-mdkqq\") on node \"crc\" DevicePath \"\"" Oct 07 09:15:03 crc kubenswrapper[5025]: I1007 09:15:03.142425 5025 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d30ace14-d87f-428d-97c2-16f720c2e6de-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 07 09:15:03 crc kubenswrapper[5025]: I1007 09:15:03.672779 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330475-zrjws" event={"ID":"d30ace14-d87f-428d-97c2-16f720c2e6de","Type":"ContainerDied","Data":"2840ca6be39db1dd94d1e6678e29affcc53238fe016b16fc2f9336194ebf31f7"} Oct 07 09:15:03 crc kubenswrapper[5025]: I1007 09:15:03.672833 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2840ca6be39db1dd94d1e6678e29affcc53238fe016b16fc2f9336194ebf31f7" Oct 07 09:15:03 crc kubenswrapper[5025]: I1007 09:15:03.673261 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330475-zrjws" Oct 07 09:15:04 crc kubenswrapper[5025]: I1007 09:15:04.003154 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330430-2f22g"] Oct 07 09:15:04 crc kubenswrapper[5025]: I1007 09:15:04.009181 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330430-2f22g"] Oct 07 09:15:05 crc kubenswrapper[5025]: I1007 09:15:05.925806 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="001b1491-5b33-47a0-a6ba-6d982c0df2da" path="/var/lib/kubelet/pods/001b1491-5b33-47a0-a6ba-6d982c0df2da/volumes" Oct 07 09:15:39 crc kubenswrapper[5025]: I1007 09:15:39.180655 5025 scope.go:117] "RemoveContainer" containerID="36789b3c4f8d3fb913309244bd63d467b9dd4f3e0919cede357288051041315b" Oct 07 09:15:39 crc kubenswrapper[5025]: I1007 09:15:39.839935 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kldq9"] Oct 07 09:15:39 crc kubenswrapper[5025]: E1007 09:15:39.840248 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d30ace14-d87f-428d-97c2-16f720c2e6de" containerName="collect-profiles" Oct 07 09:15:39 crc kubenswrapper[5025]: I1007 09:15:39.840265 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="d30ace14-d87f-428d-97c2-16f720c2e6de" containerName="collect-profiles" Oct 07 09:15:39 crc kubenswrapper[5025]: I1007 09:15:39.840429 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="d30ace14-d87f-428d-97c2-16f720c2e6de" containerName="collect-profiles" Oct 07 09:15:39 crc kubenswrapper[5025]: I1007 09:15:39.841499 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kldq9" Oct 07 09:15:39 crc kubenswrapper[5025]: I1007 09:15:39.857783 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kldq9"] Oct 07 09:15:39 crc kubenswrapper[5025]: I1007 09:15:39.906223 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc73e260-8654-4431-b0dc-6d0347778384-catalog-content\") pod \"redhat-operators-kldq9\" (UID: \"dc73e260-8654-4431-b0dc-6d0347778384\") " pod="openshift-marketplace/redhat-operators-kldq9" Oct 07 09:15:39 crc kubenswrapper[5025]: I1007 09:15:39.906279 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw8fn\" (UniqueName: \"kubernetes.io/projected/dc73e260-8654-4431-b0dc-6d0347778384-kube-api-access-mw8fn\") pod \"redhat-operators-kldq9\" (UID: \"dc73e260-8654-4431-b0dc-6d0347778384\") " pod="openshift-marketplace/redhat-operators-kldq9" Oct 07 09:15:39 crc kubenswrapper[5025]: I1007 09:15:39.906309 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc73e260-8654-4431-b0dc-6d0347778384-utilities\") pod \"redhat-operators-kldq9\" (UID: \"dc73e260-8654-4431-b0dc-6d0347778384\") " pod="openshift-marketplace/redhat-operators-kldq9" Oct 07 09:15:40 crc kubenswrapper[5025]: I1007 09:15:40.008083 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc73e260-8654-4431-b0dc-6d0347778384-catalog-content\") pod \"redhat-operators-kldq9\" (UID: \"dc73e260-8654-4431-b0dc-6d0347778384\") " pod="openshift-marketplace/redhat-operators-kldq9" Oct 07 09:15:40 crc kubenswrapper[5025]: I1007 09:15:40.008147 5025 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-mw8fn\" (UniqueName: \"kubernetes.io/projected/dc73e260-8654-4431-b0dc-6d0347778384-kube-api-access-mw8fn\") pod \"redhat-operators-kldq9\" (UID: \"dc73e260-8654-4431-b0dc-6d0347778384\") " pod="openshift-marketplace/redhat-operators-kldq9" Oct 07 09:15:40 crc kubenswrapper[5025]: I1007 09:15:40.008198 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc73e260-8654-4431-b0dc-6d0347778384-utilities\") pod \"redhat-operators-kldq9\" (UID: \"dc73e260-8654-4431-b0dc-6d0347778384\") " pod="openshift-marketplace/redhat-operators-kldq9" Oct 07 09:15:40 crc kubenswrapper[5025]: I1007 09:15:40.008717 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc73e260-8654-4431-b0dc-6d0347778384-catalog-content\") pod \"redhat-operators-kldq9\" (UID: \"dc73e260-8654-4431-b0dc-6d0347778384\") " pod="openshift-marketplace/redhat-operators-kldq9" Oct 07 09:15:40 crc kubenswrapper[5025]: I1007 09:15:40.008855 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc73e260-8654-4431-b0dc-6d0347778384-utilities\") pod \"redhat-operators-kldq9\" (UID: \"dc73e260-8654-4431-b0dc-6d0347778384\") " pod="openshift-marketplace/redhat-operators-kldq9" Oct 07 09:15:40 crc kubenswrapper[5025]: I1007 09:15:40.028994 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mw8fn\" (UniqueName: \"kubernetes.io/projected/dc73e260-8654-4431-b0dc-6d0347778384-kube-api-access-mw8fn\") pod \"redhat-operators-kldq9\" (UID: \"dc73e260-8654-4431-b0dc-6d0347778384\") " pod="openshift-marketplace/redhat-operators-kldq9" Oct 07 09:15:40 crc kubenswrapper[5025]: I1007 09:15:40.159041 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kldq9" Oct 07 09:15:40 crc kubenswrapper[5025]: I1007 09:15:40.576192 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kldq9"] Oct 07 09:15:40 crc kubenswrapper[5025]: I1007 09:15:40.981758 5025 generic.go:334] "Generic (PLEG): container finished" podID="dc73e260-8654-4431-b0dc-6d0347778384" containerID="15f263d6b19cf4f3c20e9ed41b2c42e80f5cefb0c64c61ec6640b2eebbcbe280" exitCode=0 Oct 07 09:15:40 crc kubenswrapper[5025]: I1007 09:15:40.981809 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kldq9" event={"ID":"dc73e260-8654-4431-b0dc-6d0347778384","Type":"ContainerDied","Data":"15f263d6b19cf4f3c20e9ed41b2c42e80f5cefb0c64c61ec6640b2eebbcbe280"} Oct 07 09:15:40 crc kubenswrapper[5025]: I1007 09:15:40.981860 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kldq9" event={"ID":"dc73e260-8654-4431-b0dc-6d0347778384","Type":"ContainerStarted","Data":"7d28e25d54226dc771113a67bfb3f730e5b30a7b096f4a99528849983e89f247"} Oct 07 09:15:48 crc kubenswrapper[5025]: I1007 09:15:48.040090 5025 generic.go:334] "Generic (PLEG): container finished" podID="dc73e260-8654-4431-b0dc-6d0347778384" containerID="529660d911deee0c3650b1ff15495a6f7a0605cb4c4cec1d420ea1a5e241200b" exitCode=0 Oct 07 09:15:48 crc kubenswrapper[5025]: I1007 09:15:48.040175 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kldq9" event={"ID":"dc73e260-8654-4431-b0dc-6d0347778384","Type":"ContainerDied","Data":"529660d911deee0c3650b1ff15495a6f7a0605cb4c4cec1d420ea1a5e241200b"} Oct 07 09:15:49 crc kubenswrapper[5025]: I1007 09:15:49.053400 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kldq9" 
event={"ID":"dc73e260-8654-4431-b0dc-6d0347778384","Type":"ContainerStarted","Data":"2829c8fe28e57f11269b841a4c571cc797926848dcaad83d3f4d814936a9bfe2"} Oct 07 09:15:49 crc kubenswrapper[5025]: I1007 09:15:49.077921 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kldq9" podStartSLOduration=2.534091599 podStartE2EDuration="10.077891874s" podCreationTimestamp="2025-10-07 09:15:39 +0000 UTC" firstStartedPulling="2025-10-07 09:15:40.983343707 +0000 UTC m=+3547.792657851" lastFinishedPulling="2025-10-07 09:15:48.527143982 +0000 UTC m=+3555.336458126" observedRunningTime="2025-10-07 09:15:49.071873393 +0000 UTC m=+3555.881187547" watchObservedRunningTime="2025-10-07 09:15:49.077891874 +0000 UTC m=+3555.887206018" Oct 07 09:15:50 crc kubenswrapper[5025]: I1007 09:15:50.159074 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kldq9" Oct 07 09:15:50 crc kubenswrapper[5025]: I1007 09:15:50.159137 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kldq9" Oct 07 09:15:51 crc kubenswrapper[5025]: I1007 09:15:51.201761 5025 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kldq9" podUID="dc73e260-8654-4431-b0dc-6d0347778384" containerName="registry-server" probeResult="failure" output=< Oct 07 09:15:51 crc kubenswrapper[5025]: timeout: failed to connect service ":50051" within 1s Oct 07 09:15:51 crc kubenswrapper[5025]: > Oct 07 09:15:55 crc kubenswrapper[5025]: I1007 09:15:55.934325 5025 patch_prober.go:28] interesting pod/machine-config-daemon-2dj2t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 09:15:55 crc kubenswrapper[5025]: I1007 09:15:55.934783 5025 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 09:16:00 crc kubenswrapper[5025]: I1007 09:16:00.199426 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kldq9" Oct 07 09:16:00 crc kubenswrapper[5025]: I1007 09:16:00.246149 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kldq9" Oct 07 09:16:00 crc kubenswrapper[5025]: I1007 09:16:00.300287 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kldq9"] Oct 07 09:16:00 crc kubenswrapper[5025]: I1007 09:16:00.432611 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ng87c"] Oct 07 09:16:00 crc kubenswrapper[5025]: I1007 09:16:00.432846 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ng87c" podUID="591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54" containerName="registry-server" containerID="cri-o://ccaeec9b9484abdc4b19621312fa623e7672c2ac2513ce1aff01bbd14d63ce93" gracePeriod=2 Oct 07 09:16:01 crc kubenswrapper[5025]: I1007 09:16:01.161652 5025 generic.go:334] "Generic (PLEG): container finished" podID="591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54" containerID="ccaeec9b9484abdc4b19621312fa623e7672c2ac2513ce1aff01bbd14d63ce93" exitCode=0 Oct 07 09:16:01 crc kubenswrapper[5025]: I1007 09:16:01.161709 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ng87c" event={"ID":"591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54","Type":"ContainerDied","Data":"ccaeec9b9484abdc4b19621312fa623e7672c2ac2513ce1aff01bbd14d63ce93"} Oct 07 09:16:03 
crc kubenswrapper[5025]: I1007 09:16:03.183989 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ng87c" event={"ID":"591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54","Type":"ContainerDied","Data":"7f19d332ed0d3f22f34f240b5200d5ab14107de45cca0d4c4c2371c9c291a464"} Oct 07 09:16:03 crc kubenswrapper[5025]: I1007 09:16:03.184599 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f19d332ed0d3f22f34f240b5200d5ab14107de45cca0d4c4c2371c9c291a464" Oct 07 09:16:03 crc kubenswrapper[5025]: I1007 09:16:03.186985 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ng87c" Oct 07 09:16:03 crc kubenswrapper[5025]: I1007 09:16:03.271248 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54-utilities\") pod \"591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54\" (UID: \"591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54\") " Oct 07 09:16:03 crc kubenswrapper[5025]: I1007 09:16:03.271324 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54-catalog-content\") pod \"591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54\" (UID: \"591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54\") " Oct 07 09:16:03 crc kubenswrapper[5025]: I1007 09:16:03.271377 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hkl8\" (UniqueName: \"kubernetes.io/projected/591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54-kube-api-access-9hkl8\") pod \"591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54\" (UID: \"591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54\") " Oct 07 09:16:03 crc kubenswrapper[5025]: I1007 09:16:03.272431 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54-utilities" 
(OuterVolumeSpecName: "utilities") pod "591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54" (UID: "591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 09:16:03 crc kubenswrapper[5025]: I1007 09:16:03.280633 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54-kube-api-access-9hkl8" (OuterVolumeSpecName: "kube-api-access-9hkl8") pod "591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54" (UID: "591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54"). InnerVolumeSpecName "kube-api-access-9hkl8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 09:16:03 crc kubenswrapper[5025]: I1007 09:16:03.373211 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hkl8\" (UniqueName: \"kubernetes.io/projected/591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54-kube-api-access-9hkl8\") on node \"crc\" DevicePath \"\"" Oct 07 09:16:03 crc kubenswrapper[5025]: I1007 09:16:03.373247 5025 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 09:16:03 crc kubenswrapper[5025]: I1007 09:16:03.388093 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54" (UID: "591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 09:16:03 crc kubenswrapper[5025]: I1007 09:16:03.474197 5025 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 09:16:04 crc kubenswrapper[5025]: I1007 09:16:04.191784 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ng87c" Oct 07 09:16:04 crc kubenswrapper[5025]: I1007 09:16:04.222293 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ng87c"] Oct 07 09:16:04 crc kubenswrapper[5025]: I1007 09:16:04.230012 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ng87c"] Oct 07 09:16:05 crc kubenswrapper[5025]: I1007 09:16:05.932212 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54" path="/var/lib/kubelet/pods/591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54/volumes" Oct 07 09:16:25 crc kubenswrapper[5025]: I1007 09:16:25.934136 5025 patch_prober.go:28] interesting pod/machine-config-daemon-2dj2t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 09:16:25 crc kubenswrapper[5025]: I1007 09:16:25.934834 5025 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 09:16:39 crc kubenswrapper[5025]: I1007 09:16:39.236614 5025 scope.go:117] "RemoveContainer" 
containerID="ccaeec9b9484abdc4b19621312fa623e7672c2ac2513ce1aff01bbd14d63ce93" Oct 07 09:16:39 crc kubenswrapper[5025]: I1007 09:16:39.269863 5025 scope.go:117] "RemoveContainer" containerID="6732473498fb506b3400b1d03a9558bfc813a268ebbdb3d50f0002d389831a5a" Oct 07 09:16:39 crc kubenswrapper[5025]: I1007 09:16:39.305659 5025 scope.go:117] "RemoveContainer" containerID="fc2b81f78a7371914005a1f32077998967154df30c435e665bb3233bb027e28c" Oct 07 09:16:39 crc kubenswrapper[5025]: I1007 09:16:39.898196 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gdvqs"] Oct 07 09:16:39 crc kubenswrapper[5025]: E1007 09:16:39.898973 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54" containerName="registry-server" Oct 07 09:16:39 crc kubenswrapper[5025]: I1007 09:16:39.898992 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54" containerName="registry-server" Oct 07 09:16:39 crc kubenswrapper[5025]: E1007 09:16:39.899016 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54" containerName="extract-utilities" Oct 07 09:16:39 crc kubenswrapper[5025]: I1007 09:16:39.899025 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54" containerName="extract-utilities" Oct 07 09:16:39 crc kubenswrapper[5025]: E1007 09:16:39.899064 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54" containerName="extract-content" Oct 07 09:16:39 crc kubenswrapper[5025]: I1007 09:16:39.899073 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54" containerName="extract-content" Oct 07 09:16:39 crc kubenswrapper[5025]: I1007 09:16:39.899239 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="591e97f7-f7ba-4c7c-9b8d-ce7c315a1f54" 
containerName="registry-server" Oct 07 09:16:39 crc kubenswrapper[5025]: I1007 09:16:39.900760 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gdvqs" Oct 07 09:16:39 crc kubenswrapper[5025]: I1007 09:16:39.908843 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gdvqs"] Oct 07 09:16:39 crc kubenswrapper[5025]: I1007 09:16:39.976103 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8b4b325-1b47-489f-ae6d-239147a5465a-catalog-content\") pod \"community-operators-gdvqs\" (UID: \"c8b4b325-1b47-489f-ae6d-239147a5465a\") " pod="openshift-marketplace/community-operators-gdvqs" Oct 07 09:16:39 crc kubenswrapper[5025]: I1007 09:16:39.976454 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8b4b325-1b47-489f-ae6d-239147a5465a-utilities\") pod \"community-operators-gdvqs\" (UID: \"c8b4b325-1b47-489f-ae6d-239147a5465a\") " pod="openshift-marketplace/community-operators-gdvqs" Oct 07 09:16:39 crc kubenswrapper[5025]: I1007 09:16:39.976651 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzhlz\" (UniqueName: \"kubernetes.io/projected/c8b4b325-1b47-489f-ae6d-239147a5465a-kube-api-access-bzhlz\") pod \"community-operators-gdvqs\" (UID: \"c8b4b325-1b47-489f-ae6d-239147a5465a\") " pod="openshift-marketplace/community-operators-gdvqs" Oct 07 09:16:40 crc kubenswrapper[5025]: I1007 09:16:40.078243 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzhlz\" (UniqueName: \"kubernetes.io/projected/c8b4b325-1b47-489f-ae6d-239147a5465a-kube-api-access-bzhlz\") pod \"community-operators-gdvqs\" (UID: \"c8b4b325-1b47-489f-ae6d-239147a5465a\") 
" pod="openshift-marketplace/community-operators-gdvqs" Oct 07 09:16:40 crc kubenswrapper[5025]: I1007 09:16:40.078528 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8b4b325-1b47-489f-ae6d-239147a5465a-catalog-content\") pod \"community-operators-gdvqs\" (UID: \"c8b4b325-1b47-489f-ae6d-239147a5465a\") " pod="openshift-marketplace/community-operators-gdvqs" Oct 07 09:16:40 crc kubenswrapper[5025]: I1007 09:16:40.078740 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8b4b325-1b47-489f-ae6d-239147a5465a-utilities\") pod \"community-operators-gdvqs\" (UID: \"c8b4b325-1b47-489f-ae6d-239147a5465a\") " pod="openshift-marketplace/community-operators-gdvqs" Oct 07 09:16:40 crc kubenswrapper[5025]: I1007 09:16:40.079378 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8b4b325-1b47-489f-ae6d-239147a5465a-catalog-content\") pod \"community-operators-gdvqs\" (UID: \"c8b4b325-1b47-489f-ae6d-239147a5465a\") " pod="openshift-marketplace/community-operators-gdvqs" Oct 07 09:16:40 crc kubenswrapper[5025]: I1007 09:16:40.079517 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8b4b325-1b47-489f-ae6d-239147a5465a-utilities\") pod \"community-operators-gdvqs\" (UID: \"c8b4b325-1b47-489f-ae6d-239147a5465a\") " pod="openshift-marketplace/community-operators-gdvqs" Oct 07 09:16:40 crc kubenswrapper[5025]: I1007 09:16:40.102237 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzhlz\" (UniqueName: \"kubernetes.io/projected/c8b4b325-1b47-489f-ae6d-239147a5465a-kube-api-access-bzhlz\") pod \"community-operators-gdvqs\" (UID: \"c8b4b325-1b47-489f-ae6d-239147a5465a\") " 
pod="openshift-marketplace/community-operators-gdvqs" Oct 07 09:16:40 crc kubenswrapper[5025]: I1007 09:16:40.235847 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gdvqs" Oct 07 09:16:40 crc kubenswrapper[5025]: I1007 09:16:40.729986 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gdvqs"] Oct 07 09:16:41 crc kubenswrapper[5025]: I1007 09:16:41.524947 5025 generic.go:334] "Generic (PLEG): container finished" podID="c8b4b325-1b47-489f-ae6d-239147a5465a" containerID="94d744bf33c20860450437ae74ec74952e2928adf1cd58edd98ffa1e706dc16a" exitCode=0 Oct 07 09:16:41 crc kubenswrapper[5025]: I1007 09:16:41.525017 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gdvqs" event={"ID":"c8b4b325-1b47-489f-ae6d-239147a5465a","Type":"ContainerDied","Data":"94d744bf33c20860450437ae74ec74952e2928adf1cd58edd98ffa1e706dc16a"} Oct 07 09:16:41 crc kubenswrapper[5025]: I1007 09:16:41.525360 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gdvqs" event={"ID":"c8b4b325-1b47-489f-ae6d-239147a5465a","Type":"ContainerStarted","Data":"cf9f27f560e587d4392af73af2d5ace973734db22445ea8b3561bc999ca83b17"} Oct 07 09:16:41 crc kubenswrapper[5025]: I1007 09:16:41.528811 5025 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 09:16:42 crc kubenswrapper[5025]: I1007 09:16:42.542832 5025 generic.go:334] "Generic (PLEG): container finished" podID="c8b4b325-1b47-489f-ae6d-239147a5465a" containerID="71534c06c8f83e601ad7662c8869d1287a64697a40443d19bca3d75738ccd81a" exitCode=0 Oct 07 09:16:42 crc kubenswrapper[5025]: I1007 09:16:42.542911 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gdvqs" 
event={"ID":"c8b4b325-1b47-489f-ae6d-239147a5465a","Type":"ContainerDied","Data":"71534c06c8f83e601ad7662c8869d1287a64697a40443d19bca3d75738ccd81a"} Oct 07 09:16:43 crc kubenswrapper[5025]: I1007 09:16:43.551709 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gdvqs" event={"ID":"c8b4b325-1b47-489f-ae6d-239147a5465a","Type":"ContainerStarted","Data":"d2c8a53bc6fe69ee7ba519f06deb0d8baa7fb0cc7cd16ac031d453024a522ad0"} Oct 07 09:16:43 crc kubenswrapper[5025]: I1007 09:16:43.576286 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gdvqs" podStartSLOduration=3.139556764 podStartE2EDuration="4.576269152s" podCreationTimestamp="2025-10-07 09:16:39 +0000 UTC" firstStartedPulling="2025-10-07 09:16:41.528467459 +0000 UTC m=+3608.337781613" lastFinishedPulling="2025-10-07 09:16:42.965179847 +0000 UTC m=+3609.774494001" observedRunningTime="2025-10-07 09:16:43.572303827 +0000 UTC m=+3610.381618001" watchObservedRunningTime="2025-10-07 09:16:43.576269152 +0000 UTC m=+3610.385583296" Oct 07 09:16:50 crc kubenswrapper[5025]: I1007 09:16:50.236668 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gdvqs" Oct 07 09:16:50 crc kubenswrapper[5025]: I1007 09:16:50.237359 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gdvqs" Oct 07 09:16:50 crc kubenswrapper[5025]: I1007 09:16:50.290034 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gdvqs" Oct 07 09:16:50 crc kubenswrapper[5025]: I1007 09:16:50.669967 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gdvqs" Oct 07 09:16:50 crc kubenswrapper[5025]: I1007 09:16:50.725826 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-gdvqs"] Oct 07 09:16:52 crc kubenswrapper[5025]: I1007 09:16:52.646334 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gdvqs" podUID="c8b4b325-1b47-489f-ae6d-239147a5465a" containerName="registry-server" containerID="cri-o://d2c8a53bc6fe69ee7ba519f06deb0d8baa7fb0cc7cd16ac031d453024a522ad0" gracePeriod=2 Oct 07 09:16:53 crc kubenswrapper[5025]: I1007 09:16:53.098413 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gdvqs" Oct 07 09:16:53 crc kubenswrapper[5025]: I1007 09:16:53.189228 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzhlz\" (UniqueName: \"kubernetes.io/projected/c8b4b325-1b47-489f-ae6d-239147a5465a-kube-api-access-bzhlz\") pod \"c8b4b325-1b47-489f-ae6d-239147a5465a\" (UID: \"c8b4b325-1b47-489f-ae6d-239147a5465a\") " Oct 07 09:16:53 crc kubenswrapper[5025]: I1007 09:16:53.189922 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8b4b325-1b47-489f-ae6d-239147a5465a-catalog-content\") pod \"c8b4b325-1b47-489f-ae6d-239147a5465a\" (UID: \"c8b4b325-1b47-489f-ae6d-239147a5465a\") " Oct 07 09:16:53 crc kubenswrapper[5025]: I1007 09:16:53.189952 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8b4b325-1b47-489f-ae6d-239147a5465a-utilities\") pod \"c8b4b325-1b47-489f-ae6d-239147a5465a\" (UID: \"c8b4b325-1b47-489f-ae6d-239147a5465a\") " Oct 07 09:16:53 crc kubenswrapper[5025]: I1007 09:16:53.191280 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8b4b325-1b47-489f-ae6d-239147a5465a-utilities" (OuterVolumeSpecName: "utilities") pod "c8b4b325-1b47-489f-ae6d-239147a5465a" (UID: 
"c8b4b325-1b47-489f-ae6d-239147a5465a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 09:16:53 crc kubenswrapper[5025]: I1007 09:16:53.200272 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8b4b325-1b47-489f-ae6d-239147a5465a-kube-api-access-bzhlz" (OuterVolumeSpecName: "kube-api-access-bzhlz") pod "c8b4b325-1b47-489f-ae6d-239147a5465a" (UID: "c8b4b325-1b47-489f-ae6d-239147a5465a"). InnerVolumeSpecName "kube-api-access-bzhlz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 09:16:53 crc kubenswrapper[5025]: I1007 09:16:53.250349 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8b4b325-1b47-489f-ae6d-239147a5465a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c8b4b325-1b47-489f-ae6d-239147a5465a" (UID: "c8b4b325-1b47-489f-ae6d-239147a5465a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 09:16:53 crc kubenswrapper[5025]: I1007 09:16:53.292656 5025 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8b4b325-1b47-489f-ae6d-239147a5465a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 09:16:53 crc kubenswrapper[5025]: I1007 09:16:53.292741 5025 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8b4b325-1b47-489f-ae6d-239147a5465a-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 09:16:53 crc kubenswrapper[5025]: I1007 09:16:53.292764 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzhlz\" (UniqueName: \"kubernetes.io/projected/c8b4b325-1b47-489f-ae6d-239147a5465a-kube-api-access-bzhlz\") on node \"crc\" DevicePath \"\"" Oct 07 09:16:53 crc kubenswrapper[5025]: I1007 09:16:53.660532 5025 generic.go:334] "Generic (PLEG): container finished" 
podID="c8b4b325-1b47-489f-ae6d-239147a5465a" containerID="d2c8a53bc6fe69ee7ba519f06deb0d8baa7fb0cc7cd16ac031d453024a522ad0" exitCode=0 Oct 07 09:16:53 crc kubenswrapper[5025]: I1007 09:16:53.660606 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gdvqs" event={"ID":"c8b4b325-1b47-489f-ae6d-239147a5465a","Type":"ContainerDied","Data":"d2c8a53bc6fe69ee7ba519f06deb0d8baa7fb0cc7cd16ac031d453024a522ad0"} Oct 07 09:16:53 crc kubenswrapper[5025]: I1007 09:16:53.661699 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gdvqs" event={"ID":"c8b4b325-1b47-489f-ae6d-239147a5465a","Type":"ContainerDied","Data":"cf9f27f560e587d4392af73af2d5ace973734db22445ea8b3561bc999ca83b17"} Oct 07 09:16:53 crc kubenswrapper[5025]: I1007 09:16:53.661737 5025 scope.go:117] "RemoveContainer" containerID="d2c8a53bc6fe69ee7ba519f06deb0d8baa7fb0cc7cd16ac031d453024a522ad0" Oct 07 09:16:53 crc kubenswrapper[5025]: I1007 09:16:53.660621 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gdvqs" Oct 07 09:16:53 crc kubenswrapper[5025]: I1007 09:16:53.708499 5025 scope.go:117] "RemoveContainer" containerID="71534c06c8f83e601ad7662c8869d1287a64697a40443d19bca3d75738ccd81a" Oct 07 09:16:53 crc kubenswrapper[5025]: I1007 09:16:53.722693 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gdvqs"] Oct 07 09:16:53 crc kubenswrapper[5025]: I1007 09:16:53.734795 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gdvqs"] Oct 07 09:16:53 crc kubenswrapper[5025]: I1007 09:16:53.747698 5025 scope.go:117] "RemoveContainer" containerID="94d744bf33c20860450437ae74ec74952e2928adf1cd58edd98ffa1e706dc16a" Oct 07 09:16:53 crc kubenswrapper[5025]: I1007 09:16:53.767453 5025 scope.go:117] "RemoveContainer" containerID="d2c8a53bc6fe69ee7ba519f06deb0d8baa7fb0cc7cd16ac031d453024a522ad0" Oct 07 09:16:53 crc kubenswrapper[5025]: E1007 09:16:53.767973 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2c8a53bc6fe69ee7ba519f06deb0d8baa7fb0cc7cd16ac031d453024a522ad0\": container with ID starting with d2c8a53bc6fe69ee7ba519f06deb0d8baa7fb0cc7cd16ac031d453024a522ad0 not found: ID does not exist" containerID="d2c8a53bc6fe69ee7ba519f06deb0d8baa7fb0cc7cd16ac031d453024a522ad0" Oct 07 09:16:53 crc kubenswrapper[5025]: I1007 09:16:53.768015 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2c8a53bc6fe69ee7ba519f06deb0d8baa7fb0cc7cd16ac031d453024a522ad0"} err="failed to get container status \"d2c8a53bc6fe69ee7ba519f06deb0d8baa7fb0cc7cd16ac031d453024a522ad0\": rpc error: code = NotFound desc = could not find container \"d2c8a53bc6fe69ee7ba519f06deb0d8baa7fb0cc7cd16ac031d453024a522ad0\": container with ID starting with d2c8a53bc6fe69ee7ba519f06deb0d8baa7fb0cc7cd16ac031d453024a522ad0 not 
found: ID does not exist" Oct 07 09:16:53 crc kubenswrapper[5025]: I1007 09:16:53.768039 5025 scope.go:117] "RemoveContainer" containerID="71534c06c8f83e601ad7662c8869d1287a64697a40443d19bca3d75738ccd81a" Oct 07 09:16:53 crc kubenswrapper[5025]: E1007 09:16:53.768412 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71534c06c8f83e601ad7662c8869d1287a64697a40443d19bca3d75738ccd81a\": container with ID starting with 71534c06c8f83e601ad7662c8869d1287a64697a40443d19bca3d75738ccd81a not found: ID does not exist" containerID="71534c06c8f83e601ad7662c8869d1287a64697a40443d19bca3d75738ccd81a" Oct 07 09:16:53 crc kubenswrapper[5025]: I1007 09:16:53.768449 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71534c06c8f83e601ad7662c8869d1287a64697a40443d19bca3d75738ccd81a"} err="failed to get container status \"71534c06c8f83e601ad7662c8869d1287a64697a40443d19bca3d75738ccd81a\": rpc error: code = NotFound desc = could not find container \"71534c06c8f83e601ad7662c8869d1287a64697a40443d19bca3d75738ccd81a\": container with ID starting with 71534c06c8f83e601ad7662c8869d1287a64697a40443d19bca3d75738ccd81a not found: ID does not exist" Oct 07 09:16:53 crc kubenswrapper[5025]: I1007 09:16:53.768485 5025 scope.go:117] "RemoveContainer" containerID="94d744bf33c20860450437ae74ec74952e2928adf1cd58edd98ffa1e706dc16a" Oct 07 09:16:53 crc kubenswrapper[5025]: E1007 09:16:53.768980 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94d744bf33c20860450437ae74ec74952e2928adf1cd58edd98ffa1e706dc16a\": container with ID starting with 94d744bf33c20860450437ae74ec74952e2928adf1cd58edd98ffa1e706dc16a not found: ID does not exist" containerID="94d744bf33c20860450437ae74ec74952e2928adf1cd58edd98ffa1e706dc16a" Oct 07 09:16:53 crc kubenswrapper[5025]: I1007 09:16:53.769023 5025 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94d744bf33c20860450437ae74ec74952e2928adf1cd58edd98ffa1e706dc16a"} err="failed to get container status \"94d744bf33c20860450437ae74ec74952e2928adf1cd58edd98ffa1e706dc16a\": rpc error: code = NotFound desc = could not find container \"94d744bf33c20860450437ae74ec74952e2928adf1cd58edd98ffa1e706dc16a\": container with ID starting with 94d744bf33c20860450437ae74ec74952e2928adf1cd58edd98ffa1e706dc16a not found: ID does not exist" Oct 07 09:16:53 crc kubenswrapper[5025]: I1007 09:16:53.927265 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8b4b325-1b47-489f-ae6d-239147a5465a" path="/var/lib/kubelet/pods/c8b4b325-1b47-489f-ae6d-239147a5465a/volumes" Oct 07 09:16:55 crc kubenswrapper[5025]: I1007 09:16:55.934691 5025 patch_prober.go:28] interesting pod/machine-config-daemon-2dj2t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 09:16:55 crc kubenswrapper[5025]: I1007 09:16:55.935096 5025 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 09:16:55 crc kubenswrapper[5025]: I1007 09:16:55.935156 5025 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" Oct 07 09:16:55 crc kubenswrapper[5025]: I1007 09:16:55.935891 5025 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"b5abd8f3ab0bf8a06bd24a57352f13563a3f5b1e22b0a7f93ba506beef5c9513"} pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 09:16:55 crc kubenswrapper[5025]: I1007 09:16:55.935954 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" containerID="cri-o://b5abd8f3ab0bf8a06bd24a57352f13563a3f5b1e22b0a7f93ba506beef5c9513" gracePeriod=600 Oct 07 09:16:56 crc kubenswrapper[5025]: I1007 09:16:56.693257 5025 generic.go:334] "Generic (PLEG): container finished" podID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerID="b5abd8f3ab0bf8a06bd24a57352f13563a3f5b1e22b0a7f93ba506beef5c9513" exitCode=0 Oct 07 09:16:56 crc kubenswrapper[5025]: I1007 09:16:56.693294 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" event={"ID":"b4849c41-22e1-400e-8e11-096da49ef1b2","Type":"ContainerDied","Data":"b5abd8f3ab0bf8a06bd24a57352f13563a3f5b1e22b0a7f93ba506beef5c9513"} Oct 07 09:16:56 crc kubenswrapper[5025]: I1007 09:16:56.693352 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" event={"ID":"b4849c41-22e1-400e-8e11-096da49ef1b2","Type":"ContainerStarted","Data":"e7ed16dc80623e565d948227f9c6f1bb967136f12f0c424382b8ded9301a6c2b"} Oct 07 09:16:56 crc kubenswrapper[5025]: I1007 09:16:56.693378 5025 scope.go:117] "RemoveContainer" containerID="cd39ce4892b6706e193e216b5c562971ededa4bd53a3ddd3b48f8014398f9f87" Oct 07 09:19:25 crc kubenswrapper[5025]: I1007 09:19:25.934702 5025 patch_prober.go:28] interesting pod/machine-config-daemon-2dj2t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 09:19:25 crc kubenswrapper[5025]: I1007 09:19:25.936436 5025 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 09:19:55 crc kubenswrapper[5025]: I1007 09:19:55.934657 5025 patch_prober.go:28] interesting pod/machine-config-daemon-2dj2t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 09:19:55 crc kubenswrapper[5025]: I1007 09:19:55.935392 5025 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 09:20:25 crc kubenswrapper[5025]: I1007 09:20:25.934394 5025 patch_prober.go:28] interesting pod/machine-config-daemon-2dj2t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 09:20:25 crc kubenswrapper[5025]: I1007 09:20:25.935154 5025 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" Oct 07 09:20:25 crc kubenswrapper[5025]: I1007 09:20:25.935230 5025 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" Oct 07 09:20:25 crc kubenswrapper[5025]: I1007 09:20:25.936071 5025 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e7ed16dc80623e565d948227f9c6f1bb967136f12f0c424382b8ded9301a6c2b"} pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 09:20:25 crc kubenswrapper[5025]: I1007 09:20:25.936151 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" containerID="cri-o://e7ed16dc80623e565d948227f9c6f1bb967136f12f0c424382b8ded9301a6c2b" gracePeriod=600 Oct 07 09:20:26 crc kubenswrapper[5025]: E1007 09:20:26.080743 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:20:26 crc kubenswrapper[5025]: I1007 09:20:26.617493 5025 generic.go:334] "Generic (PLEG): container finished" podID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerID="e7ed16dc80623e565d948227f9c6f1bb967136f12f0c424382b8ded9301a6c2b" exitCode=0 Oct 07 09:20:26 crc kubenswrapper[5025]: I1007 09:20:26.617566 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" 
event={"ID":"b4849c41-22e1-400e-8e11-096da49ef1b2","Type":"ContainerDied","Data":"e7ed16dc80623e565d948227f9c6f1bb967136f12f0c424382b8ded9301a6c2b"} Oct 07 09:20:26 crc kubenswrapper[5025]: I1007 09:20:26.617615 5025 scope.go:117] "RemoveContainer" containerID="b5abd8f3ab0bf8a06bd24a57352f13563a3f5b1e22b0a7f93ba506beef5c9513" Oct 07 09:20:26 crc kubenswrapper[5025]: I1007 09:20:26.618343 5025 scope.go:117] "RemoveContainer" containerID="e7ed16dc80623e565d948227f9c6f1bb967136f12f0c424382b8ded9301a6c2b" Oct 07 09:20:26 crc kubenswrapper[5025]: E1007 09:20:26.618665 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:20:37 crc kubenswrapper[5025]: I1007 09:20:37.915101 5025 scope.go:117] "RemoveContainer" containerID="e7ed16dc80623e565d948227f9c6f1bb967136f12f0c424382b8ded9301a6c2b" Oct 07 09:20:37 crc kubenswrapper[5025]: E1007 09:20:37.917348 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:20:51 crc kubenswrapper[5025]: I1007 09:20:51.914905 5025 scope.go:117] "RemoveContainer" containerID="e7ed16dc80623e565d948227f9c6f1bb967136f12f0c424382b8ded9301a6c2b" Oct 07 09:20:51 crc kubenswrapper[5025]: E1007 09:20:51.917360 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:21:03 crc kubenswrapper[5025]: I1007 09:21:03.921923 5025 scope.go:117] "RemoveContainer" containerID="e7ed16dc80623e565d948227f9c6f1bb967136f12f0c424382b8ded9301a6c2b" Oct 07 09:21:03 crc kubenswrapper[5025]: E1007 09:21:03.923108 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:21:16 crc kubenswrapper[5025]: I1007 09:21:16.915028 5025 scope.go:117] "RemoveContainer" containerID="e7ed16dc80623e565d948227f9c6f1bb967136f12f0c424382b8ded9301a6c2b" Oct 07 09:21:16 crc kubenswrapper[5025]: E1007 09:21:16.916222 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:21:29 crc kubenswrapper[5025]: I1007 09:21:29.914243 5025 scope.go:117] "RemoveContainer" containerID="e7ed16dc80623e565d948227f9c6f1bb967136f12f0c424382b8ded9301a6c2b" Oct 07 09:21:29 crc kubenswrapper[5025]: E1007 09:21:29.916650 5025 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:21:42 crc kubenswrapper[5025]: I1007 09:21:42.915370 5025 scope.go:117] "RemoveContainer" containerID="e7ed16dc80623e565d948227f9c6f1bb967136f12f0c424382b8ded9301a6c2b" Oct 07 09:21:42 crc kubenswrapper[5025]: E1007 09:21:42.916315 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:21:56 crc kubenswrapper[5025]: I1007 09:21:56.914714 5025 scope.go:117] "RemoveContainer" containerID="e7ed16dc80623e565d948227f9c6f1bb967136f12f0c424382b8ded9301a6c2b" Oct 07 09:21:56 crc kubenswrapper[5025]: E1007 09:21:56.915815 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:22:11 crc kubenswrapper[5025]: I1007 09:22:11.915715 5025 scope.go:117] "RemoveContainer" containerID="e7ed16dc80623e565d948227f9c6f1bb967136f12f0c424382b8ded9301a6c2b" Oct 07 09:22:11 crc kubenswrapper[5025]: E1007 09:22:11.916467 5025 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:22:25 crc kubenswrapper[5025]: I1007 09:22:25.915066 5025 scope.go:117] "RemoveContainer" containerID="e7ed16dc80623e565d948227f9c6f1bb967136f12f0c424382b8ded9301a6c2b" Oct 07 09:22:25 crc kubenswrapper[5025]: E1007 09:22:25.916184 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:22:38 crc kubenswrapper[5025]: I1007 09:22:38.915778 5025 scope.go:117] "RemoveContainer" containerID="e7ed16dc80623e565d948227f9c6f1bb967136f12f0c424382b8ded9301a6c2b" Oct 07 09:22:38 crc kubenswrapper[5025]: E1007 09:22:38.917120 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:22:52 crc kubenswrapper[5025]: I1007 09:22:52.915362 5025 scope.go:117] "RemoveContainer" containerID="e7ed16dc80623e565d948227f9c6f1bb967136f12f0c424382b8ded9301a6c2b" Oct 07 09:22:52 crc kubenswrapper[5025]: E1007 09:22:52.916393 5025 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:22:56 crc kubenswrapper[5025]: I1007 09:22:56.226851 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-j6rc5"] Oct 07 09:22:56 crc kubenswrapper[5025]: E1007 09:22:56.227362 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8b4b325-1b47-489f-ae6d-239147a5465a" containerName="registry-server" Oct 07 09:22:56 crc kubenswrapper[5025]: I1007 09:22:56.227383 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8b4b325-1b47-489f-ae6d-239147a5465a" containerName="registry-server" Oct 07 09:22:56 crc kubenswrapper[5025]: E1007 09:22:56.227412 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8b4b325-1b47-489f-ae6d-239147a5465a" containerName="extract-content" Oct 07 09:22:56 crc kubenswrapper[5025]: I1007 09:22:56.227421 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8b4b325-1b47-489f-ae6d-239147a5465a" containerName="extract-content" Oct 07 09:22:56 crc kubenswrapper[5025]: E1007 09:22:56.227442 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8b4b325-1b47-489f-ae6d-239147a5465a" containerName="extract-utilities" Oct 07 09:22:56 crc kubenswrapper[5025]: I1007 09:22:56.227449 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8b4b325-1b47-489f-ae6d-239147a5465a" containerName="extract-utilities" Oct 07 09:22:56 crc kubenswrapper[5025]: I1007 09:22:56.227628 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8b4b325-1b47-489f-ae6d-239147a5465a" containerName="registry-server" Oct 07 
09:22:56 crc kubenswrapper[5025]: I1007 09:22:56.229161 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j6rc5" Oct 07 09:22:56 crc kubenswrapper[5025]: I1007 09:22:56.244191 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j6rc5"] Oct 07 09:22:56 crc kubenswrapper[5025]: I1007 09:22:56.373259 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c7c5ec1-fa50-4cf4-a416-233a0efd845e-catalog-content\") pod \"certified-operators-j6rc5\" (UID: \"7c7c5ec1-fa50-4cf4-a416-233a0efd845e\") " pod="openshift-marketplace/certified-operators-j6rc5" Oct 07 09:22:56 crc kubenswrapper[5025]: I1007 09:22:56.373726 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c7c5ec1-fa50-4cf4-a416-233a0efd845e-utilities\") pod \"certified-operators-j6rc5\" (UID: \"7c7c5ec1-fa50-4cf4-a416-233a0efd845e\") " pod="openshift-marketplace/certified-operators-j6rc5" Oct 07 09:22:56 crc kubenswrapper[5025]: I1007 09:22:56.373755 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fl78\" (UniqueName: \"kubernetes.io/projected/7c7c5ec1-fa50-4cf4-a416-233a0efd845e-kube-api-access-5fl78\") pod \"certified-operators-j6rc5\" (UID: \"7c7c5ec1-fa50-4cf4-a416-233a0efd845e\") " pod="openshift-marketplace/certified-operators-j6rc5" Oct 07 09:22:56 crc kubenswrapper[5025]: I1007 09:22:56.475600 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c7c5ec1-fa50-4cf4-a416-233a0efd845e-catalog-content\") pod \"certified-operators-j6rc5\" (UID: \"7c7c5ec1-fa50-4cf4-a416-233a0efd845e\") " 
pod="openshift-marketplace/certified-operators-j6rc5" Oct 07 09:22:56 crc kubenswrapper[5025]: I1007 09:22:56.475675 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c7c5ec1-fa50-4cf4-a416-233a0efd845e-utilities\") pod \"certified-operators-j6rc5\" (UID: \"7c7c5ec1-fa50-4cf4-a416-233a0efd845e\") " pod="openshift-marketplace/certified-operators-j6rc5" Oct 07 09:22:56 crc kubenswrapper[5025]: I1007 09:22:56.475718 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fl78\" (UniqueName: \"kubernetes.io/projected/7c7c5ec1-fa50-4cf4-a416-233a0efd845e-kube-api-access-5fl78\") pod \"certified-operators-j6rc5\" (UID: \"7c7c5ec1-fa50-4cf4-a416-233a0efd845e\") " pod="openshift-marketplace/certified-operators-j6rc5" Oct 07 09:22:56 crc kubenswrapper[5025]: I1007 09:22:56.476388 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c7c5ec1-fa50-4cf4-a416-233a0efd845e-utilities\") pod \"certified-operators-j6rc5\" (UID: \"7c7c5ec1-fa50-4cf4-a416-233a0efd845e\") " pod="openshift-marketplace/certified-operators-j6rc5" Oct 07 09:22:56 crc kubenswrapper[5025]: I1007 09:22:56.476388 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c7c5ec1-fa50-4cf4-a416-233a0efd845e-catalog-content\") pod \"certified-operators-j6rc5\" (UID: \"7c7c5ec1-fa50-4cf4-a416-233a0efd845e\") " pod="openshift-marketplace/certified-operators-j6rc5" Oct 07 09:22:56 crc kubenswrapper[5025]: I1007 09:22:56.499506 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fl78\" (UniqueName: \"kubernetes.io/projected/7c7c5ec1-fa50-4cf4-a416-233a0efd845e-kube-api-access-5fl78\") pod \"certified-operators-j6rc5\" (UID: \"7c7c5ec1-fa50-4cf4-a416-233a0efd845e\") " 
pod="openshift-marketplace/certified-operators-j6rc5" Oct 07 09:22:56 crc kubenswrapper[5025]: I1007 09:22:56.551056 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j6rc5" Oct 07 09:22:57 crc kubenswrapper[5025]: I1007 09:22:57.093520 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j6rc5"] Oct 07 09:22:57 crc kubenswrapper[5025]: I1007 09:22:57.903941 5025 generic.go:334] "Generic (PLEG): container finished" podID="7c7c5ec1-fa50-4cf4-a416-233a0efd845e" containerID="cde0e87ccc3c7ad55693c13642db72ababad220faf4e2b4e562b0231e20c730b" exitCode=0 Oct 07 09:22:57 crc kubenswrapper[5025]: I1007 09:22:57.904102 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j6rc5" event={"ID":"7c7c5ec1-fa50-4cf4-a416-233a0efd845e","Type":"ContainerDied","Data":"cde0e87ccc3c7ad55693c13642db72ababad220faf4e2b4e562b0231e20c730b"} Oct 07 09:22:57 crc kubenswrapper[5025]: I1007 09:22:57.908709 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j6rc5" event={"ID":"7c7c5ec1-fa50-4cf4-a416-233a0efd845e","Type":"ContainerStarted","Data":"aad14ae1f86a445717229382bc0ede9e258867cc1bfcae336dc8f3e09839e508"} Oct 07 09:22:57 crc kubenswrapper[5025]: I1007 09:22:57.906916 5025 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 09:23:02 crc kubenswrapper[5025]: I1007 09:23:02.965848 5025 generic.go:334] "Generic (PLEG): container finished" podID="7c7c5ec1-fa50-4cf4-a416-233a0efd845e" containerID="77ae1eb56737616b310b746aaf0f56f1c079c5a79651c9db579d7de319e9da7a" exitCode=0 Oct 07 09:23:02 crc kubenswrapper[5025]: I1007 09:23:02.965966 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j6rc5" 
event={"ID":"7c7c5ec1-fa50-4cf4-a416-233a0efd845e","Type":"ContainerDied","Data":"77ae1eb56737616b310b746aaf0f56f1c079c5a79651c9db579d7de319e9da7a"} Oct 07 09:23:03 crc kubenswrapper[5025]: I1007 09:23:03.921260 5025 scope.go:117] "RemoveContainer" containerID="e7ed16dc80623e565d948227f9c6f1bb967136f12f0c424382b8ded9301a6c2b" Oct 07 09:23:03 crc kubenswrapper[5025]: E1007 09:23:03.922154 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:23:03 crc kubenswrapper[5025]: I1007 09:23:03.977881 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j6rc5" event={"ID":"7c7c5ec1-fa50-4cf4-a416-233a0efd845e","Type":"ContainerStarted","Data":"5c47a72d05b4ab02afba452e5e639507776ef2e83fbde953325c563d13934c5c"} Oct 07 09:23:04 crc kubenswrapper[5025]: I1007 09:23:04.021126 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-j6rc5" podStartSLOduration=2.470882597 podStartE2EDuration="8.021083757s" podCreationTimestamp="2025-10-07 09:22:56 +0000 UTC" firstStartedPulling="2025-10-07 09:22:57.906052308 +0000 UTC m=+3984.715366452" lastFinishedPulling="2025-10-07 09:23:03.456253448 +0000 UTC m=+3990.265567612" observedRunningTime="2025-10-07 09:23:04.018201697 +0000 UTC m=+3990.827515851" watchObservedRunningTime="2025-10-07 09:23:04.021083757 +0000 UTC m=+3990.830397941" Oct 07 09:23:06 crc kubenswrapper[5025]: I1007 09:23:06.551374 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-j6rc5" Oct 07 09:23:06 crc 
kubenswrapper[5025]: I1007 09:23:06.551908 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-j6rc5" Oct 07 09:23:06 crc kubenswrapper[5025]: I1007 09:23:06.636227 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-j6rc5" Oct 07 09:23:15 crc kubenswrapper[5025]: I1007 09:23:15.914187 5025 scope.go:117] "RemoveContainer" containerID="e7ed16dc80623e565d948227f9c6f1bb967136f12f0c424382b8ded9301a6c2b" Oct 07 09:23:15 crc kubenswrapper[5025]: E1007 09:23:15.915197 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:23:16 crc kubenswrapper[5025]: I1007 09:23:16.600673 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-j6rc5" Oct 07 09:23:16 crc kubenswrapper[5025]: I1007 09:23:16.673095 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j6rc5"] Oct 07 09:23:16 crc kubenswrapper[5025]: I1007 09:23:16.710872 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ftlxd"] Oct 07 09:23:16 crc kubenswrapper[5025]: I1007 09:23:16.711221 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ftlxd" podUID="8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea" containerName="registry-server" containerID="cri-o://fd8c81f95718b3fb0fe6898dec5d19cdb2eb10d7a7af43230960090e2d85d1a1" gracePeriod=2 Oct 07 09:23:17 crc kubenswrapper[5025]: I1007 09:23:17.113397 
5025 generic.go:334] "Generic (PLEG): container finished" podID="8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea" containerID="fd8c81f95718b3fb0fe6898dec5d19cdb2eb10d7a7af43230960090e2d85d1a1" exitCode=0 Oct 07 09:23:17 crc kubenswrapper[5025]: I1007 09:23:17.113491 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ftlxd" event={"ID":"8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea","Type":"ContainerDied","Data":"fd8c81f95718b3fb0fe6898dec5d19cdb2eb10d7a7af43230960090e2d85d1a1"} Oct 07 09:23:17 crc kubenswrapper[5025]: I1007 09:23:17.214321 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ftlxd" Oct 07 09:23:17 crc kubenswrapper[5025]: I1007 09:23:17.345304 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea-utilities\") pod \"8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea\" (UID: \"8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea\") " Oct 07 09:23:17 crc kubenswrapper[5025]: I1007 09:23:17.345408 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvvhh\" (UniqueName: \"kubernetes.io/projected/8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea-kube-api-access-hvvhh\") pod \"8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea\" (UID: \"8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea\") " Oct 07 09:23:17 crc kubenswrapper[5025]: I1007 09:23:17.345492 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea-catalog-content\") pod \"8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea\" (UID: \"8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea\") " Oct 07 09:23:17 crc kubenswrapper[5025]: I1007 09:23:17.346790 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea-utilities" 
(OuterVolumeSpecName: "utilities") pod "8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea" (UID: "8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 09:23:17 crc kubenswrapper[5025]: I1007 09:23:17.352981 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea-kube-api-access-hvvhh" (OuterVolumeSpecName: "kube-api-access-hvvhh") pod "8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea" (UID: "8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea"). InnerVolumeSpecName "kube-api-access-hvvhh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 09:23:17 crc kubenswrapper[5025]: I1007 09:23:17.421310 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea" (UID: "8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 09:23:17 crc kubenswrapper[5025]: I1007 09:23:17.447121 5025 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 09:23:17 crc kubenswrapper[5025]: I1007 09:23:17.447174 5025 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 09:23:17 crc kubenswrapper[5025]: I1007 09:23:17.447186 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvvhh\" (UniqueName: \"kubernetes.io/projected/8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea-kube-api-access-hvvhh\") on node \"crc\" DevicePath \"\"" Oct 07 09:23:18 crc kubenswrapper[5025]: E1007 09:23:18.054720 5025 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b5d11be_0ac1_4fa1_b74f_97bb3fd324ea.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b5d11be_0ac1_4fa1_b74f_97bb3fd324ea.slice/crio-20c52674907ee6285221673321d88b90199bac664c41f5108f3ec37cfd9c9f67\": RecentStats: unable to find data in memory cache]" Oct 07 09:23:18 crc kubenswrapper[5025]: I1007 09:23:18.129754 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ftlxd" event={"ID":"8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea","Type":"ContainerDied","Data":"20c52674907ee6285221673321d88b90199bac664c41f5108f3ec37cfd9c9f67"} Oct 07 09:23:18 crc kubenswrapper[5025]: I1007 09:23:18.129877 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ftlxd" Oct 07 09:23:18 crc kubenswrapper[5025]: I1007 09:23:18.130647 5025 scope.go:117] "RemoveContainer" containerID="fd8c81f95718b3fb0fe6898dec5d19cdb2eb10d7a7af43230960090e2d85d1a1" Oct 07 09:23:18 crc kubenswrapper[5025]: I1007 09:23:18.154283 5025 scope.go:117] "RemoveContainer" containerID="14486721f9ff1bd8195c2ab675cb39e24541cf9c555fea09a917d4977378616a" Oct 07 09:23:18 crc kubenswrapper[5025]: I1007 09:23:18.158985 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ftlxd"] Oct 07 09:23:18 crc kubenswrapper[5025]: I1007 09:23:18.174430 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ftlxd"] Oct 07 09:23:18 crc kubenswrapper[5025]: I1007 09:23:18.186481 5025 scope.go:117] "RemoveContainer" containerID="102f13f6affdfdc95a068f812dcc74f1e4545b346e6d6cbab95bbc5da8c9a877" Oct 07 09:23:19 crc kubenswrapper[5025]: I1007 09:23:19.925261 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea" path="/var/lib/kubelet/pods/8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea/volumes" Oct 07 09:23:28 crc kubenswrapper[5025]: I1007 09:23:28.914916 5025 scope.go:117] "RemoveContainer" containerID="e7ed16dc80623e565d948227f9c6f1bb967136f12f0c424382b8ded9301a6c2b" Oct 07 09:23:28 crc kubenswrapper[5025]: E1007 09:23:28.915769 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:23:39 crc kubenswrapper[5025]: I1007 09:23:39.915497 5025 scope.go:117] "RemoveContainer" 
containerID="e7ed16dc80623e565d948227f9c6f1bb967136f12f0c424382b8ded9301a6c2b" Oct 07 09:23:39 crc kubenswrapper[5025]: E1007 09:23:39.916765 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:23:50 crc kubenswrapper[5025]: I1007 09:23:50.914930 5025 scope.go:117] "RemoveContainer" containerID="e7ed16dc80623e565d948227f9c6f1bb967136f12f0c424382b8ded9301a6c2b" Oct 07 09:23:50 crc kubenswrapper[5025]: E1007 09:23:50.916129 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:24:04 crc kubenswrapper[5025]: I1007 09:24:04.915705 5025 scope.go:117] "RemoveContainer" containerID="e7ed16dc80623e565d948227f9c6f1bb967136f12f0c424382b8ded9301a6c2b" Oct 07 09:24:04 crc kubenswrapper[5025]: E1007 09:24:04.916858 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:24:19 crc kubenswrapper[5025]: I1007 09:24:19.914562 5025 scope.go:117] 
"RemoveContainer" containerID="e7ed16dc80623e565d948227f9c6f1bb967136f12f0c424382b8ded9301a6c2b" Oct 07 09:24:19 crc kubenswrapper[5025]: E1007 09:24:19.915470 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:24:31 crc kubenswrapper[5025]: I1007 09:24:31.915982 5025 scope.go:117] "RemoveContainer" containerID="e7ed16dc80623e565d948227f9c6f1bb967136f12f0c424382b8ded9301a6c2b" Oct 07 09:24:31 crc kubenswrapper[5025]: E1007 09:24:31.916911 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:24:42 crc kubenswrapper[5025]: I1007 09:24:42.494823 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-j5ksl"] Oct 07 09:24:42 crc kubenswrapper[5025]: E1007 09:24:42.496124 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea" containerName="extract-utilities" Oct 07 09:24:42 crc kubenswrapper[5025]: I1007 09:24:42.496145 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea" containerName="extract-utilities" Oct 07 09:24:42 crc kubenswrapper[5025]: E1007 09:24:42.496164 5025 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea" containerName="extract-content" Oct 07 09:24:42 crc kubenswrapper[5025]: I1007 09:24:42.496171 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea" containerName="extract-content" Oct 07 09:24:42 crc kubenswrapper[5025]: E1007 09:24:42.496184 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea" containerName="registry-server" Oct 07 09:24:42 crc kubenswrapper[5025]: I1007 09:24:42.496192 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea" containerName="registry-server" Oct 07 09:24:42 crc kubenswrapper[5025]: I1007 09:24:42.496431 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b5d11be-0ac1-4fa1-b74f-97bb3fd324ea" containerName="registry-server" Oct 07 09:24:42 crc kubenswrapper[5025]: I1007 09:24:42.497832 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j5ksl" Oct 07 09:24:42 crc kubenswrapper[5025]: I1007 09:24:42.512417 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j5ksl"] Oct 07 09:24:42 crc kubenswrapper[5025]: I1007 09:24:42.589148 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f68d13df-1e1b-49aa-bfc7-7dee911f4400-utilities\") pod \"redhat-marketplace-j5ksl\" (UID: \"f68d13df-1e1b-49aa-bfc7-7dee911f4400\") " pod="openshift-marketplace/redhat-marketplace-j5ksl" Oct 07 09:24:42 crc kubenswrapper[5025]: I1007 09:24:42.589219 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f68d13df-1e1b-49aa-bfc7-7dee911f4400-catalog-content\") pod \"redhat-marketplace-j5ksl\" (UID: \"f68d13df-1e1b-49aa-bfc7-7dee911f4400\") " 
pod="openshift-marketplace/redhat-marketplace-j5ksl" Oct 07 09:24:42 crc kubenswrapper[5025]: I1007 09:24:42.589277 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55mxq\" (UniqueName: \"kubernetes.io/projected/f68d13df-1e1b-49aa-bfc7-7dee911f4400-kube-api-access-55mxq\") pod \"redhat-marketplace-j5ksl\" (UID: \"f68d13df-1e1b-49aa-bfc7-7dee911f4400\") " pod="openshift-marketplace/redhat-marketplace-j5ksl" Oct 07 09:24:42 crc kubenswrapper[5025]: I1007 09:24:42.691896 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f68d13df-1e1b-49aa-bfc7-7dee911f4400-utilities\") pod \"redhat-marketplace-j5ksl\" (UID: \"f68d13df-1e1b-49aa-bfc7-7dee911f4400\") " pod="openshift-marketplace/redhat-marketplace-j5ksl" Oct 07 09:24:42 crc kubenswrapper[5025]: I1007 09:24:42.691999 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f68d13df-1e1b-49aa-bfc7-7dee911f4400-catalog-content\") pod \"redhat-marketplace-j5ksl\" (UID: \"f68d13df-1e1b-49aa-bfc7-7dee911f4400\") " pod="openshift-marketplace/redhat-marketplace-j5ksl" Oct 07 09:24:42 crc kubenswrapper[5025]: I1007 09:24:42.692073 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55mxq\" (UniqueName: \"kubernetes.io/projected/f68d13df-1e1b-49aa-bfc7-7dee911f4400-kube-api-access-55mxq\") pod \"redhat-marketplace-j5ksl\" (UID: \"f68d13df-1e1b-49aa-bfc7-7dee911f4400\") " pod="openshift-marketplace/redhat-marketplace-j5ksl" Oct 07 09:24:42 crc kubenswrapper[5025]: I1007 09:24:42.692806 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f68d13df-1e1b-49aa-bfc7-7dee911f4400-utilities\") pod \"redhat-marketplace-j5ksl\" (UID: \"f68d13df-1e1b-49aa-bfc7-7dee911f4400\") " 
pod="openshift-marketplace/redhat-marketplace-j5ksl" Oct 07 09:24:42 crc kubenswrapper[5025]: I1007 09:24:42.693184 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f68d13df-1e1b-49aa-bfc7-7dee911f4400-catalog-content\") pod \"redhat-marketplace-j5ksl\" (UID: \"f68d13df-1e1b-49aa-bfc7-7dee911f4400\") " pod="openshift-marketplace/redhat-marketplace-j5ksl" Oct 07 09:24:42 crc kubenswrapper[5025]: I1007 09:24:42.718194 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55mxq\" (UniqueName: \"kubernetes.io/projected/f68d13df-1e1b-49aa-bfc7-7dee911f4400-kube-api-access-55mxq\") pod \"redhat-marketplace-j5ksl\" (UID: \"f68d13df-1e1b-49aa-bfc7-7dee911f4400\") " pod="openshift-marketplace/redhat-marketplace-j5ksl" Oct 07 09:24:42 crc kubenswrapper[5025]: I1007 09:24:42.820215 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j5ksl" Oct 07 09:24:43 crc kubenswrapper[5025]: I1007 09:24:43.101589 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j5ksl"] Oct 07 09:24:43 crc kubenswrapper[5025]: I1007 09:24:43.899209 5025 generic.go:334] "Generic (PLEG): container finished" podID="f68d13df-1e1b-49aa-bfc7-7dee911f4400" containerID="21526e35e15189d8a9622343be0095c96e5ea58b62d7a2443dc286762354d68a" exitCode=0 Oct 07 09:24:43 crc kubenswrapper[5025]: I1007 09:24:43.899320 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j5ksl" event={"ID":"f68d13df-1e1b-49aa-bfc7-7dee911f4400","Type":"ContainerDied","Data":"21526e35e15189d8a9622343be0095c96e5ea58b62d7a2443dc286762354d68a"} Oct 07 09:24:43 crc kubenswrapper[5025]: I1007 09:24:43.899645 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j5ksl" 
event={"ID":"f68d13df-1e1b-49aa-bfc7-7dee911f4400","Type":"ContainerStarted","Data":"69294f279b111508f705bdd3919305d016413b6aacfc24b1abb8e6fcea9fbd5a"} Oct 07 09:24:43 crc kubenswrapper[5025]: I1007 09:24:43.915399 5025 scope.go:117] "RemoveContainer" containerID="e7ed16dc80623e565d948227f9c6f1bb967136f12f0c424382b8ded9301a6c2b" Oct 07 09:24:43 crc kubenswrapper[5025]: E1007 09:24:43.915699 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:24:45 crc kubenswrapper[5025]: I1007 09:24:45.929803 5025 generic.go:334] "Generic (PLEG): container finished" podID="f68d13df-1e1b-49aa-bfc7-7dee911f4400" containerID="d8892f9f72a9caa168202d6955ccbe3cc57c755006e99bcad3ffafa2e01ed8bb" exitCode=0 Oct 07 09:24:45 crc kubenswrapper[5025]: I1007 09:24:45.929921 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j5ksl" event={"ID":"f68d13df-1e1b-49aa-bfc7-7dee911f4400","Type":"ContainerDied","Data":"d8892f9f72a9caa168202d6955ccbe3cc57c755006e99bcad3ffafa2e01ed8bb"} Oct 07 09:24:46 crc kubenswrapper[5025]: I1007 09:24:46.941951 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j5ksl" event={"ID":"f68d13df-1e1b-49aa-bfc7-7dee911f4400","Type":"ContainerStarted","Data":"5da12d9017c6a9d4e093a5654991df5f835dfc5bc39e85f435d4f84070153177"} Oct 07 09:24:46 crc kubenswrapper[5025]: I1007 09:24:46.968261 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-j5ksl" podStartSLOduration=2.495559701 podStartE2EDuration="4.968232215s" 
podCreationTimestamp="2025-10-07 09:24:42 +0000 UTC" firstStartedPulling="2025-10-07 09:24:43.901417497 +0000 UTC m=+4090.710731641" lastFinishedPulling="2025-10-07 09:24:46.374090021 +0000 UTC m=+4093.183404155" observedRunningTime="2025-10-07 09:24:46.963591929 +0000 UTC m=+4093.772906073" watchObservedRunningTime="2025-10-07 09:24:46.968232215 +0000 UTC m=+4093.777546349" Oct 07 09:24:52 crc kubenswrapper[5025]: I1007 09:24:52.820791 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-j5ksl" Oct 07 09:24:52 crc kubenswrapper[5025]: I1007 09:24:52.821625 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-j5ksl" Oct 07 09:24:52 crc kubenswrapper[5025]: I1007 09:24:52.879029 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-j5ksl" Oct 07 09:24:53 crc kubenswrapper[5025]: I1007 09:24:53.038430 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-j5ksl" Oct 07 09:24:53 crc kubenswrapper[5025]: I1007 09:24:53.138354 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j5ksl"] Oct 07 09:24:55 crc kubenswrapper[5025]: I1007 09:24:55.009063 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-j5ksl" podUID="f68d13df-1e1b-49aa-bfc7-7dee911f4400" containerName="registry-server" containerID="cri-o://5da12d9017c6a9d4e093a5654991df5f835dfc5bc39e85f435d4f84070153177" gracePeriod=2 Oct 07 09:24:55 crc kubenswrapper[5025]: I1007 09:24:55.441980 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j5ksl" Oct 07 09:24:55 crc kubenswrapper[5025]: I1007 09:24:55.516134 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f68d13df-1e1b-49aa-bfc7-7dee911f4400-utilities\") pod \"f68d13df-1e1b-49aa-bfc7-7dee911f4400\" (UID: \"f68d13df-1e1b-49aa-bfc7-7dee911f4400\") " Oct 07 09:24:55 crc kubenswrapper[5025]: I1007 09:24:55.516231 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f68d13df-1e1b-49aa-bfc7-7dee911f4400-catalog-content\") pod \"f68d13df-1e1b-49aa-bfc7-7dee911f4400\" (UID: \"f68d13df-1e1b-49aa-bfc7-7dee911f4400\") " Oct 07 09:24:55 crc kubenswrapper[5025]: I1007 09:24:55.516280 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55mxq\" (UniqueName: \"kubernetes.io/projected/f68d13df-1e1b-49aa-bfc7-7dee911f4400-kube-api-access-55mxq\") pod \"f68d13df-1e1b-49aa-bfc7-7dee911f4400\" (UID: \"f68d13df-1e1b-49aa-bfc7-7dee911f4400\") " Oct 07 09:24:55 crc kubenswrapper[5025]: I1007 09:24:55.517488 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f68d13df-1e1b-49aa-bfc7-7dee911f4400-utilities" (OuterVolumeSpecName: "utilities") pod "f68d13df-1e1b-49aa-bfc7-7dee911f4400" (UID: "f68d13df-1e1b-49aa-bfc7-7dee911f4400"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 09:24:55 crc kubenswrapper[5025]: I1007 09:24:55.526093 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f68d13df-1e1b-49aa-bfc7-7dee911f4400-kube-api-access-55mxq" (OuterVolumeSpecName: "kube-api-access-55mxq") pod "f68d13df-1e1b-49aa-bfc7-7dee911f4400" (UID: "f68d13df-1e1b-49aa-bfc7-7dee911f4400"). InnerVolumeSpecName "kube-api-access-55mxq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 09:24:55 crc kubenswrapper[5025]: I1007 09:24:55.606519 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f68d13df-1e1b-49aa-bfc7-7dee911f4400-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f68d13df-1e1b-49aa-bfc7-7dee911f4400" (UID: "f68d13df-1e1b-49aa-bfc7-7dee911f4400"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 09:24:55 crc kubenswrapper[5025]: I1007 09:24:55.618353 5025 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f68d13df-1e1b-49aa-bfc7-7dee911f4400-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 09:24:55 crc kubenswrapper[5025]: I1007 09:24:55.618399 5025 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f68d13df-1e1b-49aa-bfc7-7dee911f4400-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 09:24:55 crc kubenswrapper[5025]: I1007 09:24:55.618411 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55mxq\" (UniqueName: \"kubernetes.io/projected/f68d13df-1e1b-49aa-bfc7-7dee911f4400-kube-api-access-55mxq\") on node \"crc\" DevicePath \"\"" Oct 07 09:24:56 crc kubenswrapper[5025]: I1007 09:24:56.038697 5025 generic.go:334] "Generic (PLEG): container finished" podID="f68d13df-1e1b-49aa-bfc7-7dee911f4400" containerID="5da12d9017c6a9d4e093a5654991df5f835dfc5bc39e85f435d4f84070153177" exitCode=0 Oct 07 09:24:56 crc kubenswrapper[5025]: I1007 09:24:56.038757 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j5ksl" event={"ID":"f68d13df-1e1b-49aa-bfc7-7dee911f4400","Type":"ContainerDied","Data":"5da12d9017c6a9d4e093a5654991df5f835dfc5bc39e85f435d4f84070153177"} Oct 07 09:24:56 crc kubenswrapper[5025]: I1007 09:24:56.039253 5025 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-j5ksl" event={"ID":"f68d13df-1e1b-49aa-bfc7-7dee911f4400","Type":"ContainerDied","Data":"69294f279b111508f705bdd3919305d016413b6aacfc24b1abb8e6fcea9fbd5a"} Oct 07 09:24:56 crc kubenswrapper[5025]: I1007 09:24:56.038814 5025 scope.go:117] "RemoveContainer" containerID="5da12d9017c6a9d4e093a5654991df5f835dfc5bc39e85f435d4f84070153177" Oct 07 09:24:56 crc kubenswrapper[5025]: I1007 09:24:56.039472 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j5ksl" Oct 07 09:24:56 crc kubenswrapper[5025]: I1007 09:24:56.081631 5025 scope.go:117] "RemoveContainer" containerID="d8892f9f72a9caa168202d6955ccbe3cc57c755006e99bcad3ffafa2e01ed8bb" Oct 07 09:24:56 crc kubenswrapper[5025]: I1007 09:24:56.087730 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j5ksl"] Oct 07 09:24:56 crc kubenswrapper[5025]: I1007 09:24:56.092723 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-j5ksl"] Oct 07 09:24:56 crc kubenswrapper[5025]: I1007 09:24:56.121817 5025 scope.go:117] "RemoveContainer" containerID="21526e35e15189d8a9622343be0095c96e5ea58b62d7a2443dc286762354d68a" Oct 07 09:24:56 crc kubenswrapper[5025]: I1007 09:24:56.141095 5025 scope.go:117] "RemoveContainer" containerID="5da12d9017c6a9d4e093a5654991df5f835dfc5bc39e85f435d4f84070153177" Oct 07 09:24:56 crc kubenswrapper[5025]: E1007 09:24:56.141760 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5da12d9017c6a9d4e093a5654991df5f835dfc5bc39e85f435d4f84070153177\": container with ID starting with 5da12d9017c6a9d4e093a5654991df5f835dfc5bc39e85f435d4f84070153177 not found: ID does not exist" containerID="5da12d9017c6a9d4e093a5654991df5f835dfc5bc39e85f435d4f84070153177" Oct 07 09:24:56 crc kubenswrapper[5025]: I1007 09:24:56.141826 5025 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5da12d9017c6a9d4e093a5654991df5f835dfc5bc39e85f435d4f84070153177"} err="failed to get container status \"5da12d9017c6a9d4e093a5654991df5f835dfc5bc39e85f435d4f84070153177\": rpc error: code = NotFound desc = could not find container \"5da12d9017c6a9d4e093a5654991df5f835dfc5bc39e85f435d4f84070153177\": container with ID starting with 5da12d9017c6a9d4e093a5654991df5f835dfc5bc39e85f435d4f84070153177 not found: ID does not exist" Oct 07 09:24:56 crc kubenswrapper[5025]: I1007 09:24:56.141865 5025 scope.go:117] "RemoveContainer" containerID="d8892f9f72a9caa168202d6955ccbe3cc57c755006e99bcad3ffafa2e01ed8bb" Oct 07 09:24:56 crc kubenswrapper[5025]: E1007 09:24:56.142262 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8892f9f72a9caa168202d6955ccbe3cc57c755006e99bcad3ffafa2e01ed8bb\": container with ID starting with d8892f9f72a9caa168202d6955ccbe3cc57c755006e99bcad3ffafa2e01ed8bb not found: ID does not exist" containerID="d8892f9f72a9caa168202d6955ccbe3cc57c755006e99bcad3ffafa2e01ed8bb" Oct 07 09:24:56 crc kubenswrapper[5025]: I1007 09:24:56.142306 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8892f9f72a9caa168202d6955ccbe3cc57c755006e99bcad3ffafa2e01ed8bb"} err="failed to get container status \"d8892f9f72a9caa168202d6955ccbe3cc57c755006e99bcad3ffafa2e01ed8bb\": rpc error: code = NotFound desc = could not find container \"d8892f9f72a9caa168202d6955ccbe3cc57c755006e99bcad3ffafa2e01ed8bb\": container with ID starting with d8892f9f72a9caa168202d6955ccbe3cc57c755006e99bcad3ffafa2e01ed8bb not found: ID does not exist" Oct 07 09:24:56 crc kubenswrapper[5025]: I1007 09:24:56.142356 5025 scope.go:117] "RemoveContainer" containerID="21526e35e15189d8a9622343be0095c96e5ea58b62d7a2443dc286762354d68a" Oct 07 09:24:56 crc kubenswrapper[5025]: E1007 
09:24:56.143235 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21526e35e15189d8a9622343be0095c96e5ea58b62d7a2443dc286762354d68a\": container with ID starting with 21526e35e15189d8a9622343be0095c96e5ea58b62d7a2443dc286762354d68a not found: ID does not exist" containerID="21526e35e15189d8a9622343be0095c96e5ea58b62d7a2443dc286762354d68a" Oct 07 09:24:56 crc kubenswrapper[5025]: I1007 09:24:56.143276 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21526e35e15189d8a9622343be0095c96e5ea58b62d7a2443dc286762354d68a"} err="failed to get container status \"21526e35e15189d8a9622343be0095c96e5ea58b62d7a2443dc286762354d68a\": rpc error: code = NotFound desc = could not find container \"21526e35e15189d8a9622343be0095c96e5ea58b62d7a2443dc286762354d68a\": container with ID starting with 21526e35e15189d8a9622343be0095c96e5ea58b62d7a2443dc286762354d68a not found: ID does not exist" Oct 07 09:24:57 crc kubenswrapper[5025]: I1007 09:24:57.915164 5025 scope.go:117] "RemoveContainer" containerID="e7ed16dc80623e565d948227f9c6f1bb967136f12f0c424382b8ded9301a6c2b" Oct 07 09:24:57 crc kubenswrapper[5025]: E1007 09:24:57.916002 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:24:57 crc kubenswrapper[5025]: I1007 09:24:57.930132 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f68d13df-1e1b-49aa-bfc7-7dee911f4400" path="/var/lib/kubelet/pods/f68d13df-1e1b-49aa-bfc7-7dee911f4400/volumes" Oct 07 09:25:12 crc kubenswrapper[5025]: I1007 09:25:12.915715 
5025 scope.go:117] "RemoveContainer" containerID="e7ed16dc80623e565d948227f9c6f1bb967136f12f0c424382b8ded9301a6c2b" Oct 07 09:25:12 crc kubenswrapper[5025]: E1007 09:25:12.917062 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:25:24 crc kubenswrapper[5025]: I1007 09:25:24.915265 5025 scope.go:117] "RemoveContainer" containerID="e7ed16dc80623e565d948227f9c6f1bb967136f12f0c424382b8ded9301a6c2b" Oct 07 09:25:24 crc kubenswrapper[5025]: E1007 09:25:24.916341 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:25:37 crc kubenswrapper[5025]: I1007 09:25:37.915307 5025 scope.go:117] "RemoveContainer" containerID="e7ed16dc80623e565d948227f9c6f1bb967136f12f0c424382b8ded9301a6c2b" Oct 07 09:25:38 crc kubenswrapper[5025]: I1007 09:25:38.413225 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" event={"ID":"b4849c41-22e1-400e-8e11-096da49ef1b2","Type":"ContainerStarted","Data":"39ee1466ed5d015cbd75b2c0cc144f31970050e10bccfa27cc9375a60cdb6572"} Oct 07 09:25:52 crc kubenswrapper[5025]: I1007 09:25:52.956118 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-58grv"] Oct 07 09:25:52 crc 
kubenswrapper[5025]: E1007 09:25:52.957606 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f68d13df-1e1b-49aa-bfc7-7dee911f4400" containerName="extract-content" Oct 07 09:25:52 crc kubenswrapper[5025]: I1007 09:25:52.957643 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="f68d13df-1e1b-49aa-bfc7-7dee911f4400" containerName="extract-content" Oct 07 09:25:52 crc kubenswrapper[5025]: E1007 09:25:52.957701 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f68d13df-1e1b-49aa-bfc7-7dee911f4400" containerName="registry-server" Oct 07 09:25:52 crc kubenswrapper[5025]: I1007 09:25:52.957718 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="f68d13df-1e1b-49aa-bfc7-7dee911f4400" containerName="registry-server" Oct 07 09:25:52 crc kubenswrapper[5025]: E1007 09:25:52.957748 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f68d13df-1e1b-49aa-bfc7-7dee911f4400" containerName="extract-utilities" Oct 07 09:25:52 crc kubenswrapper[5025]: I1007 09:25:52.957765 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="f68d13df-1e1b-49aa-bfc7-7dee911f4400" containerName="extract-utilities" Oct 07 09:25:52 crc kubenswrapper[5025]: I1007 09:25:52.958168 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="f68d13df-1e1b-49aa-bfc7-7dee911f4400" containerName="registry-server" Oct 07 09:25:52 crc kubenswrapper[5025]: I1007 09:25:52.960234 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-58grv" Oct 07 09:25:52 crc kubenswrapper[5025]: I1007 09:25:52.974533 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-58grv"] Oct 07 09:25:53 crc kubenswrapper[5025]: I1007 09:25:53.137887 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee133aee-5de1-4bf0-b3a9-0b76a11bf950-catalog-content\") pod \"redhat-operators-58grv\" (UID: \"ee133aee-5de1-4bf0-b3a9-0b76a11bf950\") " pod="openshift-marketplace/redhat-operators-58grv" Oct 07 09:25:53 crc kubenswrapper[5025]: I1007 09:25:53.138601 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee133aee-5de1-4bf0-b3a9-0b76a11bf950-utilities\") pod \"redhat-operators-58grv\" (UID: \"ee133aee-5de1-4bf0-b3a9-0b76a11bf950\") " pod="openshift-marketplace/redhat-operators-58grv" Oct 07 09:25:53 crc kubenswrapper[5025]: I1007 09:25:53.138708 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnmr2\" (UniqueName: \"kubernetes.io/projected/ee133aee-5de1-4bf0-b3a9-0b76a11bf950-kube-api-access-gnmr2\") pod \"redhat-operators-58grv\" (UID: \"ee133aee-5de1-4bf0-b3a9-0b76a11bf950\") " pod="openshift-marketplace/redhat-operators-58grv" Oct 07 09:25:53 crc kubenswrapper[5025]: I1007 09:25:53.240064 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnmr2\" (UniqueName: \"kubernetes.io/projected/ee133aee-5de1-4bf0-b3a9-0b76a11bf950-kube-api-access-gnmr2\") pod \"redhat-operators-58grv\" (UID: \"ee133aee-5de1-4bf0-b3a9-0b76a11bf950\") " pod="openshift-marketplace/redhat-operators-58grv" Oct 07 09:25:53 crc kubenswrapper[5025]: I1007 09:25:53.240176 5025 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee133aee-5de1-4bf0-b3a9-0b76a11bf950-catalog-content\") pod \"redhat-operators-58grv\" (UID: \"ee133aee-5de1-4bf0-b3a9-0b76a11bf950\") " pod="openshift-marketplace/redhat-operators-58grv" Oct 07 09:25:53 crc kubenswrapper[5025]: I1007 09:25:53.240237 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee133aee-5de1-4bf0-b3a9-0b76a11bf950-utilities\") pod \"redhat-operators-58grv\" (UID: \"ee133aee-5de1-4bf0-b3a9-0b76a11bf950\") " pod="openshift-marketplace/redhat-operators-58grv" Oct 07 09:25:53 crc kubenswrapper[5025]: I1007 09:25:53.240887 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee133aee-5de1-4bf0-b3a9-0b76a11bf950-catalog-content\") pod \"redhat-operators-58grv\" (UID: \"ee133aee-5de1-4bf0-b3a9-0b76a11bf950\") " pod="openshift-marketplace/redhat-operators-58grv" Oct 07 09:25:53 crc kubenswrapper[5025]: I1007 09:25:53.240919 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee133aee-5de1-4bf0-b3a9-0b76a11bf950-utilities\") pod \"redhat-operators-58grv\" (UID: \"ee133aee-5de1-4bf0-b3a9-0b76a11bf950\") " pod="openshift-marketplace/redhat-operators-58grv" Oct 07 09:25:53 crc kubenswrapper[5025]: I1007 09:25:53.261660 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnmr2\" (UniqueName: \"kubernetes.io/projected/ee133aee-5de1-4bf0-b3a9-0b76a11bf950-kube-api-access-gnmr2\") pod \"redhat-operators-58grv\" (UID: \"ee133aee-5de1-4bf0-b3a9-0b76a11bf950\") " pod="openshift-marketplace/redhat-operators-58grv" Oct 07 09:25:53 crc kubenswrapper[5025]: I1007 09:25:53.288036 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-58grv" Oct 07 09:25:53 crc kubenswrapper[5025]: I1007 09:25:53.867114 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-58grv"] Oct 07 09:25:54 crc kubenswrapper[5025]: I1007 09:25:54.549788 5025 generic.go:334] "Generic (PLEG): container finished" podID="ee133aee-5de1-4bf0-b3a9-0b76a11bf950" containerID="440ca0be0b36b897ec6d447c129076ffb5e949cee30140c144ecce6b36a7d8f9" exitCode=0 Oct 07 09:25:54 crc kubenswrapper[5025]: I1007 09:25:54.549856 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-58grv" event={"ID":"ee133aee-5de1-4bf0-b3a9-0b76a11bf950","Type":"ContainerDied","Data":"440ca0be0b36b897ec6d447c129076ffb5e949cee30140c144ecce6b36a7d8f9"} Oct 07 09:25:54 crc kubenswrapper[5025]: I1007 09:25:54.550118 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-58grv" event={"ID":"ee133aee-5de1-4bf0-b3a9-0b76a11bf950","Type":"ContainerStarted","Data":"30bd5635bf3115fc8bac3d942c4f98a201fd2a1c975bc3c1aea28844afe9dc5e"} Oct 07 09:25:55 crc kubenswrapper[5025]: I1007 09:25:55.560517 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-58grv" event={"ID":"ee133aee-5de1-4bf0-b3a9-0b76a11bf950","Type":"ContainerStarted","Data":"9f1b8ea957872432c6710ca903a9e673efebb11b5a92eac7e1413263e4703115"} Oct 07 09:25:56 crc kubenswrapper[5025]: I1007 09:25:56.569899 5025 generic.go:334] "Generic (PLEG): container finished" podID="ee133aee-5de1-4bf0-b3a9-0b76a11bf950" containerID="9f1b8ea957872432c6710ca903a9e673efebb11b5a92eac7e1413263e4703115" exitCode=0 Oct 07 09:25:56 crc kubenswrapper[5025]: I1007 09:25:56.569973 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-58grv" 
event={"ID":"ee133aee-5de1-4bf0-b3a9-0b76a11bf950","Type":"ContainerDied","Data":"9f1b8ea957872432c6710ca903a9e673efebb11b5a92eac7e1413263e4703115"} Oct 07 09:25:57 crc kubenswrapper[5025]: I1007 09:25:57.580502 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-58grv" event={"ID":"ee133aee-5de1-4bf0-b3a9-0b76a11bf950","Type":"ContainerStarted","Data":"0b5a8b620f26a72014058b0ff972845835b1d3b3c68bc701e3f655b317c9dbdb"} Oct 07 09:25:57 crc kubenswrapper[5025]: I1007 09:25:57.604107 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-58grv" podStartSLOduration=3.155343723 podStartE2EDuration="5.604083033s" podCreationTimestamp="2025-10-07 09:25:52 +0000 UTC" firstStartedPulling="2025-10-07 09:25:54.551787203 +0000 UTC m=+4161.361101347" lastFinishedPulling="2025-10-07 09:25:57.000526493 +0000 UTC m=+4163.809840657" observedRunningTime="2025-10-07 09:25:57.599733557 +0000 UTC m=+4164.409047721" watchObservedRunningTime="2025-10-07 09:25:57.604083033 +0000 UTC m=+4164.413397187" Oct 07 09:26:03 crc kubenswrapper[5025]: I1007 09:26:03.288915 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-58grv" Oct 07 09:26:03 crc kubenswrapper[5025]: I1007 09:26:03.289683 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-58grv" Oct 07 09:26:03 crc kubenswrapper[5025]: I1007 09:26:03.363089 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-58grv" Oct 07 09:26:03 crc kubenswrapper[5025]: I1007 09:26:03.682420 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-58grv" Oct 07 09:26:03 crc kubenswrapper[5025]: I1007 09:26:03.730439 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-58grv"] Oct 07 09:26:05 crc kubenswrapper[5025]: I1007 09:26:05.646891 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-58grv" podUID="ee133aee-5de1-4bf0-b3a9-0b76a11bf950" containerName="registry-server" containerID="cri-o://0b5a8b620f26a72014058b0ff972845835b1d3b3c68bc701e3f655b317c9dbdb" gracePeriod=2 Oct 07 09:26:06 crc kubenswrapper[5025]: I1007 09:26:06.037416 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-58grv" Oct 07 09:26:06 crc kubenswrapper[5025]: I1007 09:26:06.173484 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnmr2\" (UniqueName: \"kubernetes.io/projected/ee133aee-5de1-4bf0-b3a9-0b76a11bf950-kube-api-access-gnmr2\") pod \"ee133aee-5de1-4bf0-b3a9-0b76a11bf950\" (UID: \"ee133aee-5de1-4bf0-b3a9-0b76a11bf950\") " Oct 07 09:26:06 crc kubenswrapper[5025]: I1007 09:26:06.173640 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee133aee-5de1-4bf0-b3a9-0b76a11bf950-catalog-content\") pod \"ee133aee-5de1-4bf0-b3a9-0b76a11bf950\" (UID: \"ee133aee-5de1-4bf0-b3a9-0b76a11bf950\") " Oct 07 09:26:06 crc kubenswrapper[5025]: I1007 09:26:06.173698 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee133aee-5de1-4bf0-b3a9-0b76a11bf950-utilities\") pod \"ee133aee-5de1-4bf0-b3a9-0b76a11bf950\" (UID: \"ee133aee-5de1-4bf0-b3a9-0b76a11bf950\") " Oct 07 09:26:06 crc kubenswrapper[5025]: I1007 09:26:06.175390 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee133aee-5de1-4bf0-b3a9-0b76a11bf950-utilities" (OuterVolumeSpecName: "utilities") pod "ee133aee-5de1-4bf0-b3a9-0b76a11bf950" (UID: 
"ee133aee-5de1-4bf0-b3a9-0b76a11bf950"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 09:26:06 crc kubenswrapper[5025]: I1007 09:26:06.180222 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee133aee-5de1-4bf0-b3a9-0b76a11bf950-kube-api-access-gnmr2" (OuterVolumeSpecName: "kube-api-access-gnmr2") pod "ee133aee-5de1-4bf0-b3a9-0b76a11bf950" (UID: "ee133aee-5de1-4bf0-b3a9-0b76a11bf950"). InnerVolumeSpecName "kube-api-access-gnmr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 09:26:06 crc kubenswrapper[5025]: I1007 09:26:06.276183 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnmr2\" (UniqueName: \"kubernetes.io/projected/ee133aee-5de1-4bf0-b3a9-0b76a11bf950-kube-api-access-gnmr2\") on node \"crc\" DevicePath \"\"" Oct 07 09:26:06 crc kubenswrapper[5025]: I1007 09:26:06.276236 5025 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee133aee-5de1-4bf0-b3a9-0b76a11bf950-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 09:26:06 crc kubenswrapper[5025]: I1007 09:26:06.656906 5025 generic.go:334] "Generic (PLEG): container finished" podID="ee133aee-5de1-4bf0-b3a9-0b76a11bf950" containerID="0b5a8b620f26a72014058b0ff972845835b1d3b3c68bc701e3f655b317c9dbdb" exitCode=0 Oct 07 09:26:06 crc kubenswrapper[5025]: I1007 09:26:06.656957 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-58grv" event={"ID":"ee133aee-5de1-4bf0-b3a9-0b76a11bf950","Type":"ContainerDied","Data":"0b5a8b620f26a72014058b0ff972845835b1d3b3c68bc701e3f655b317c9dbdb"} Oct 07 09:26:06 crc kubenswrapper[5025]: I1007 09:26:06.656989 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-58grv" 
event={"ID":"ee133aee-5de1-4bf0-b3a9-0b76a11bf950","Type":"ContainerDied","Data":"30bd5635bf3115fc8bac3d942c4f98a201fd2a1c975bc3c1aea28844afe9dc5e"} Oct 07 09:26:06 crc kubenswrapper[5025]: I1007 09:26:06.657009 5025 scope.go:117] "RemoveContainer" containerID="0b5a8b620f26a72014058b0ff972845835b1d3b3c68bc701e3f655b317c9dbdb" Oct 07 09:26:06 crc kubenswrapper[5025]: I1007 09:26:06.657050 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-58grv" Oct 07 09:26:06 crc kubenswrapper[5025]: I1007 09:26:06.679991 5025 scope.go:117] "RemoveContainer" containerID="9f1b8ea957872432c6710ca903a9e673efebb11b5a92eac7e1413263e4703115" Oct 07 09:26:06 crc kubenswrapper[5025]: I1007 09:26:06.698997 5025 scope.go:117] "RemoveContainer" containerID="440ca0be0b36b897ec6d447c129076ffb5e949cee30140c144ecce6b36a7d8f9" Oct 07 09:26:06 crc kubenswrapper[5025]: I1007 09:26:06.736808 5025 scope.go:117] "RemoveContainer" containerID="0b5a8b620f26a72014058b0ff972845835b1d3b3c68bc701e3f655b317c9dbdb" Oct 07 09:26:06 crc kubenswrapper[5025]: E1007 09:26:06.737436 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b5a8b620f26a72014058b0ff972845835b1d3b3c68bc701e3f655b317c9dbdb\": container with ID starting with 0b5a8b620f26a72014058b0ff972845835b1d3b3c68bc701e3f655b317c9dbdb not found: ID does not exist" containerID="0b5a8b620f26a72014058b0ff972845835b1d3b3c68bc701e3f655b317c9dbdb" Oct 07 09:26:06 crc kubenswrapper[5025]: I1007 09:26:06.737469 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b5a8b620f26a72014058b0ff972845835b1d3b3c68bc701e3f655b317c9dbdb"} err="failed to get container status \"0b5a8b620f26a72014058b0ff972845835b1d3b3c68bc701e3f655b317c9dbdb\": rpc error: code = NotFound desc = could not find container \"0b5a8b620f26a72014058b0ff972845835b1d3b3c68bc701e3f655b317c9dbdb\": 
container with ID starting with 0b5a8b620f26a72014058b0ff972845835b1d3b3c68bc701e3f655b317c9dbdb not found: ID does not exist" Oct 07 09:26:06 crc kubenswrapper[5025]: I1007 09:26:06.737505 5025 scope.go:117] "RemoveContainer" containerID="9f1b8ea957872432c6710ca903a9e673efebb11b5a92eac7e1413263e4703115" Oct 07 09:26:06 crc kubenswrapper[5025]: E1007 09:26:06.737953 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f1b8ea957872432c6710ca903a9e673efebb11b5a92eac7e1413263e4703115\": container with ID starting with 9f1b8ea957872432c6710ca903a9e673efebb11b5a92eac7e1413263e4703115 not found: ID does not exist" containerID="9f1b8ea957872432c6710ca903a9e673efebb11b5a92eac7e1413263e4703115" Oct 07 09:26:06 crc kubenswrapper[5025]: I1007 09:26:06.737984 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f1b8ea957872432c6710ca903a9e673efebb11b5a92eac7e1413263e4703115"} err="failed to get container status \"9f1b8ea957872432c6710ca903a9e673efebb11b5a92eac7e1413263e4703115\": rpc error: code = NotFound desc = could not find container \"9f1b8ea957872432c6710ca903a9e673efebb11b5a92eac7e1413263e4703115\": container with ID starting with 9f1b8ea957872432c6710ca903a9e673efebb11b5a92eac7e1413263e4703115 not found: ID does not exist" Oct 07 09:26:06 crc kubenswrapper[5025]: I1007 09:26:06.738003 5025 scope.go:117] "RemoveContainer" containerID="440ca0be0b36b897ec6d447c129076ffb5e949cee30140c144ecce6b36a7d8f9" Oct 07 09:26:06 crc kubenswrapper[5025]: E1007 09:26:06.738394 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"440ca0be0b36b897ec6d447c129076ffb5e949cee30140c144ecce6b36a7d8f9\": container with ID starting with 440ca0be0b36b897ec6d447c129076ffb5e949cee30140c144ecce6b36a7d8f9 not found: ID does not exist" 
containerID="440ca0be0b36b897ec6d447c129076ffb5e949cee30140c144ecce6b36a7d8f9" Oct 07 09:26:06 crc kubenswrapper[5025]: I1007 09:26:06.738450 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"440ca0be0b36b897ec6d447c129076ffb5e949cee30140c144ecce6b36a7d8f9"} err="failed to get container status \"440ca0be0b36b897ec6d447c129076ffb5e949cee30140c144ecce6b36a7d8f9\": rpc error: code = NotFound desc = could not find container \"440ca0be0b36b897ec6d447c129076ffb5e949cee30140c144ecce6b36a7d8f9\": container with ID starting with 440ca0be0b36b897ec6d447c129076ffb5e949cee30140c144ecce6b36a7d8f9 not found: ID does not exist" Oct 07 09:26:07 crc kubenswrapper[5025]: I1007 09:26:07.671819 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee133aee-5de1-4bf0-b3a9-0b76a11bf950-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ee133aee-5de1-4bf0-b3a9-0b76a11bf950" (UID: "ee133aee-5de1-4bf0-b3a9-0b76a11bf950"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 09:26:07 crc kubenswrapper[5025]: I1007 09:26:07.699993 5025 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee133aee-5de1-4bf0-b3a9-0b76a11bf950-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 09:26:07 crc kubenswrapper[5025]: I1007 09:26:07.899138 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-58grv"] Oct 07 09:26:07 crc kubenswrapper[5025]: I1007 09:26:07.905759 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-58grv"] Oct 07 09:26:07 crc kubenswrapper[5025]: I1007 09:26:07.927822 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee133aee-5de1-4bf0-b3a9-0b76a11bf950" path="/var/lib/kubelet/pods/ee133aee-5de1-4bf0-b3a9-0b76a11bf950/volumes" Oct 07 09:27:55 crc kubenswrapper[5025]: I1007 09:27:55.934015 5025 patch_prober.go:28] interesting pod/machine-config-daemon-2dj2t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 09:27:55 crc kubenswrapper[5025]: I1007 09:27:55.934935 5025 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 09:28:25 crc kubenswrapper[5025]: I1007 09:28:25.933983 5025 patch_prober.go:28] interesting pod/machine-config-daemon-2dj2t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Oct 07 09:28:25 crc kubenswrapper[5025]: I1007 09:28:25.934857 5025 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 09:28:55 crc kubenswrapper[5025]: I1007 09:28:55.934832 5025 patch_prober.go:28] interesting pod/machine-config-daemon-2dj2t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 09:28:55 crc kubenswrapper[5025]: I1007 09:28:55.935762 5025 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 09:28:55 crc kubenswrapper[5025]: I1007 09:28:55.935832 5025 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" Oct 07 09:28:55 crc kubenswrapper[5025]: I1007 09:28:55.937250 5025 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"39ee1466ed5d015cbd75b2c0cc144f31970050e10bccfa27cc9375a60cdb6572"} pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 09:28:55 crc kubenswrapper[5025]: I1007 09:28:55.937373 5025 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" containerID="cri-o://39ee1466ed5d015cbd75b2c0cc144f31970050e10bccfa27cc9375a60cdb6572" gracePeriod=600 Oct 07 09:28:56 crc kubenswrapper[5025]: I1007 09:28:56.256591 5025 generic.go:334] "Generic (PLEG): container finished" podID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerID="39ee1466ed5d015cbd75b2c0cc144f31970050e10bccfa27cc9375a60cdb6572" exitCode=0 Oct 07 09:28:56 crc kubenswrapper[5025]: I1007 09:28:56.256710 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" event={"ID":"b4849c41-22e1-400e-8e11-096da49ef1b2","Type":"ContainerDied","Data":"39ee1466ed5d015cbd75b2c0cc144f31970050e10bccfa27cc9375a60cdb6572"} Oct 07 09:28:56 crc kubenswrapper[5025]: I1007 09:28:56.257391 5025 scope.go:117] "RemoveContainer" containerID="e7ed16dc80623e565d948227f9c6f1bb967136f12f0c424382b8ded9301a6c2b" Oct 07 09:28:57 crc kubenswrapper[5025]: I1007 09:28:57.271050 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" event={"ID":"b4849c41-22e1-400e-8e11-096da49ef1b2","Type":"ContainerStarted","Data":"652ea836bd6bc45776b557dc80abd1f3393e92490a2ac86faa4fa940a2468c8e"} Oct 07 09:29:48 crc kubenswrapper[5025]: I1007 09:29:48.514681 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jcjgq/must-gather-twh49"] Oct 07 09:29:48 crc kubenswrapper[5025]: E1007 09:29:48.515807 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee133aee-5de1-4bf0-b3a9-0b76a11bf950" containerName="extract-utilities" Oct 07 09:29:48 crc kubenswrapper[5025]: I1007 09:29:48.515826 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee133aee-5de1-4bf0-b3a9-0b76a11bf950" containerName="extract-utilities" Oct 07 09:29:48 crc kubenswrapper[5025]: E1007 09:29:48.515863 5025 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee133aee-5de1-4bf0-b3a9-0b76a11bf950" containerName="extract-content" Oct 07 09:29:48 crc kubenswrapper[5025]: I1007 09:29:48.515870 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee133aee-5de1-4bf0-b3a9-0b76a11bf950" containerName="extract-content" Oct 07 09:29:48 crc kubenswrapper[5025]: E1007 09:29:48.515882 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee133aee-5de1-4bf0-b3a9-0b76a11bf950" containerName="registry-server" Oct 07 09:29:48 crc kubenswrapper[5025]: I1007 09:29:48.515889 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee133aee-5de1-4bf0-b3a9-0b76a11bf950" containerName="registry-server" Oct 07 09:29:48 crc kubenswrapper[5025]: I1007 09:29:48.516052 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee133aee-5de1-4bf0-b3a9-0b76a11bf950" containerName="registry-server" Oct 07 09:29:48 crc kubenswrapper[5025]: I1007 09:29:48.516890 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jcjgq/must-gather-twh49" Oct 07 09:29:48 crc kubenswrapper[5025]: I1007 09:29:48.523617 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-jcjgq"/"default-dockercfg-jp2k8" Oct 07 09:29:48 crc kubenswrapper[5025]: I1007 09:29:48.524029 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-jcjgq"/"kube-root-ca.crt" Oct 07 09:29:48 crc kubenswrapper[5025]: I1007 09:29:48.524227 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-jcjgq"/"openshift-service-ca.crt" Oct 07 09:29:48 crc kubenswrapper[5025]: I1007 09:29:48.530973 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jcjgq/must-gather-twh49"] Oct 07 09:29:48 crc kubenswrapper[5025]: I1007 09:29:48.668008 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6fd7c059-3b7a-4cdd-9651-9efb4bc5ce61-must-gather-output\") pod \"must-gather-twh49\" (UID: \"6fd7c059-3b7a-4cdd-9651-9efb4bc5ce61\") " pod="openshift-must-gather-jcjgq/must-gather-twh49" Oct 07 09:29:48 crc kubenswrapper[5025]: I1007 09:29:48.668503 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwww9\" (UniqueName: \"kubernetes.io/projected/6fd7c059-3b7a-4cdd-9651-9efb4bc5ce61-kube-api-access-rwww9\") pod \"must-gather-twh49\" (UID: \"6fd7c059-3b7a-4cdd-9651-9efb4bc5ce61\") " pod="openshift-must-gather-jcjgq/must-gather-twh49" Oct 07 09:29:48 crc kubenswrapper[5025]: I1007 09:29:48.772852 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6fd7c059-3b7a-4cdd-9651-9efb4bc5ce61-must-gather-output\") pod \"must-gather-twh49\" (UID: \"6fd7c059-3b7a-4cdd-9651-9efb4bc5ce61\") " 
pod="openshift-must-gather-jcjgq/must-gather-twh49" Oct 07 09:29:48 crc kubenswrapper[5025]: I1007 09:29:48.772944 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwww9\" (UniqueName: \"kubernetes.io/projected/6fd7c059-3b7a-4cdd-9651-9efb4bc5ce61-kube-api-access-rwww9\") pod \"must-gather-twh49\" (UID: \"6fd7c059-3b7a-4cdd-9651-9efb4bc5ce61\") " pod="openshift-must-gather-jcjgq/must-gather-twh49" Oct 07 09:29:48 crc kubenswrapper[5025]: I1007 09:29:48.773850 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6fd7c059-3b7a-4cdd-9651-9efb4bc5ce61-must-gather-output\") pod \"must-gather-twh49\" (UID: \"6fd7c059-3b7a-4cdd-9651-9efb4bc5ce61\") " pod="openshift-must-gather-jcjgq/must-gather-twh49" Oct 07 09:29:48 crc kubenswrapper[5025]: I1007 09:29:48.796566 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwww9\" (UniqueName: \"kubernetes.io/projected/6fd7c059-3b7a-4cdd-9651-9efb4bc5ce61-kube-api-access-rwww9\") pod \"must-gather-twh49\" (UID: \"6fd7c059-3b7a-4cdd-9651-9efb4bc5ce61\") " pod="openshift-must-gather-jcjgq/must-gather-twh49" Oct 07 09:29:48 crc kubenswrapper[5025]: I1007 09:29:48.840153 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jcjgq/must-gather-twh49" Oct 07 09:29:49 crc kubenswrapper[5025]: I1007 09:29:49.325771 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jcjgq/must-gather-twh49"] Oct 07 09:29:49 crc kubenswrapper[5025]: I1007 09:29:49.336885 5025 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 09:29:49 crc kubenswrapper[5025]: I1007 09:29:49.782904 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jcjgq/must-gather-twh49" event={"ID":"6fd7c059-3b7a-4cdd-9651-9efb4bc5ce61","Type":"ContainerStarted","Data":"e3f4bbc16c9f6acf2c857497585d6b4afcce9552ddba22e5813db91fe192e64f"} Oct 07 09:29:54 crc kubenswrapper[5025]: I1007 09:29:54.829848 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jcjgq/must-gather-twh49" event={"ID":"6fd7c059-3b7a-4cdd-9651-9efb4bc5ce61","Type":"ContainerStarted","Data":"c36a85c161cfca644eafb62528c3e9e3f49bee8e73e307165f009f410a69ed80"} Oct 07 09:29:54 crc kubenswrapper[5025]: I1007 09:29:54.830729 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jcjgq/must-gather-twh49" event={"ID":"6fd7c059-3b7a-4cdd-9651-9efb4bc5ce61","Type":"ContainerStarted","Data":"5509f4e6e7968d21bbc04b8b22697ac5b524de0c990f21a18662fc5dfcdfc919"} Oct 07 09:29:54 crc kubenswrapper[5025]: I1007 09:29:54.862713 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jcjgq/must-gather-twh49" podStartSLOduration=2.645309444 podStartE2EDuration="6.862677529s" podCreationTimestamp="2025-10-07 09:29:48 +0000 UTC" firstStartedPulling="2025-10-07 09:29:49.336064413 +0000 UTC m=+4396.145378567" lastFinishedPulling="2025-10-07 09:29:53.553432508 +0000 UTC m=+4400.362746652" observedRunningTime="2025-10-07 09:29:54.854990127 +0000 UTC m=+4401.664304281" watchObservedRunningTime="2025-10-07 09:29:54.862677529 +0000 UTC 
m=+4401.671991693" Oct 07 09:30:00 crc kubenswrapper[5025]: I1007 09:30:00.159569 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330490-xw85w"] Oct 07 09:30:00 crc kubenswrapper[5025]: I1007 09:30:00.161741 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330490-xw85w" Oct 07 09:30:00 crc kubenswrapper[5025]: I1007 09:30:00.163896 5025 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 07 09:30:00 crc kubenswrapper[5025]: I1007 09:30:00.165921 5025 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 07 09:30:00 crc kubenswrapper[5025]: I1007 09:30:00.184653 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330490-xw85w"] Oct 07 09:30:00 crc kubenswrapper[5025]: I1007 09:30:00.281301 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ac522518-8445-44fc-837f-c5a9def13a7d-config-volume\") pod \"collect-profiles-29330490-xw85w\" (UID: \"ac522518-8445-44fc-837f-c5a9def13a7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330490-xw85w" Oct 07 09:30:00 crc kubenswrapper[5025]: I1007 09:30:00.282139 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ac522518-8445-44fc-837f-c5a9def13a7d-secret-volume\") pod \"collect-profiles-29330490-xw85w\" (UID: \"ac522518-8445-44fc-837f-c5a9def13a7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330490-xw85w" Oct 07 09:30:00 crc kubenswrapper[5025]: I1007 09:30:00.282322 5025 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drkwc\" (UniqueName: \"kubernetes.io/projected/ac522518-8445-44fc-837f-c5a9def13a7d-kube-api-access-drkwc\") pod \"collect-profiles-29330490-xw85w\" (UID: \"ac522518-8445-44fc-837f-c5a9def13a7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330490-xw85w" Oct 07 09:30:00 crc kubenswrapper[5025]: I1007 09:30:00.384116 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ac522518-8445-44fc-837f-c5a9def13a7d-config-volume\") pod \"collect-profiles-29330490-xw85w\" (UID: \"ac522518-8445-44fc-837f-c5a9def13a7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330490-xw85w" Oct 07 09:30:00 crc kubenswrapper[5025]: I1007 09:30:00.384177 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ac522518-8445-44fc-837f-c5a9def13a7d-secret-volume\") pod \"collect-profiles-29330490-xw85w\" (UID: \"ac522518-8445-44fc-837f-c5a9def13a7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330490-xw85w" Oct 07 09:30:00 crc kubenswrapper[5025]: I1007 09:30:00.384220 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drkwc\" (UniqueName: \"kubernetes.io/projected/ac522518-8445-44fc-837f-c5a9def13a7d-kube-api-access-drkwc\") pod \"collect-profiles-29330490-xw85w\" (UID: \"ac522518-8445-44fc-837f-c5a9def13a7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330490-xw85w" Oct 07 09:30:00 crc kubenswrapper[5025]: I1007 09:30:00.386223 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ac522518-8445-44fc-837f-c5a9def13a7d-config-volume\") pod \"collect-profiles-29330490-xw85w\" (UID: \"ac522518-8445-44fc-837f-c5a9def13a7d\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29330490-xw85w" Oct 07 09:30:00 crc kubenswrapper[5025]: I1007 09:30:00.395391 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ac522518-8445-44fc-837f-c5a9def13a7d-secret-volume\") pod \"collect-profiles-29330490-xw85w\" (UID: \"ac522518-8445-44fc-837f-c5a9def13a7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330490-xw85w" Oct 07 09:30:00 crc kubenswrapper[5025]: I1007 09:30:00.408303 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drkwc\" (UniqueName: \"kubernetes.io/projected/ac522518-8445-44fc-837f-c5a9def13a7d-kube-api-access-drkwc\") pod \"collect-profiles-29330490-xw85w\" (UID: \"ac522518-8445-44fc-837f-c5a9def13a7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330490-xw85w" Oct 07 09:30:00 crc kubenswrapper[5025]: I1007 09:30:00.492908 5025 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330490-xw85w" Oct 07 09:30:01 crc kubenswrapper[5025]: I1007 09:30:01.020354 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330490-xw85w"] Oct 07 09:30:01 crc kubenswrapper[5025]: I1007 09:30:01.889139 5025 generic.go:334] "Generic (PLEG): container finished" podID="ac522518-8445-44fc-837f-c5a9def13a7d" containerID="7d4d214c4879ff8eaa47f03ee8ccc0c00a1d167440aaad344ef4321f31196d69" exitCode=0 Oct 07 09:30:01 crc kubenswrapper[5025]: I1007 09:30:01.889663 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330490-xw85w" event={"ID":"ac522518-8445-44fc-837f-c5a9def13a7d","Type":"ContainerDied","Data":"7d4d214c4879ff8eaa47f03ee8ccc0c00a1d167440aaad344ef4321f31196d69"} Oct 07 09:30:01 crc kubenswrapper[5025]: I1007 09:30:01.889709 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330490-xw85w" event={"ID":"ac522518-8445-44fc-837f-c5a9def13a7d","Type":"ContainerStarted","Data":"f856dbb21baa308f63540444c5ee81b4a85814c431919e77d88c66e2a7c58b7b"} Oct 07 09:30:03 crc kubenswrapper[5025]: I1007 09:30:03.237135 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330490-xw85w" Oct 07 09:30:03 crc kubenswrapper[5025]: I1007 09:30:03.347505 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ac522518-8445-44fc-837f-c5a9def13a7d-secret-volume\") pod \"ac522518-8445-44fc-837f-c5a9def13a7d\" (UID: \"ac522518-8445-44fc-837f-c5a9def13a7d\") " Oct 07 09:30:03 crc kubenswrapper[5025]: I1007 09:30:03.347731 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drkwc\" (UniqueName: \"kubernetes.io/projected/ac522518-8445-44fc-837f-c5a9def13a7d-kube-api-access-drkwc\") pod \"ac522518-8445-44fc-837f-c5a9def13a7d\" (UID: \"ac522518-8445-44fc-837f-c5a9def13a7d\") " Oct 07 09:30:03 crc kubenswrapper[5025]: I1007 09:30:03.347803 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ac522518-8445-44fc-837f-c5a9def13a7d-config-volume\") pod \"ac522518-8445-44fc-837f-c5a9def13a7d\" (UID: \"ac522518-8445-44fc-837f-c5a9def13a7d\") " Oct 07 09:30:03 crc kubenswrapper[5025]: I1007 09:30:03.348889 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac522518-8445-44fc-837f-c5a9def13a7d-config-volume" (OuterVolumeSpecName: "config-volume") pod "ac522518-8445-44fc-837f-c5a9def13a7d" (UID: "ac522518-8445-44fc-837f-c5a9def13a7d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 09:30:03 crc kubenswrapper[5025]: I1007 09:30:03.360815 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac522518-8445-44fc-837f-c5a9def13a7d-kube-api-access-drkwc" (OuterVolumeSpecName: "kube-api-access-drkwc") pod "ac522518-8445-44fc-837f-c5a9def13a7d" (UID: "ac522518-8445-44fc-837f-c5a9def13a7d"). 
InnerVolumeSpecName "kube-api-access-drkwc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 09:30:03 crc kubenswrapper[5025]: I1007 09:30:03.360868 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac522518-8445-44fc-837f-c5a9def13a7d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ac522518-8445-44fc-837f-c5a9def13a7d" (UID: "ac522518-8445-44fc-837f-c5a9def13a7d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 09:30:03 crc kubenswrapper[5025]: I1007 09:30:03.449862 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drkwc\" (UniqueName: \"kubernetes.io/projected/ac522518-8445-44fc-837f-c5a9def13a7d-kube-api-access-drkwc\") on node \"crc\" DevicePath \"\"" Oct 07 09:30:03 crc kubenswrapper[5025]: I1007 09:30:03.449909 5025 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ac522518-8445-44fc-837f-c5a9def13a7d-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 09:30:03 crc kubenswrapper[5025]: I1007 09:30:03.449922 5025 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ac522518-8445-44fc-837f-c5a9def13a7d-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 07 09:30:03 crc kubenswrapper[5025]: I1007 09:30:03.908752 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330490-xw85w" event={"ID":"ac522518-8445-44fc-837f-c5a9def13a7d","Type":"ContainerDied","Data":"f856dbb21baa308f63540444c5ee81b4a85814c431919e77d88c66e2a7c58b7b"} Oct 07 09:30:03 crc kubenswrapper[5025]: I1007 09:30:03.908808 5025 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f856dbb21baa308f63540444c5ee81b4a85814c431919e77d88c66e2a7c58b7b" Oct 07 09:30:03 crc kubenswrapper[5025]: I1007 09:30:03.908878 5025 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330490-xw85w" Oct 07 09:30:04 crc kubenswrapper[5025]: I1007 09:30:04.329601 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330445-mdrrk"] Oct 07 09:30:04 crc kubenswrapper[5025]: I1007 09:30:04.337903 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330445-mdrrk"] Oct 07 09:30:05 crc kubenswrapper[5025]: I1007 09:30:05.924860 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1583866-07df-4b61-a0ce-4c1e8a22a9d2" path="/var/lib/kubelet/pods/a1583866-07df-4b61-a0ce-4c1e8a22a9d2/volumes" Oct 07 09:30:39 crc kubenswrapper[5025]: I1007 09:30:39.650163 5025 scope.go:117] "RemoveContainer" containerID="83a9f01c523837701666b1ae300eb68228bb9f28f77925bcd47f369a31461bc0" Oct 07 09:30:51 crc kubenswrapper[5025]: I1007 09:30:51.450843 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-58c4cd55f4-p2fpt_988dc4fe-2f1a-481a-9954-3578f833387e/manager/0.log" Oct 07 09:30:51 crc kubenswrapper[5025]: I1007 09:30:51.481138 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-58c4cd55f4-p2fpt_988dc4fe-2f1a-481a-9954-3578f833387e/kube-rbac-proxy/0.log" Oct 07 09:30:51 crc kubenswrapper[5025]: I1007 09:30:51.672129 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7d4d4f8d-7tfgz_c956fa54-5b9c-4e25-ba8d-8e0b26edc9f1/kube-rbac-proxy/0.log" Oct 07 09:30:51 crc kubenswrapper[5025]: I1007 09:30:51.707457 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7d4d4f8d-7tfgz_c956fa54-5b9c-4e25-ba8d-8e0b26edc9f1/manager/0.log" Oct 07 09:30:51 crc 
kubenswrapper[5025]: I1007 09:30:51.884605 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dcef7349673c2fba947c5845438ccfb6153624e8a5d39e7e1718f94114b6vs7_95c596b1-3c21-4e10-a7ee-c1b6c9220ddf/util/0.log" Oct 07 09:30:52 crc kubenswrapper[5025]: I1007 09:30:52.036008 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dcef7349673c2fba947c5845438ccfb6153624e8a5d39e7e1718f94114b6vs7_95c596b1-3c21-4e10-a7ee-c1b6c9220ddf/util/0.log" Oct 07 09:30:52 crc kubenswrapper[5025]: I1007 09:30:52.101402 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dcef7349673c2fba947c5845438ccfb6153624e8a5d39e7e1718f94114b6vs7_95c596b1-3c21-4e10-a7ee-c1b6c9220ddf/pull/0.log" Oct 07 09:30:52 crc kubenswrapper[5025]: I1007 09:30:52.144483 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dcef7349673c2fba947c5845438ccfb6153624e8a5d39e7e1718f94114b6vs7_95c596b1-3c21-4e10-a7ee-c1b6c9220ddf/pull/0.log" Oct 07 09:30:52 crc kubenswrapper[5025]: I1007 09:30:52.300973 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dcef7349673c2fba947c5845438ccfb6153624e8a5d39e7e1718f94114b6vs7_95c596b1-3c21-4e10-a7ee-c1b6c9220ddf/pull/0.log" Oct 07 09:30:52 crc kubenswrapper[5025]: I1007 09:30:52.348265 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dcef7349673c2fba947c5845438ccfb6153624e8a5d39e7e1718f94114b6vs7_95c596b1-3c21-4e10-a7ee-c1b6c9220ddf/util/0.log" Oct 07 09:30:52 crc kubenswrapper[5025]: I1007 09:30:52.350992 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dcef7349673c2fba947c5845438ccfb6153624e8a5d39e7e1718f94114b6vs7_95c596b1-3c21-4e10-a7ee-c1b6c9220ddf/extract/0.log" Oct 07 09:30:52 crc kubenswrapper[5025]: I1007 09:30:52.497949 5025 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-75k2s_ffa40450-8658-4d21-b4b1-1174c69e989f/kube-rbac-proxy/0.log" Oct 07 09:30:52 crc kubenswrapper[5025]: I1007 09:30:52.536051 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-75k2s_ffa40450-8658-4d21-b4b1-1174c69e989f/manager/0.log" Oct 07 09:30:52 crc kubenswrapper[5025]: I1007 09:30:52.619776 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5dc44df7d5-pfd8r_f22649b0-7cea-4fb0-bc66-d1708cfa5630/kube-rbac-proxy/0.log" Oct 07 09:30:52 crc kubenswrapper[5025]: I1007 09:30:52.785432 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5dc44df7d5-pfd8r_f22649b0-7cea-4fb0-bc66-d1708cfa5630/manager/0.log" Oct 07 09:30:52 crc kubenswrapper[5025]: I1007 09:30:52.868402 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-54b4974c45-hlvw6_6dc83248-702c-40eb-92ce-99f686ea1bfc/manager/0.log" Oct 07 09:30:52 crc kubenswrapper[5025]: I1007 09:30:52.876838 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-54b4974c45-hlvw6_6dc83248-702c-40eb-92ce-99f686ea1bfc/kube-rbac-proxy/0.log" Oct 07 09:30:53 crc kubenswrapper[5025]: I1007 09:30:53.036880 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-76d5b87f47-crcmj_d9b1f3d1-a00f-45df-ae19-728a8716aaa3/kube-rbac-proxy/0.log" Oct 07 09:30:53 crc kubenswrapper[5025]: I1007 09:30:53.070997 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-76d5b87f47-crcmj_d9b1f3d1-a00f-45df-ae19-728a8716aaa3/manager/0.log" Oct 07 09:30:53 crc kubenswrapper[5025]: I1007 09:30:53.487571 
5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-649675d675-t9bsq_46dec033-4c2a-4fd6-87fb-a877d35e258d/kube-rbac-proxy/0.log" Oct 07 09:30:53 crc kubenswrapper[5025]: I1007 09:30:53.507517 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-658588b8c9-66frw_d708eb23-1cca-4c1c-a1e5-7a68efa23a59/kube-rbac-proxy/0.log" Oct 07 09:30:53 crc kubenswrapper[5025]: I1007 09:30:53.549038 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-658588b8c9-66frw_d708eb23-1cca-4c1c-a1e5-7a68efa23a59/manager/0.log" Oct 07 09:30:53 crc kubenswrapper[5025]: I1007 09:30:53.705240 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-649675d675-t9bsq_46dec033-4c2a-4fd6-87fb-a877d35e258d/manager/0.log" Oct 07 09:30:53 crc kubenswrapper[5025]: I1007 09:30:53.762001 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7b5ccf6d9c-2tlcp_0dc6bebc-04e7-4d9f-bf07-007411e61c71/kube-rbac-proxy/0.log" Oct 07 09:30:53 crc kubenswrapper[5025]: I1007 09:30:53.817174 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7b5ccf6d9c-2tlcp_0dc6bebc-04e7-4d9f-bf07-007411e61c71/manager/0.log" Oct 07 09:30:54 crc kubenswrapper[5025]: I1007 09:30:54.279478 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-65d89cfd9f-9ch9p_ded88cfb-86e3-4bcf-875c-285b6b34776b/manager/0.log" Oct 07 09:30:54 crc kubenswrapper[5025]: I1007 09:30:54.282263 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-65d89cfd9f-9ch9p_ded88cfb-86e3-4bcf-875c-285b6b34776b/kube-rbac-proxy/0.log" Oct 07 09:30:54 crc 
kubenswrapper[5025]: I1007 09:30:54.497635 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6cd6d7bdf5-92n2s_254bc245-b889-4cb4-a787-a49298e93315/manager/0.log" Oct 07 09:30:54 crc kubenswrapper[5025]: I1007 09:30:54.502819 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6cd6d7bdf5-92n2s_254bc245-b889-4cb4-a787-a49298e93315/kube-rbac-proxy/0.log" Oct 07 09:30:54 crc kubenswrapper[5025]: I1007 09:30:54.563473 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-8d984cc4d-94knm_a0a63794-3c14-4704-aca7-7f259b6e9292/kube-rbac-proxy/0.log" Oct 07 09:30:54 crc kubenswrapper[5025]: I1007 09:30:54.635186 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-8d984cc4d-94knm_a0a63794-3c14-4704-aca7-7f259b6e9292/manager/0.log" Oct 07 09:30:54 crc kubenswrapper[5025]: I1007 09:30:54.685786 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7c7fc454ff-jdb47_ecc025b9-0996-46f6-9ea5-024219d094b0/kube-rbac-proxy/0.log" Oct 07 09:30:54 crc kubenswrapper[5025]: I1007 09:30:54.822413 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7c7fc454ff-jdb47_ecc025b9-0996-46f6-9ea5-024219d094b0/manager/0.log" Oct 07 09:30:54 crc kubenswrapper[5025]: I1007 09:30:54.841483 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7468f855d8-mwl2b_43b5fc1d-13b9-4945-a93d-4c55036c69ca/kube-rbac-proxy/0.log" Oct 07 09:30:54 crc kubenswrapper[5025]: I1007 09:30:54.879790 5025 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7468f855d8-mwl2b_43b5fc1d-13b9-4945-a93d-4c55036c69ca/manager/0.log" Oct 07 09:30:54 crc kubenswrapper[5025]: I1007 09:30:54.992003 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5dfbbd665cbtkt2_d0493142-dcb7-4291-89e2-857772df4f54/kube-rbac-proxy/0.log" Oct 07 09:30:55 crc kubenswrapper[5025]: I1007 09:30:55.019985 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5dfbbd665cbtkt2_d0493142-dcb7-4291-89e2-857772df4f54/manager/0.log" Oct 07 09:30:55 crc kubenswrapper[5025]: I1007 09:30:55.121635 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-8b6c49794-5wfpl_c06c5dfe-d9b0-4d21-a132-79ca285655c6/kube-rbac-proxy/0.log" Oct 07 09:30:55 crc kubenswrapper[5025]: I1007 09:30:55.265368 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-7db4b69559-4wdwm_ca0d8394-2e0b-438e-b6d9-700a52a6f339/kube-rbac-proxy/0.log" Oct 07 09:30:55 crc kubenswrapper[5025]: I1007 09:30:55.500111 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-7db4b69559-4wdwm_ca0d8394-2e0b-438e-b6d9-700a52a6f339/operator/0.log" Oct 07 09:30:55 crc kubenswrapper[5025]: I1007 09:30:55.842083 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-4p5nd_882d12c5-6ce7-4f45-8703-48feed573896/registry-server/0.log" Oct 07 09:30:55 crc kubenswrapper[5025]: I1007 09:30:55.906509 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-6d8b6f9b9-rvd8k_6eab8249-8174-4cfd-ab17-de2ed309f0e5/kube-rbac-proxy/0.log" Oct 07 09:30:55 crc kubenswrapper[5025]: I1007 
09:30:55.969704 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-6d8b6f9b9-rvd8k_6eab8249-8174-4cfd-ab17-de2ed309f0e5/manager/0.log" Oct 07 09:30:56 crc kubenswrapper[5025]: I1007 09:30:56.008045 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-8b6c49794-5wfpl_c06c5dfe-d9b0-4d21-a132-79ca285655c6/manager/0.log" Oct 07 09:30:56 crc kubenswrapper[5025]: I1007 09:30:56.137442 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-54689d9f88-7dj22_c423925e-5372-4cf2-a8ce-5d864fde501e/manager/0.log" Oct 07 09:30:56 crc kubenswrapper[5025]: I1007 09:30:56.141290 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-54689d9f88-7dj22_c423925e-5372-4cf2-a8ce-5d864fde501e/kube-rbac-proxy/0.log" Oct 07 09:30:56 crc kubenswrapper[5025]: I1007 09:30:56.248680 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-b4lx7_156efee0-2e44-4494-9c13-baef0c5e45b8/operator/0.log" Oct 07 09:30:56 crc kubenswrapper[5025]: I1007 09:30:56.365730 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-hjkhm_552d8865-34b1-41e2-b755-becdee67efef/kube-rbac-proxy/0.log" Oct 07 09:30:56 crc kubenswrapper[5025]: I1007 09:30:56.426640 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-hjkhm_552d8865-34b1-41e2-b755-becdee67efef/manager/0.log" Oct 07 09:30:56 crc kubenswrapper[5025]: I1007 09:30:56.503893 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5d4d74dd89-2jc2q_9e7ad814-5207-4f51-b499-537c23a9d8b2/kube-rbac-proxy/0.log" Oct 07 
09:30:56 crc kubenswrapper[5025]: I1007 09:30:56.646853 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5d4d74dd89-2jc2q_9e7ad814-5207-4f51-b499-537c23a9d8b2/manager/0.log" Oct 07 09:30:56 crc kubenswrapper[5025]: I1007 09:30:56.655199 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-z52jx_df2c5800-14f3-4112-a09c-31b0b75792d6/kube-rbac-proxy/0.log" Oct 07 09:30:56 crc kubenswrapper[5025]: I1007 09:30:56.702961 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-z52jx_df2c5800-14f3-4112-a09c-31b0b75792d6/manager/0.log" Oct 07 09:30:56 crc kubenswrapper[5025]: I1007 09:30:56.847308 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6cbc6dd547-szrb6_f79855b6-658f-4526-9201-08d54f47c41d/manager/0.log" Oct 07 09:30:56 crc kubenswrapper[5025]: I1007 09:30:56.847890 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6cbc6dd547-szrb6_f79855b6-658f-4526-9201-08d54f47c41d/kube-rbac-proxy/0.log" Oct 07 09:31:15 crc kubenswrapper[5025]: I1007 09:31:15.483061 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-8lwpw_109ab8c4-3dcd-49a9-a966-4ad68758f46a/control-plane-machine-set-operator/0.log" Oct 07 09:31:15 crc kubenswrapper[5025]: I1007 09:31:15.660149 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-cxbbt_1e23d4b6-3f5b-4288-8753-cff09258a821/kube-rbac-proxy/0.log" Oct 07 09:31:15 crc kubenswrapper[5025]: I1007 09:31:15.682955 5025 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-cxbbt_1e23d4b6-3f5b-4288-8753-cff09258a821/machine-api-operator/0.log" Oct 07 09:31:25 crc kubenswrapper[5025]: I1007 09:31:25.934292 5025 patch_prober.go:28] interesting pod/machine-config-daemon-2dj2t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 09:31:25 crc kubenswrapper[5025]: I1007 09:31:25.935288 5025 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 09:31:29 crc kubenswrapper[5025]: I1007 09:31:29.251469 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-7d4cc89fcb-2rl67_9bcb693f-b7c9-4e81-be45-ba0196498e60/cert-manager-controller/0.log" Oct 07 09:31:29 crc kubenswrapper[5025]: I1007 09:31:29.442226 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7d9f95dbf-nxrz7_dd36c502-6bcb-4bc0-9761-b0cd2f2f6aa2/cert-manager-cainjector/0.log" Oct 07 09:31:29 crc kubenswrapper[5025]: I1007 09:31:29.565116 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-d969966f-rbg4h_56c29b08-1f19-4338-a43e-fc49b860b93b/cert-manager-webhook/0.log" Oct 07 09:31:43 crc kubenswrapper[5025]: I1007 09:31:43.096377 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-m2scr_a6af412f-9cd9-468b-b656-25fa3a52f24b/nmstate-console-plugin/0.log" Oct 07 09:31:43 crc kubenswrapper[5025]: I1007 09:31:43.297109 5025 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-handler-z4897_a0279a96-994a-4bc6-b4e1-c53e4efb08d6/nmstate-handler/0.log" Oct 07 09:31:43 crc kubenswrapper[5025]: I1007 09:31:43.339989 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-lw65t_aa3b53d3-e235-4259-9bc4-faa4257df0b7/kube-rbac-proxy/0.log" Oct 07 09:31:43 crc kubenswrapper[5025]: I1007 09:31:43.420461 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-lw65t_aa3b53d3-e235-4259-9bc4-faa4257df0b7/nmstate-metrics/0.log" Oct 07 09:31:43 crc kubenswrapper[5025]: I1007 09:31:43.532699 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-rtt98_4ae6ad94-b065-4bd8-a19c-50adb890e53a/nmstate-operator/0.log" Oct 07 09:31:43 crc kubenswrapper[5025]: I1007 09:31:43.630341 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-6jtt9_9a5212ab-0cf6-45f9-9369-3cc46265ac46/nmstate-webhook/0.log" Oct 07 09:31:55 crc kubenswrapper[5025]: I1007 09:31:55.934651 5025 patch_prober.go:28] interesting pod/machine-config-daemon-2dj2t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 09:31:55 crc kubenswrapper[5025]: I1007 09:31:55.935488 5025 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 09:32:00 crc kubenswrapper[5025]: I1007 09:32:00.528159 5025 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_controller-68d546b9d8-7h8g2_2a67364e-1ca0-4727-8650-1d1fdcfd0259/kube-rbac-proxy/0.log" Oct 07 09:32:00 crc kubenswrapper[5025]: I1007 09:32:00.704709 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6z6lt_5a2e702e-d565-4f63-a7cf-21465cf8d4fa/cp-frr-files/0.log" Oct 07 09:32:00 crc kubenswrapper[5025]: I1007 09:32:00.865704 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-7h8g2_2a67364e-1ca0-4727-8650-1d1fdcfd0259/controller/0.log" Oct 07 09:32:01 crc kubenswrapper[5025]: I1007 09:32:01.036043 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6z6lt_5a2e702e-d565-4f63-a7cf-21465cf8d4fa/cp-reloader/0.log" Oct 07 09:32:01 crc kubenswrapper[5025]: I1007 09:32:01.133555 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6z6lt_5a2e702e-d565-4f63-a7cf-21465cf8d4fa/cp-frr-files/0.log" Oct 07 09:32:01 crc kubenswrapper[5025]: I1007 09:32:01.264959 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6z6lt_5a2e702e-d565-4f63-a7cf-21465cf8d4fa/cp-metrics/0.log" Oct 07 09:32:01 crc kubenswrapper[5025]: I1007 09:32:01.320349 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6z6lt_5a2e702e-d565-4f63-a7cf-21465cf8d4fa/cp-reloader/0.log" Oct 07 09:32:01 crc kubenswrapper[5025]: I1007 09:32:01.523722 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6z6lt_5a2e702e-d565-4f63-a7cf-21465cf8d4fa/cp-frr-files/0.log" Oct 07 09:32:01 crc kubenswrapper[5025]: I1007 09:32:01.532893 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6z6lt_5a2e702e-d565-4f63-a7cf-21465cf8d4fa/cp-metrics/0.log" Oct 07 09:32:01 crc kubenswrapper[5025]: I1007 09:32:01.579438 5025 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-6z6lt_5a2e702e-d565-4f63-a7cf-21465cf8d4fa/cp-reloader/0.log" Oct 07 09:32:01 crc kubenswrapper[5025]: I1007 09:32:01.598914 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6z6lt_5a2e702e-d565-4f63-a7cf-21465cf8d4fa/cp-metrics/0.log" Oct 07 09:32:02 crc kubenswrapper[5025]: I1007 09:32:02.162835 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6z6lt_5a2e702e-d565-4f63-a7cf-21465cf8d4fa/cp-frr-files/0.log" Oct 07 09:32:02 crc kubenswrapper[5025]: I1007 09:32:02.163667 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6z6lt_5a2e702e-d565-4f63-a7cf-21465cf8d4fa/controller/0.log" Oct 07 09:32:02 crc kubenswrapper[5025]: I1007 09:32:02.166040 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6z6lt_5a2e702e-d565-4f63-a7cf-21465cf8d4fa/cp-reloader/0.log" Oct 07 09:32:02 crc kubenswrapper[5025]: I1007 09:32:02.220584 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6z6lt_5a2e702e-d565-4f63-a7cf-21465cf8d4fa/cp-metrics/0.log" Oct 07 09:32:02 crc kubenswrapper[5025]: I1007 09:32:02.379088 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6z6lt_5a2e702e-d565-4f63-a7cf-21465cf8d4fa/frr-metrics/0.log" Oct 07 09:32:02 crc kubenswrapper[5025]: I1007 09:32:02.402978 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6z6lt_5a2e702e-d565-4f63-a7cf-21465cf8d4fa/kube-rbac-proxy/0.log" Oct 07 09:32:02 crc kubenswrapper[5025]: I1007 09:32:02.468237 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6z6lt_5a2e702e-d565-4f63-a7cf-21465cf8d4fa/kube-rbac-proxy-frr/0.log" Oct 07 09:32:02 crc kubenswrapper[5025]: I1007 09:32:02.687115 5025 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-6z6lt_5a2e702e-d565-4f63-a7cf-21465cf8d4fa/reloader/0.log" Oct 07 09:32:02 crc kubenswrapper[5025]: I1007 09:32:02.766092 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-48bhp_027820a1-f099-4452-960d-b9d33d3eb48f/frr-k8s-webhook-server/0.log" Oct 07 09:32:03 crc kubenswrapper[5025]: I1007 09:32:03.024101 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-668569d786-l8897_95984a2b-2cba-401a-9b57-0ded2af5b4c8/manager/0.log" Oct 07 09:32:03 crc kubenswrapper[5025]: I1007 09:32:03.127328 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-68dd66dd7d-xmmpg_c884f9d5-90bb-4ff6-857f-6ffc9d973d7a/webhook-server/0.log" Oct 07 09:32:03 crc kubenswrapper[5025]: I1007 09:32:03.303204 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-mmddh_28a4236b-41fc-4aba-8592-0b055eff1685/kube-rbac-proxy/0.log" Oct 07 09:32:03 crc kubenswrapper[5025]: I1007 09:32:03.639934 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6z6lt_5a2e702e-d565-4f63-a7cf-21465cf8d4fa/frr/0.log" Oct 07 09:32:04 crc kubenswrapper[5025]: I1007 09:32:04.155169 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-mmddh_28a4236b-41fc-4aba-8592-0b055eff1685/speaker/0.log" Oct 07 09:32:10 crc kubenswrapper[5025]: I1007 09:32:10.092930 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9nkcj"] Oct 07 09:32:10 crc kubenswrapper[5025]: E1007 09:32:10.093862 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac522518-8445-44fc-837f-c5a9def13a7d" containerName="collect-profiles" Oct 07 09:32:10 crc kubenswrapper[5025]: I1007 09:32:10.093882 5025 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ac522518-8445-44fc-837f-c5a9def13a7d" containerName="collect-profiles" Oct 07 09:32:10 crc kubenswrapper[5025]: I1007 09:32:10.094095 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac522518-8445-44fc-837f-c5a9def13a7d" containerName="collect-profiles" Oct 07 09:32:10 crc kubenswrapper[5025]: I1007 09:32:10.095496 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9nkcj" Oct 07 09:32:10 crc kubenswrapper[5025]: I1007 09:32:10.127034 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9nkcj"] Oct 07 09:32:10 crc kubenswrapper[5025]: I1007 09:32:10.193820 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3dacab31-3a83-450d-a41d-e384fcfa2d24-catalog-content\") pod \"community-operators-9nkcj\" (UID: \"3dacab31-3a83-450d-a41d-e384fcfa2d24\") " pod="openshift-marketplace/community-operators-9nkcj" Oct 07 09:32:10 crc kubenswrapper[5025]: I1007 09:32:10.193882 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8ssp\" (UniqueName: \"kubernetes.io/projected/3dacab31-3a83-450d-a41d-e384fcfa2d24-kube-api-access-n8ssp\") pod \"community-operators-9nkcj\" (UID: \"3dacab31-3a83-450d-a41d-e384fcfa2d24\") " pod="openshift-marketplace/community-operators-9nkcj" Oct 07 09:32:10 crc kubenswrapper[5025]: I1007 09:32:10.193945 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3dacab31-3a83-450d-a41d-e384fcfa2d24-utilities\") pod \"community-operators-9nkcj\" (UID: \"3dacab31-3a83-450d-a41d-e384fcfa2d24\") " pod="openshift-marketplace/community-operators-9nkcj" Oct 07 09:32:10 crc kubenswrapper[5025]: I1007 09:32:10.295758 5025 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3dacab31-3a83-450d-a41d-e384fcfa2d24-catalog-content\") pod \"community-operators-9nkcj\" (UID: \"3dacab31-3a83-450d-a41d-e384fcfa2d24\") " pod="openshift-marketplace/community-operators-9nkcj" Oct 07 09:32:10 crc kubenswrapper[5025]: I1007 09:32:10.295826 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8ssp\" (UniqueName: \"kubernetes.io/projected/3dacab31-3a83-450d-a41d-e384fcfa2d24-kube-api-access-n8ssp\") pod \"community-operators-9nkcj\" (UID: \"3dacab31-3a83-450d-a41d-e384fcfa2d24\") " pod="openshift-marketplace/community-operators-9nkcj" Oct 07 09:32:10 crc kubenswrapper[5025]: I1007 09:32:10.295878 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3dacab31-3a83-450d-a41d-e384fcfa2d24-utilities\") pod \"community-operators-9nkcj\" (UID: \"3dacab31-3a83-450d-a41d-e384fcfa2d24\") " pod="openshift-marketplace/community-operators-9nkcj" Oct 07 09:32:10 crc kubenswrapper[5025]: I1007 09:32:10.296392 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3dacab31-3a83-450d-a41d-e384fcfa2d24-utilities\") pod \"community-operators-9nkcj\" (UID: \"3dacab31-3a83-450d-a41d-e384fcfa2d24\") " pod="openshift-marketplace/community-operators-9nkcj" Oct 07 09:32:10 crc kubenswrapper[5025]: I1007 09:32:10.296641 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3dacab31-3a83-450d-a41d-e384fcfa2d24-catalog-content\") pod \"community-operators-9nkcj\" (UID: \"3dacab31-3a83-450d-a41d-e384fcfa2d24\") " pod="openshift-marketplace/community-operators-9nkcj" Oct 07 09:32:10 crc kubenswrapper[5025]: I1007 09:32:10.333053 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-n8ssp\" (UniqueName: \"kubernetes.io/projected/3dacab31-3a83-450d-a41d-e384fcfa2d24-kube-api-access-n8ssp\") pod \"community-operators-9nkcj\" (UID: \"3dacab31-3a83-450d-a41d-e384fcfa2d24\") " pod="openshift-marketplace/community-operators-9nkcj" Oct 07 09:32:10 crc kubenswrapper[5025]: I1007 09:32:10.416199 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9nkcj" Oct 07 09:32:11 crc kubenswrapper[5025]: I1007 09:32:11.001785 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9nkcj"] Oct 07 09:32:12 crc kubenswrapper[5025]: I1007 09:32:12.021768 5025 generic.go:334] "Generic (PLEG): container finished" podID="3dacab31-3a83-450d-a41d-e384fcfa2d24" containerID="1ff10e593bf1c98392f8f2f168f71d71c1b500cc129ddec9ae077bf5675d691e" exitCode=0 Oct 07 09:32:12 crc kubenswrapper[5025]: I1007 09:32:12.021842 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9nkcj" event={"ID":"3dacab31-3a83-450d-a41d-e384fcfa2d24","Type":"ContainerDied","Data":"1ff10e593bf1c98392f8f2f168f71d71c1b500cc129ddec9ae077bf5675d691e"} Oct 07 09:32:12 crc kubenswrapper[5025]: I1007 09:32:12.022610 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9nkcj" event={"ID":"3dacab31-3a83-450d-a41d-e384fcfa2d24","Type":"ContainerStarted","Data":"05386e36e116c324db2e81cb3d94e42f932e9befa5e73178b45f9cb906452d4a"} Oct 07 09:32:13 crc kubenswrapper[5025]: I1007 09:32:13.034678 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9nkcj" event={"ID":"3dacab31-3a83-450d-a41d-e384fcfa2d24","Type":"ContainerStarted","Data":"22c9ea6596464cd5c961761f186c6e8ade0d39b0e487cb5538b1c12c69e02f4c"} Oct 07 09:32:14 crc kubenswrapper[5025]: I1007 09:32:14.044872 5025 generic.go:334] "Generic (PLEG): container finished" 
podID="3dacab31-3a83-450d-a41d-e384fcfa2d24" containerID="22c9ea6596464cd5c961761f186c6e8ade0d39b0e487cb5538b1c12c69e02f4c" exitCode=0 Oct 07 09:32:14 crc kubenswrapper[5025]: I1007 09:32:14.044996 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9nkcj" event={"ID":"3dacab31-3a83-450d-a41d-e384fcfa2d24","Type":"ContainerDied","Data":"22c9ea6596464cd5c961761f186c6e8ade0d39b0e487cb5538b1c12c69e02f4c"} Oct 07 09:32:15 crc kubenswrapper[5025]: I1007 09:32:15.057268 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9nkcj" event={"ID":"3dacab31-3a83-450d-a41d-e384fcfa2d24","Type":"ContainerStarted","Data":"4c510bd5450192b34344e074f0a19b5c9dfe49d95e4ec28047c53be60f7888fd"} Oct 07 09:32:15 crc kubenswrapper[5025]: I1007 09:32:15.080338 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9nkcj" podStartSLOduration=2.366069663 podStartE2EDuration="5.080311774s" podCreationTimestamp="2025-10-07 09:32:10 +0000 UTC" firstStartedPulling="2025-10-07 09:32:12.024748571 +0000 UTC m=+4538.834062715" lastFinishedPulling="2025-10-07 09:32:14.738990682 +0000 UTC m=+4541.548304826" observedRunningTime="2025-10-07 09:32:15.075860113 +0000 UTC m=+4541.885174257" watchObservedRunningTime="2025-10-07 09:32:15.080311774 +0000 UTC m=+4541.889625928" Oct 07 09:32:17 crc kubenswrapper[5025]: I1007 09:32:17.411615 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69jgvp7_fd984570-be38-4ee8-b94d-be13506a255c/util/0.log" Oct 07 09:32:17 crc kubenswrapper[5025]: I1007 09:32:17.615583 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69jgvp7_fd984570-be38-4ee8-b94d-be13506a255c/util/0.log" Oct 07 09:32:17 crc kubenswrapper[5025]: I1007 
09:32:17.639362 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69jgvp7_fd984570-be38-4ee8-b94d-be13506a255c/pull/0.log" Oct 07 09:32:17 crc kubenswrapper[5025]: I1007 09:32:17.644839 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69jgvp7_fd984570-be38-4ee8-b94d-be13506a255c/pull/0.log" Oct 07 09:32:17 crc kubenswrapper[5025]: I1007 09:32:17.788258 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69jgvp7_fd984570-be38-4ee8-b94d-be13506a255c/util/0.log" Oct 07 09:32:17 crc kubenswrapper[5025]: I1007 09:32:17.840994 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69jgvp7_fd984570-be38-4ee8-b94d-be13506a255c/extract/0.log" Oct 07 09:32:17 crc kubenswrapper[5025]: I1007 09:32:17.861967 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69jgvp7_fd984570-be38-4ee8-b94d-be13506a255c/pull/0.log" Oct 07 09:32:17 crc kubenswrapper[5025]: I1007 09:32:17.982295 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22zqf7_a5712d23-aa3f-49ea-b448-af44d79c9701/util/0.log" Oct 07 09:32:18 crc kubenswrapper[5025]: I1007 09:32:18.170300 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22zqf7_a5712d23-aa3f-49ea-b448-af44d79c9701/util/0.log" Oct 07 09:32:18 crc kubenswrapper[5025]: I1007 09:32:18.172566 5025 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22zqf7_a5712d23-aa3f-49ea-b448-af44d79c9701/pull/0.log" Oct 07 09:32:18 crc kubenswrapper[5025]: I1007 09:32:18.200755 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22zqf7_a5712d23-aa3f-49ea-b448-af44d79c9701/pull/0.log" Oct 07 09:32:18 crc kubenswrapper[5025]: I1007 09:32:18.432169 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22zqf7_a5712d23-aa3f-49ea-b448-af44d79c9701/util/0.log" Oct 07 09:32:18 crc kubenswrapper[5025]: I1007 09:32:18.483517 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22zqf7_a5712d23-aa3f-49ea-b448-af44d79c9701/pull/0.log" Oct 07 09:32:18 crc kubenswrapper[5025]: I1007 09:32:18.498234 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22zqf7_a5712d23-aa3f-49ea-b448-af44d79c9701/extract/0.log" Oct 07 09:32:18 crc kubenswrapper[5025]: I1007 09:32:18.604399 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-j6rc5_7c7c5ec1-fa50-4cf4-a416-233a0efd845e/extract-utilities/0.log" Oct 07 09:32:18 crc kubenswrapper[5025]: I1007 09:32:18.796689 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-j6rc5_7c7c5ec1-fa50-4cf4-a416-233a0efd845e/extract-utilities/0.log" Oct 07 09:32:18 crc kubenswrapper[5025]: I1007 09:32:18.811176 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-j6rc5_7c7c5ec1-fa50-4cf4-a416-233a0efd845e/extract-content/0.log" Oct 07 09:32:18 crc kubenswrapper[5025]: I1007 09:32:18.818963 5025 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-j6rc5_7c7c5ec1-fa50-4cf4-a416-233a0efd845e/extract-content/0.log" Oct 07 09:32:18 crc kubenswrapper[5025]: I1007 09:32:18.991288 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-j6rc5_7c7c5ec1-fa50-4cf4-a416-233a0efd845e/extract-content/0.log" Oct 07 09:32:19 crc kubenswrapper[5025]: I1007 09:32:19.014730 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-j6rc5_7c7c5ec1-fa50-4cf4-a416-233a0efd845e/extract-utilities/0.log" Oct 07 09:32:19 crc kubenswrapper[5025]: I1007 09:32:19.185043 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-j6rc5_7c7c5ec1-fa50-4cf4-a416-233a0efd845e/registry-server/0.log" Oct 07 09:32:19 crc kubenswrapper[5025]: I1007 09:32:19.223940 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9nkcj_3dacab31-3a83-450d-a41d-e384fcfa2d24/extract-utilities/0.log" Oct 07 09:32:19 crc kubenswrapper[5025]: I1007 09:32:19.393138 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9nkcj_3dacab31-3a83-450d-a41d-e384fcfa2d24/extract-content/0.log" Oct 07 09:32:19 crc kubenswrapper[5025]: I1007 09:32:19.433367 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9nkcj_3dacab31-3a83-450d-a41d-e384fcfa2d24/extract-content/0.log" Oct 07 09:32:19 crc kubenswrapper[5025]: I1007 09:32:19.456725 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9nkcj_3dacab31-3a83-450d-a41d-e384fcfa2d24/extract-utilities/0.log" Oct 07 09:32:19 crc kubenswrapper[5025]: I1007 09:32:19.655029 5025 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-9nkcj_3dacab31-3a83-450d-a41d-e384fcfa2d24/extract-content/0.log" Oct 07 09:32:19 crc kubenswrapper[5025]: I1007 09:32:19.656902 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9nkcj_3dacab31-3a83-450d-a41d-e384fcfa2d24/extract-utilities/0.log" Oct 07 09:32:19 crc kubenswrapper[5025]: I1007 09:32:19.660092 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9nkcj_3dacab31-3a83-450d-a41d-e384fcfa2d24/registry-server/0.log" Oct 07 09:32:19 crc kubenswrapper[5025]: I1007 09:32:19.815197 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-pz4zg_8afd374e-c526-433f-81ca-9c81457e7591/extract-utilities/0.log" Oct 07 09:32:19 crc kubenswrapper[5025]: I1007 09:32:19.997797 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-pz4zg_8afd374e-c526-433f-81ca-9c81457e7591/extract-utilities/0.log" Oct 07 09:32:20 crc kubenswrapper[5025]: I1007 09:32:20.003278 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-pz4zg_8afd374e-c526-433f-81ca-9c81457e7591/extract-content/0.log" Oct 07 09:32:20 crc kubenswrapper[5025]: I1007 09:32:20.022086 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-pz4zg_8afd374e-c526-433f-81ca-9c81457e7591/extract-content/0.log" Oct 07 09:32:20 crc kubenswrapper[5025]: I1007 09:32:20.233508 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-pz4zg_8afd374e-c526-433f-81ca-9c81457e7591/extract-content/0.log" Oct 07 09:32:20 crc kubenswrapper[5025]: I1007 09:32:20.239782 5025 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-pz4zg_8afd374e-c526-433f-81ca-9c81457e7591/extract-utilities/0.log" Oct 07 09:32:20 crc kubenswrapper[5025]: I1007 09:32:20.398214 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x_e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454/util/0.log" Oct 07 09:32:20 crc kubenswrapper[5025]: I1007 09:32:20.416508 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9nkcj" Oct 07 09:32:20 crc kubenswrapper[5025]: I1007 09:32:20.417901 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9nkcj" Oct 07 09:32:20 crc kubenswrapper[5025]: I1007 09:32:20.471255 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9nkcj" Oct 07 09:32:20 crc kubenswrapper[5025]: I1007 09:32:20.582164 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x_e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454/util/0.log" Oct 07 09:32:20 crc kubenswrapper[5025]: I1007 09:32:20.620589 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x_e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454/pull/0.log" Oct 07 09:32:20 crc kubenswrapper[5025]: I1007 09:32:20.667676 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x_e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454/pull/0.log" Oct 07 09:32:20 crc kubenswrapper[5025]: I1007 09:32:20.811023 5025 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x_e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454/pull/0.log" Oct 07 09:32:20 crc kubenswrapper[5025]: I1007 09:32:20.854372 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x_e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454/util/0.log" Oct 07 09:32:20 crc kubenswrapper[5025]: I1007 09:32:20.915934 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7kh6x_e05bb0ed-d1b1-43f8-bb3f-b1b7e9264454/extract/0.log" Oct 07 09:32:21 crc kubenswrapper[5025]: I1007 09:32:21.095931 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-d4bw5_d57da756-f579-4d68-b775-8788fad75582/marketplace-operator/0.log" Oct 07 09:32:21 crc kubenswrapper[5025]: I1007 09:32:21.151013 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-pz4zg_8afd374e-c526-433f-81ca-9c81457e7591/registry-server/0.log" Oct 07 09:32:21 crc kubenswrapper[5025]: I1007 09:32:21.159648 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9nkcj" Oct 07 09:32:21 crc kubenswrapper[5025]: I1007 09:32:21.204411 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-wjv4n_61e2aa02-6590-49e6-a1e9-f9e22e01a679/extract-utilities/0.log" Oct 07 09:32:21 crc kubenswrapper[5025]: I1007 09:32:21.233272 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9nkcj"] Oct 07 09:32:21 crc kubenswrapper[5025]: I1007 09:32:21.401762 5025 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-wjv4n_61e2aa02-6590-49e6-a1e9-f9e22e01a679/extract-utilities/0.log" Oct 07 09:32:21 crc kubenswrapper[5025]: I1007 09:32:21.443669 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-wjv4n_61e2aa02-6590-49e6-a1e9-f9e22e01a679/extract-content/0.log" Oct 07 09:32:21 crc kubenswrapper[5025]: I1007 09:32:21.444251 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-wjv4n_61e2aa02-6590-49e6-a1e9-f9e22e01a679/extract-content/0.log" Oct 07 09:32:21 crc kubenswrapper[5025]: I1007 09:32:21.592054 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-wjv4n_61e2aa02-6590-49e6-a1e9-f9e22e01a679/extract-utilities/0.log" Oct 07 09:32:21 crc kubenswrapper[5025]: I1007 09:32:21.592253 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-wjv4n_61e2aa02-6590-49e6-a1e9-f9e22e01a679/extract-content/0.log" Oct 07 09:32:21 crc kubenswrapper[5025]: I1007 09:32:21.723655 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kldq9_dc73e260-8654-4431-b0dc-6d0347778384/extract-utilities/0.log" Oct 07 09:32:21 crc kubenswrapper[5025]: I1007 09:32:21.833877 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-wjv4n_61e2aa02-6590-49e6-a1e9-f9e22e01a679/registry-server/0.log" Oct 07 09:32:21 crc kubenswrapper[5025]: I1007 09:32:21.949286 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kldq9_dc73e260-8654-4431-b0dc-6d0347778384/extract-content/0.log" Oct 07 09:32:21 crc kubenswrapper[5025]: I1007 09:32:21.981635 5025 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-kldq9_dc73e260-8654-4431-b0dc-6d0347778384/extract-utilities/0.log" Oct 07 09:32:21 crc kubenswrapper[5025]: I1007 09:32:21.981689 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kldq9_dc73e260-8654-4431-b0dc-6d0347778384/extract-content/0.log" Oct 07 09:32:22 crc kubenswrapper[5025]: I1007 09:32:22.123287 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kldq9_dc73e260-8654-4431-b0dc-6d0347778384/extract-utilities/0.log" Oct 07 09:32:22 crc kubenswrapper[5025]: I1007 09:32:22.149965 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kldq9_dc73e260-8654-4431-b0dc-6d0347778384/extract-content/0.log" Oct 07 09:32:22 crc kubenswrapper[5025]: I1007 09:32:22.355552 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kldq9_dc73e260-8654-4431-b0dc-6d0347778384/registry-server/0.log" Oct 07 09:32:23 crc kubenswrapper[5025]: I1007 09:32:23.121964 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9nkcj" podUID="3dacab31-3a83-450d-a41d-e384fcfa2d24" containerName="registry-server" containerID="cri-o://4c510bd5450192b34344e074f0a19b5c9dfe49d95e4ec28047c53be60f7888fd" gracePeriod=2 Oct 07 09:32:23 crc kubenswrapper[5025]: I1007 09:32:23.571864 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9nkcj" Oct 07 09:32:23 crc kubenswrapper[5025]: I1007 09:32:23.738123 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8ssp\" (UniqueName: \"kubernetes.io/projected/3dacab31-3a83-450d-a41d-e384fcfa2d24-kube-api-access-n8ssp\") pod \"3dacab31-3a83-450d-a41d-e384fcfa2d24\" (UID: \"3dacab31-3a83-450d-a41d-e384fcfa2d24\") " Oct 07 09:32:23 crc kubenswrapper[5025]: I1007 09:32:23.738201 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3dacab31-3a83-450d-a41d-e384fcfa2d24-catalog-content\") pod \"3dacab31-3a83-450d-a41d-e384fcfa2d24\" (UID: \"3dacab31-3a83-450d-a41d-e384fcfa2d24\") " Oct 07 09:32:23 crc kubenswrapper[5025]: I1007 09:32:23.738324 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3dacab31-3a83-450d-a41d-e384fcfa2d24-utilities\") pod \"3dacab31-3a83-450d-a41d-e384fcfa2d24\" (UID: \"3dacab31-3a83-450d-a41d-e384fcfa2d24\") " Oct 07 09:32:23 crc kubenswrapper[5025]: I1007 09:32:23.739850 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3dacab31-3a83-450d-a41d-e384fcfa2d24-utilities" (OuterVolumeSpecName: "utilities") pod "3dacab31-3a83-450d-a41d-e384fcfa2d24" (UID: "3dacab31-3a83-450d-a41d-e384fcfa2d24"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 09:32:23 crc kubenswrapper[5025]: I1007 09:32:23.762262 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dacab31-3a83-450d-a41d-e384fcfa2d24-kube-api-access-n8ssp" (OuterVolumeSpecName: "kube-api-access-n8ssp") pod "3dacab31-3a83-450d-a41d-e384fcfa2d24" (UID: "3dacab31-3a83-450d-a41d-e384fcfa2d24"). InnerVolumeSpecName "kube-api-access-n8ssp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 09:32:23 crc kubenswrapper[5025]: I1007 09:32:23.841111 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8ssp\" (UniqueName: \"kubernetes.io/projected/3dacab31-3a83-450d-a41d-e384fcfa2d24-kube-api-access-n8ssp\") on node \"crc\" DevicePath \"\"" Oct 07 09:32:23 crc kubenswrapper[5025]: I1007 09:32:23.841614 5025 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3dacab31-3a83-450d-a41d-e384fcfa2d24-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 09:32:23 crc kubenswrapper[5025]: I1007 09:32:23.949523 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3dacab31-3a83-450d-a41d-e384fcfa2d24-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3dacab31-3a83-450d-a41d-e384fcfa2d24" (UID: "3dacab31-3a83-450d-a41d-e384fcfa2d24"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 09:32:24 crc kubenswrapper[5025]: I1007 09:32:24.045673 5025 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3dacab31-3a83-450d-a41d-e384fcfa2d24-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 09:32:24 crc kubenswrapper[5025]: I1007 09:32:24.136584 5025 generic.go:334] "Generic (PLEG): container finished" podID="3dacab31-3a83-450d-a41d-e384fcfa2d24" containerID="4c510bd5450192b34344e074f0a19b5c9dfe49d95e4ec28047c53be60f7888fd" exitCode=0 Oct 07 09:32:24 crc kubenswrapper[5025]: I1007 09:32:24.136653 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9nkcj" event={"ID":"3dacab31-3a83-450d-a41d-e384fcfa2d24","Type":"ContainerDied","Data":"4c510bd5450192b34344e074f0a19b5c9dfe49d95e4ec28047c53be60f7888fd"} Oct 07 09:32:24 crc kubenswrapper[5025]: I1007 09:32:24.136689 5025 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-9nkcj" event={"ID":"3dacab31-3a83-450d-a41d-e384fcfa2d24","Type":"ContainerDied","Data":"05386e36e116c324db2e81cb3d94e42f932e9befa5e73178b45f9cb906452d4a"} Oct 07 09:32:24 crc kubenswrapper[5025]: I1007 09:32:24.136711 5025 scope.go:117] "RemoveContainer" containerID="4c510bd5450192b34344e074f0a19b5c9dfe49d95e4ec28047c53be60f7888fd" Oct 07 09:32:24 crc kubenswrapper[5025]: I1007 09:32:24.136751 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9nkcj" Oct 07 09:32:24 crc kubenswrapper[5025]: I1007 09:32:24.165680 5025 scope.go:117] "RemoveContainer" containerID="22c9ea6596464cd5c961761f186c6e8ade0d39b0e487cb5538b1c12c69e02f4c" Oct 07 09:32:24 crc kubenswrapper[5025]: I1007 09:32:24.187083 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9nkcj"] Oct 07 09:32:24 crc kubenswrapper[5025]: I1007 09:32:24.193419 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9nkcj"] Oct 07 09:32:24 crc kubenswrapper[5025]: I1007 09:32:24.219775 5025 scope.go:117] "RemoveContainer" containerID="1ff10e593bf1c98392f8f2f168f71d71c1b500cc129ddec9ae077bf5675d691e" Oct 07 09:32:24 crc kubenswrapper[5025]: I1007 09:32:24.239836 5025 scope.go:117] "RemoveContainer" containerID="4c510bd5450192b34344e074f0a19b5c9dfe49d95e4ec28047c53be60f7888fd" Oct 07 09:32:24 crc kubenswrapper[5025]: E1007 09:32:24.240488 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c510bd5450192b34344e074f0a19b5c9dfe49d95e4ec28047c53be60f7888fd\": container with ID starting with 4c510bd5450192b34344e074f0a19b5c9dfe49d95e4ec28047c53be60f7888fd not found: ID does not exist" containerID="4c510bd5450192b34344e074f0a19b5c9dfe49d95e4ec28047c53be60f7888fd" Oct 07 09:32:24 crc kubenswrapper[5025]: I1007 
09:32:24.240622 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c510bd5450192b34344e074f0a19b5c9dfe49d95e4ec28047c53be60f7888fd"} err="failed to get container status \"4c510bd5450192b34344e074f0a19b5c9dfe49d95e4ec28047c53be60f7888fd\": rpc error: code = NotFound desc = could not find container \"4c510bd5450192b34344e074f0a19b5c9dfe49d95e4ec28047c53be60f7888fd\": container with ID starting with 4c510bd5450192b34344e074f0a19b5c9dfe49d95e4ec28047c53be60f7888fd not found: ID does not exist" Oct 07 09:32:24 crc kubenswrapper[5025]: I1007 09:32:24.240728 5025 scope.go:117] "RemoveContainer" containerID="22c9ea6596464cd5c961761f186c6e8ade0d39b0e487cb5538b1c12c69e02f4c" Oct 07 09:32:24 crc kubenswrapper[5025]: E1007 09:32:24.241283 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22c9ea6596464cd5c961761f186c6e8ade0d39b0e487cb5538b1c12c69e02f4c\": container with ID starting with 22c9ea6596464cd5c961761f186c6e8ade0d39b0e487cb5538b1c12c69e02f4c not found: ID does not exist" containerID="22c9ea6596464cd5c961761f186c6e8ade0d39b0e487cb5538b1c12c69e02f4c" Oct 07 09:32:24 crc kubenswrapper[5025]: I1007 09:32:24.241305 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22c9ea6596464cd5c961761f186c6e8ade0d39b0e487cb5538b1c12c69e02f4c"} err="failed to get container status \"22c9ea6596464cd5c961761f186c6e8ade0d39b0e487cb5538b1c12c69e02f4c\": rpc error: code = NotFound desc = could not find container \"22c9ea6596464cd5c961761f186c6e8ade0d39b0e487cb5538b1c12c69e02f4c\": container with ID starting with 22c9ea6596464cd5c961761f186c6e8ade0d39b0e487cb5538b1c12c69e02f4c not found: ID does not exist" Oct 07 09:32:24 crc kubenswrapper[5025]: I1007 09:32:24.241318 5025 scope.go:117] "RemoveContainer" containerID="1ff10e593bf1c98392f8f2f168f71d71c1b500cc129ddec9ae077bf5675d691e" Oct 07 09:32:24 crc 
kubenswrapper[5025]: E1007 09:32:24.241642 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ff10e593bf1c98392f8f2f168f71d71c1b500cc129ddec9ae077bf5675d691e\": container with ID starting with 1ff10e593bf1c98392f8f2f168f71d71c1b500cc129ddec9ae077bf5675d691e not found: ID does not exist" containerID="1ff10e593bf1c98392f8f2f168f71d71c1b500cc129ddec9ae077bf5675d691e" Oct 07 09:32:24 crc kubenswrapper[5025]: I1007 09:32:24.241744 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ff10e593bf1c98392f8f2f168f71d71c1b500cc129ddec9ae077bf5675d691e"} err="failed to get container status \"1ff10e593bf1c98392f8f2f168f71d71c1b500cc129ddec9ae077bf5675d691e\": rpc error: code = NotFound desc = could not find container \"1ff10e593bf1c98392f8f2f168f71d71c1b500cc129ddec9ae077bf5675d691e\": container with ID starting with 1ff10e593bf1c98392f8f2f168f71d71c1b500cc129ddec9ae077bf5675d691e not found: ID does not exist" Oct 07 09:32:25 crc kubenswrapper[5025]: I1007 09:32:25.926930 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dacab31-3a83-450d-a41d-e384fcfa2d24" path="/var/lib/kubelet/pods/3dacab31-3a83-450d-a41d-e384fcfa2d24/volumes" Oct 07 09:32:25 crc kubenswrapper[5025]: I1007 09:32:25.934203 5025 patch_prober.go:28] interesting pod/machine-config-daemon-2dj2t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 09:32:25 crc kubenswrapper[5025]: I1007 09:32:25.934293 5025 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Oct 07 09:32:25 crc kubenswrapper[5025]: I1007 09:32:25.934362 5025 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" Oct 07 09:32:25 crc kubenswrapper[5025]: I1007 09:32:25.935374 5025 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"652ea836bd6bc45776b557dc80abd1f3393e92490a2ac86faa4fa940a2468c8e"} pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 09:32:25 crc kubenswrapper[5025]: I1007 09:32:25.935483 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerName="machine-config-daemon" containerID="cri-o://652ea836bd6bc45776b557dc80abd1f3393e92490a2ac86faa4fa940a2468c8e" gracePeriod=600 Oct 07 09:32:26 crc kubenswrapper[5025]: E1007 09:32:26.070595 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:32:26 crc kubenswrapper[5025]: I1007 09:32:26.166772 5025 generic.go:334] "Generic (PLEG): container finished" podID="b4849c41-22e1-400e-8e11-096da49ef1b2" containerID="652ea836bd6bc45776b557dc80abd1f3393e92490a2ac86faa4fa940a2468c8e" exitCode=0 Oct 07 09:32:26 crc kubenswrapper[5025]: I1007 09:32:26.166844 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" event={"ID":"b4849c41-22e1-400e-8e11-096da49ef1b2","Type":"ContainerDied","Data":"652ea836bd6bc45776b557dc80abd1f3393e92490a2ac86faa4fa940a2468c8e"} Oct 07 09:32:26 crc kubenswrapper[5025]: I1007 09:32:26.166903 5025 scope.go:117] "RemoveContainer" containerID="39ee1466ed5d015cbd75b2c0cc144f31970050e10bccfa27cc9375a60cdb6572" Oct 07 09:32:26 crc kubenswrapper[5025]: I1007 09:32:26.168582 5025 scope.go:117] "RemoveContainer" containerID="652ea836bd6bc45776b557dc80abd1f3393e92490a2ac86faa4fa940a2468c8e" Oct 07 09:32:26 crc kubenswrapper[5025]: E1007 09:32:26.169113 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:32:37 crc kubenswrapper[5025]: I1007 09:32:37.915392 5025 scope.go:117] "RemoveContainer" containerID="652ea836bd6bc45776b557dc80abd1f3393e92490a2ac86faa4fa940a2468c8e" Oct 07 09:32:37 crc kubenswrapper[5025]: E1007 09:32:37.916941 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:32:51 crc kubenswrapper[5025]: I1007 09:32:51.915331 5025 scope.go:117] "RemoveContainer" containerID="652ea836bd6bc45776b557dc80abd1f3393e92490a2ac86faa4fa940a2468c8e" Oct 07 09:32:51 crc kubenswrapper[5025]: E1007 09:32:51.916521 5025 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:33:06 crc kubenswrapper[5025]: I1007 09:33:06.915489 5025 scope.go:117] "RemoveContainer" containerID="652ea836bd6bc45776b557dc80abd1f3393e92490a2ac86faa4fa940a2468c8e" Oct 07 09:33:06 crc kubenswrapper[5025]: E1007 09:33:06.916620 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:33:21 crc kubenswrapper[5025]: I1007 09:33:21.914768 5025 scope.go:117] "RemoveContainer" containerID="652ea836bd6bc45776b557dc80abd1f3393e92490a2ac86faa4fa940a2468c8e" Oct 07 09:33:21 crc kubenswrapper[5025]: E1007 09:33:21.917127 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:33:31 crc kubenswrapper[5025]: I1007 09:33:31.751026 5025 generic.go:334] "Generic (PLEG): container finished" podID="6fd7c059-3b7a-4cdd-9651-9efb4bc5ce61" 
containerID="5509f4e6e7968d21bbc04b8b22697ac5b524de0c990f21a18662fc5dfcdfc919" exitCode=0 Oct 07 09:33:31 crc kubenswrapper[5025]: I1007 09:33:31.751575 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jcjgq/must-gather-twh49" event={"ID":"6fd7c059-3b7a-4cdd-9651-9efb4bc5ce61","Type":"ContainerDied","Data":"5509f4e6e7968d21bbc04b8b22697ac5b524de0c990f21a18662fc5dfcdfc919"} Oct 07 09:33:31 crc kubenswrapper[5025]: I1007 09:33:31.752236 5025 scope.go:117] "RemoveContainer" containerID="5509f4e6e7968d21bbc04b8b22697ac5b524de0c990f21a18662fc5dfcdfc919" Oct 07 09:33:32 crc kubenswrapper[5025]: I1007 09:33:32.621223 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jcjgq_must-gather-twh49_6fd7c059-3b7a-4cdd-9651-9efb4bc5ce61/gather/0.log" Oct 07 09:33:33 crc kubenswrapper[5025]: I1007 09:33:33.926877 5025 scope.go:117] "RemoveContainer" containerID="652ea836bd6bc45776b557dc80abd1f3393e92490a2ac86faa4fa940a2468c8e" Oct 07 09:33:33 crc kubenswrapper[5025]: E1007 09:33:33.927290 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:33:40 crc kubenswrapper[5025]: I1007 09:33:40.169720 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jcjgq/must-gather-twh49"] Oct 07 09:33:40 crc kubenswrapper[5025]: I1007 09:33:40.170970 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-jcjgq/must-gather-twh49" podUID="6fd7c059-3b7a-4cdd-9651-9efb4bc5ce61" containerName="copy" containerID="cri-o://c36a85c161cfca644eafb62528c3e9e3f49bee8e73e307165f009f410a69ed80" 
gracePeriod=2 Oct 07 09:33:40 crc kubenswrapper[5025]: I1007 09:33:40.178593 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jcjgq/must-gather-twh49"] Oct 07 09:33:40 crc kubenswrapper[5025]: I1007 09:33:40.594631 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jcjgq_must-gather-twh49_6fd7c059-3b7a-4cdd-9651-9efb4bc5ce61/copy/0.log" Oct 07 09:33:40 crc kubenswrapper[5025]: I1007 09:33:40.595517 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jcjgq/must-gather-twh49" Oct 07 09:33:40 crc kubenswrapper[5025]: I1007 09:33:40.713811 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6fd7c059-3b7a-4cdd-9651-9efb4bc5ce61-must-gather-output\") pod \"6fd7c059-3b7a-4cdd-9651-9efb4bc5ce61\" (UID: \"6fd7c059-3b7a-4cdd-9651-9efb4bc5ce61\") " Oct 07 09:33:40 crc kubenswrapper[5025]: I1007 09:33:40.714043 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwww9\" (UniqueName: \"kubernetes.io/projected/6fd7c059-3b7a-4cdd-9651-9efb4bc5ce61-kube-api-access-rwww9\") pod \"6fd7c059-3b7a-4cdd-9651-9efb4bc5ce61\" (UID: \"6fd7c059-3b7a-4cdd-9651-9efb4bc5ce61\") " Oct 07 09:33:40 crc kubenswrapper[5025]: I1007 09:33:40.721932 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fd7c059-3b7a-4cdd-9651-9efb4bc5ce61-kube-api-access-rwww9" (OuterVolumeSpecName: "kube-api-access-rwww9") pod "6fd7c059-3b7a-4cdd-9651-9efb4bc5ce61" (UID: "6fd7c059-3b7a-4cdd-9651-9efb4bc5ce61"). InnerVolumeSpecName "kube-api-access-rwww9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 09:33:40 crc kubenswrapper[5025]: I1007 09:33:40.803310 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fd7c059-3b7a-4cdd-9651-9efb4bc5ce61-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "6fd7c059-3b7a-4cdd-9651-9efb4bc5ce61" (UID: "6fd7c059-3b7a-4cdd-9651-9efb4bc5ce61"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 09:33:40 crc kubenswrapper[5025]: I1007 09:33:40.816140 5025 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6fd7c059-3b7a-4cdd-9651-9efb4bc5ce61-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 07 09:33:40 crc kubenswrapper[5025]: I1007 09:33:40.816190 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwww9\" (UniqueName: \"kubernetes.io/projected/6fd7c059-3b7a-4cdd-9651-9efb4bc5ce61-kube-api-access-rwww9\") on node \"crc\" DevicePath \"\"" Oct 07 09:33:40 crc kubenswrapper[5025]: I1007 09:33:40.843358 5025 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jcjgq_must-gather-twh49_6fd7c059-3b7a-4cdd-9651-9efb4bc5ce61/copy/0.log" Oct 07 09:33:40 crc kubenswrapper[5025]: I1007 09:33:40.843754 5025 generic.go:334] "Generic (PLEG): container finished" podID="6fd7c059-3b7a-4cdd-9651-9efb4bc5ce61" containerID="c36a85c161cfca644eafb62528c3e9e3f49bee8e73e307165f009f410a69ed80" exitCode=143 Oct 07 09:33:40 crc kubenswrapper[5025]: I1007 09:33:40.843831 5025 scope.go:117] "RemoveContainer" containerID="c36a85c161cfca644eafb62528c3e9e3f49bee8e73e307165f009f410a69ed80" Oct 07 09:33:40 crc kubenswrapper[5025]: I1007 09:33:40.844007 5025 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jcjgq/must-gather-twh49" Oct 07 09:33:40 crc kubenswrapper[5025]: I1007 09:33:40.881076 5025 scope.go:117] "RemoveContainer" containerID="5509f4e6e7968d21bbc04b8b22697ac5b524de0c990f21a18662fc5dfcdfc919" Oct 07 09:33:40 crc kubenswrapper[5025]: I1007 09:33:40.950671 5025 scope.go:117] "RemoveContainer" containerID="c36a85c161cfca644eafb62528c3e9e3f49bee8e73e307165f009f410a69ed80" Oct 07 09:33:40 crc kubenswrapper[5025]: E1007 09:33:40.951438 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c36a85c161cfca644eafb62528c3e9e3f49bee8e73e307165f009f410a69ed80\": container with ID starting with c36a85c161cfca644eafb62528c3e9e3f49bee8e73e307165f009f410a69ed80 not found: ID does not exist" containerID="c36a85c161cfca644eafb62528c3e9e3f49bee8e73e307165f009f410a69ed80" Oct 07 09:33:40 crc kubenswrapper[5025]: I1007 09:33:40.951511 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c36a85c161cfca644eafb62528c3e9e3f49bee8e73e307165f009f410a69ed80"} err="failed to get container status \"c36a85c161cfca644eafb62528c3e9e3f49bee8e73e307165f009f410a69ed80\": rpc error: code = NotFound desc = could not find container \"c36a85c161cfca644eafb62528c3e9e3f49bee8e73e307165f009f410a69ed80\": container with ID starting with c36a85c161cfca644eafb62528c3e9e3f49bee8e73e307165f009f410a69ed80 not found: ID does not exist" Oct 07 09:33:40 crc kubenswrapper[5025]: I1007 09:33:40.951605 5025 scope.go:117] "RemoveContainer" containerID="5509f4e6e7968d21bbc04b8b22697ac5b524de0c990f21a18662fc5dfcdfc919" Oct 07 09:33:40 crc kubenswrapper[5025]: E1007 09:33:40.952261 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5509f4e6e7968d21bbc04b8b22697ac5b524de0c990f21a18662fc5dfcdfc919\": container with ID starting with 
5509f4e6e7968d21bbc04b8b22697ac5b524de0c990f21a18662fc5dfcdfc919 not found: ID does not exist" containerID="5509f4e6e7968d21bbc04b8b22697ac5b524de0c990f21a18662fc5dfcdfc919" Oct 07 09:33:40 crc kubenswrapper[5025]: I1007 09:33:40.952299 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5509f4e6e7968d21bbc04b8b22697ac5b524de0c990f21a18662fc5dfcdfc919"} err="failed to get container status \"5509f4e6e7968d21bbc04b8b22697ac5b524de0c990f21a18662fc5dfcdfc919\": rpc error: code = NotFound desc = could not find container \"5509f4e6e7968d21bbc04b8b22697ac5b524de0c990f21a18662fc5dfcdfc919\": container with ID starting with 5509f4e6e7968d21bbc04b8b22697ac5b524de0c990f21a18662fc5dfcdfc919 not found: ID does not exist" Oct 07 09:33:41 crc kubenswrapper[5025]: E1007 09:33:41.023865 5025 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6fd7c059_3b7a_4cdd_9651_9efb4bc5ce61.slice/crio-e3f4bbc16c9f6acf2c857497585d6b4afcce9552ddba22e5813db91fe192e64f\": RecentStats: unable to find data in memory cache]" Oct 07 09:33:41 crc kubenswrapper[5025]: I1007 09:33:41.934135 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fd7c059-3b7a-4cdd-9651-9efb4bc5ce61" path="/var/lib/kubelet/pods/6fd7c059-3b7a-4cdd-9651-9efb4bc5ce61/volumes" Oct 07 09:33:43 crc kubenswrapper[5025]: I1007 09:33:43.679410 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jl8cm"] Oct 07 09:33:43 crc kubenswrapper[5025]: E1007 09:33:43.679842 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dacab31-3a83-450d-a41d-e384fcfa2d24" containerName="extract-utilities" Oct 07 09:33:43 crc kubenswrapper[5025]: I1007 09:33:43.679861 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dacab31-3a83-450d-a41d-e384fcfa2d24" containerName="extract-utilities" 
Oct 07 09:33:43 crc kubenswrapper[5025]: E1007 09:33:43.679883 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fd7c059-3b7a-4cdd-9651-9efb4bc5ce61" containerName="gather" Oct 07 09:33:43 crc kubenswrapper[5025]: I1007 09:33:43.679891 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fd7c059-3b7a-4cdd-9651-9efb4bc5ce61" containerName="gather" Oct 07 09:33:43 crc kubenswrapper[5025]: E1007 09:33:43.679915 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fd7c059-3b7a-4cdd-9651-9efb4bc5ce61" containerName="copy" Oct 07 09:33:43 crc kubenswrapper[5025]: I1007 09:33:43.679925 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fd7c059-3b7a-4cdd-9651-9efb4bc5ce61" containerName="copy" Oct 07 09:33:43 crc kubenswrapper[5025]: E1007 09:33:43.679965 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dacab31-3a83-450d-a41d-e384fcfa2d24" containerName="registry-server" Oct 07 09:33:43 crc kubenswrapper[5025]: I1007 09:33:43.679974 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dacab31-3a83-450d-a41d-e384fcfa2d24" containerName="registry-server" Oct 07 09:33:43 crc kubenswrapper[5025]: E1007 09:33:43.679988 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dacab31-3a83-450d-a41d-e384fcfa2d24" containerName="extract-content" Oct 07 09:33:43 crc kubenswrapper[5025]: I1007 09:33:43.679997 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dacab31-3a83-450d-a41d-e384fcfa2d24" containerName="extract-content" Oct 07 09:33:43 crc kubenswrapper[5025]: I1007 09:33:43.680180 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fd7c059-3b7a-4cdd-9651-9efb4bc5ce61" containerName="copy" Oct 07 09:33:43 crc kubenswrapper[5025]: I1007 09:33:43.680200 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fd7c059-3b7a-4cdd-9651-9efb4bc5ce61" containerName="gather" Oct 07 09:33:43 crc kubenswrapper[5025]: I1007 09:33:43.680220 
5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dacab31-3a83-450d-a41d-e384fcfa2d24" containerName="registry-server" Oct 07 09:33:43 crc kubenswrapper[5025]: I1007 09:33:43.682363 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jl8cm" Oct 07 09:33:43 crc kubenswrapper[5025]: I1007 09:33:43.703625 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jl8cm"] Oct 07 09:33:43 crc kubenswrapper[5025]: I1007 09:33:43.865858 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/674d9eb6-c3f8-4855-9669-f3d4bc812936-catalog-content\") pod \"certified-operators-jl8cm\" (UID: \"674d9eb6-c3f8-4855-9669-f3d4bc812936\") " pod="openshift-marketplace/certified-operators-jl8cm" Oct 07 09:33:43 crc kubenswrapper[5025]: I1007 09:33:43.865975 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctd8m\" (UniqueName: \"kubernetes.io/projected/674d9eb6-c3f8-4855-9669-f3d4bc812936-kube-api-access-ctd8m\") pod \"certified-operators-jl8cm\" (UID: \"674d9eb6-c3f8-4855-9669-f3d4bc812936\") " pod="openshift-marketplace/certified-operators-jl8cm" Oct 07 09:33:43 crc kubenswrapper[5025]: I1007 09:33:43.866025 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/674d9eb6-c3f8-4855-9669-f3d4bc812936-utilities\") pod \"certified-operators-jl8cm\" (UID: \"674d9eb6-c3f8-4855-9669-f3d4bc812936\") " pod="openshift-marketplace/certified-operators-jl8cm" Oct 07 09:33:43 crc kubenswrapper[5025]: I1007 09:33:43.967376 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctd8m\" (UniqueName: 
\"kubernetes.io/projected/674d9eb6-c3f8-4855-9669-f3d4bc812936-kube-api-access-ctd8m\") pod \"certified-operators-jl8cm\" (UID: \"674d9eb6-c3f8-4855-9669-f3d4bc812936\") " pod="openshift-marketplace/certified-operators-jl8cm" Oct 07 09:33:43 crc kubenswrapper[5025]: I1007 09:33:43.967455 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/674d9eb6-c3f8-4855-9669-f3d4bc812936-utilities\") pod \"certified-operators-jl8cm\" (UID: \"674d9eb6-c3f8-4855-9669-f3d4bc812936\") " pod="openshift-marketplace/certified-operators-jl8cm" Oct 07 09:33:43 crc kubenswrapper[5025]: I1007 09:33:43.967500 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/674d9eb6-c3f8-4855-9669-f3d4bc812936-catalog-content\") pod \"certified-operators-jl8cm\" (UID: \"674d9eb6-c3f8-4855-9669-f3d4bc812936\") " pod="openshift-marketplace/certified-operators-jl8cm" Oct 07 09:33:43 crc kubenswrapper[5025]: I1007 09:33:43.968254 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/674d9eb6-c3f8-4855-9669-f3d4bc812936-catalog-content\") pod \"certified-operators-jl8cm\" (UID: \"674d9eb6-c3f8-4855-9669-f3d4bc812936\") " pod="openshift-marketplace/certified-operators-jl8cm" Oct 07 09:33:43 crc kubenswrapper[5025]: I1007 09:33:43.968929 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/674d9eb6-c3f8-4855-9669-f3d4bc812936-utilities\") pod \"certified-operators-jl8cm\" (UID: \"674d9eb6-c3f8-4855-9669-f3d4bc812936\") " pod="openshift-marketplace/certified-operators-jl8cm" Oct 07 09:33:43 crc kubenswrapper[5025]: I1007 09:33:43.991318 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctd8m\" (UniqueName: 
\"kubernetes.io/projected/674d9eb6-c3f8-4855-9669-f3d4bc812936-kube-api-access-ctd8m\") pod \"certified-operators-jl8cm\" (UID: \"674d9eb6-c3f8-4855-9669-f3d4bc812936\") " pod="openshift-marketplace/certified-operators-jl8cm" Oct 07 09:33:44 crc kubenswrapper[5025]: I1007 09:33:44.005763 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jl8cm" Oct 07 09:33:44 crc kubenswrapper[5025]: I1007 09:33:44.572133 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jl8cm"] Oct 07 09:33:44 crc kubenswrapper[5025]: I1007 09:33:44.880329 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jl8cm" event={"ID":"674d9eb6-c3f8-4855-9669-f3d4bc812936","Type":"ContainerStarted","Data":"40bd0bcf55fccf3b3ff5ee11634c66370de2f1741c3408065ff5fb52679d1424"} Oct 07 09:33:45 crc kubenswrapper[5025]: I1007 09:33:45.892226 5025 generic.go:334] "Generic (PLEG): container finished" podID="674d9eb6-c3f8-4855-9669-f3d4bc812936" containerID="233d6df086b1cdfe9bff56b942b855d1af9e15e142f84da614ff82d1dd3e5a3e" exitCode=0 Oct 07 09:33:45 crc kubenswrapper[5025]: I1007 09:33:45.892297 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jl8cm" event={"ID":"674d9eb6-c3f8-4855-9669-f3d4bc812936","Type":"ContainerDied","Data":"233d6df086b1cdfe9bff56b942b855d1af9e15e142f84da614ff82d1dd3e5a3e"} Oct 07 09:33:45 crc kubenswrapper[5025]: I1007 09:33:45.916023 5025 scope.go:117] "RemoveContainer" containerID="652ea836bd6bc45776b557dc80abd1f3393e92490a2ac86faa4fa940a2468c8e" Oct 07 09:33:45 crc kubenswrapper[5025]: E1007 09:33:45.916869 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2" Oct 07 09:33:46 crc kubenswrapper[5025]: I1007 09:33:46.903218 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jl8cm" event={"ID":"674d9eb6-c3f8-4855-9669-f3d4bc812936","Type":"ContainerStarted","Data":"00b0c823cfdfaf6977a76f6393103ae38cc4274ad7c6c439c826d9309b7db00f"} Oct 07 09:33:47 crc kubenswrapper[5025]: I1007 09:33:47.914662 5025 generic.go:334] "Generic (PLEG): container finished" podID="674d9eb6-c3f8-4855-9669-f3d4bc812936" containerID="00b0c823cfdfaf6977a76f6393103ae38cc4274ad7c6c439c826d9309b7db00f" exitCode=0 Oct 07 09:33:47 crc kubenswrapper[5025]: I1007 09:33:47.935710 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jl8cm" event={"ID":"674d9eb6-c3f8-4855-9669-f3d4bc812936","Type":"ContainerDied","Data":"00b0c823cfdfaf6977a76f6393103ae38cc4274ad7c6c439c826d9309b7db00f"} Oct 07 09:33:48 crc kubenswrapper[5025]: I1007 09:33:48.924234 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jl8cm" event={"ID":"674d9eb6-c3f8-4855-9669-f3d4bc812936","Type":"ContainerStarted","Data":"9ea037b9a78e0ef4db9afc082e8a16766035fadf818a03a6fcd095e29cf80e2e"} Oct 07 09:33:48 crc kubenswrapper[5025]: I1007 09:33:48.959495 5025 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jl8cm" podStartSLOduration=3.25777188 podStartE2EDuration="5.959464746s" podCreationTimestamp="2025-10-07 09:33:43 +0000 UTC" firstStartedPulling="2025-10-07 09:33:45.896128847 +0000 UTC m=+4632.705442991" lastFinishedPulling="2025-10-07 09:33:48.597821713 +0000 UTC m=+4635.407135857" observedRunningTime="2025-10-07 09:33:48.951613478 +0000 UTC m=+4635.760927622" 
watchObservedRunningTime="2025-10-07 09:33:48.959464746 +0000 UTC m=+4635.768778890"
Oct 07 09:33:54 crc kubenswrapper[5025]: I1007 09:33:54.007841 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jl8cm"
Oct 07 09:33:54 crc kubenswrapper[5025]: I1007 09:33:54.010728 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jl8cm"
Oct 07 09:33:54 crc kubenswrapper[5025]: I1007 09:33:54.068699 5025 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jl8cm"
Oct 07 09:33:55 crc kubenswrapper[5025]: I1007 09:33:55.040030 5025 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jl8cm"
Oct 07 09:33:55 crc kubenswrapper[5025]: I1007 09:33:55.109655 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jl8cm"]
Oct 07 09:33:56 crc kubenswrapper[5025]: I1007 09:33:56.997100 5025 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jl8cm" podUID="674d9eb6-c3f8-4855-9669-f3d4bc812936" containerName="registry-server" containerID="cri-o://9ea037b9a78e0ef4db9afc082e8a16766035fadf818a03a6fcd095e29cf80e2e" gracePeriod=2
Oct 07 09:33:57 crc kubenswrapper[5025]: I1007 09:33:57.662584 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jl8cm"
Oct 07 09:33:57 crc kubenswrapper[5025]: I1007 09:33:57.828268 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/674d9eb6-c3f8-4855-9669-f3d4bc812936-utilities\") pod \"674d9eb6-c3f8-4855-9669-f3d4bc812936\" (UID: \"674d9eb6-c3f8-4855-9669-f3d4bc812936\") "
Oct 07 09:33:57 crc kubenswrapper[5025]: I1007 09:33:57.828424 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctd8m\" (UniqueName: \"kubernetes.io/projected/674d9eb6-c3f8-4855-9669-f3d4bc812936-kube-api-access-ctd8m\") pod \"674d9eb6-c3f8-4855-9669-f3d4bc812936\" (UID: \"674d9eb6-c3f8-4855-9669-f3d4bc812936\") "
Oct 07 09:33:57 crc kubenswrapper[5025]: I1007 09:33:57.828502 5025 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/674d9eb6-c3f8-4855-9669-f3d4bc812936-catalog-content\") pod \"674d9eb6-c3f8-4855-9669-f3d4bc812936\" (UID: \"674d9eb6-c3f8-4855-9669-f3d4bc812936\") "
Oct 07 09:33:57 crc kubenswrapper[5025]: I1007 09:33:57.830281 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/674d9eb6-c3f8-4855-9669-f3d4bc812936-utilities" (OuterVolumeSpecName: "utilities") pod "674d9eb6-c3f8-4855-9669-f3d4bc812936" (UID: "674d9eb6-c3f8-4855-9669-f3d4bc812936"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 09:33:57 crc kubenswrapper[5025]: I1007 09:33:57.835907 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/674d9eb6-c3f8-4855-9669-f3d4bc812936-kube-api-access-ctd8m" (OuterVolumeSpecName: "kube-api-access-ctd8m") pod "674d9eb6-c3f8-4855-9669-f3d4bc812936" (UID: "674d9eb6-c3f8-4855-9669-f3d4bc812936"). InnerVolumeSpecName "kube-api-access-ctd8m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 09:33:57 crc kubenswrapper[5025]: I1007 09:33:57.886413 5025 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/674d9eb6-c3f8-4855-9669-f3d4bc812936-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "674d9eb6-c3f8-4855-9669-f3d4bc812936" (UID: "674d9eb6-c3f8-4855-9669-f3d4bc812936"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 09:33:57 crc kubenswrapper[5025]: I1007 09:33:57.930694 5025 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/674d9eb6-c3f8-4855-9669-f3d4bc812936-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 07 09:33:57 crc kubenswrapper[5025]: I1007 09:33:57.930728 5025 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/674d9eb6-c3f8-4855-9669-f3d4bc812936-utilities\") on node \"crc\" DevicePath \"\""
Oct 07 09:33:57 crc kubenswrapper[5025]: I1007 09:33:57.930752 5025 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctd8m\" (UniqueName: \"kubernetes.io/projected/674d9eb6-c3f8-4855-9669-f3d4bc812936-kube-api-access-ctd8m\") on node \"crc\" DevicePath \"\""
Oct 07 09:33:58 crc kubenswrapper[5025]: I1007 09:33:58.011798 5025 generic.go:334] "Generic (PLEG): container finished" podID="674d9eb6-c3f8-4855-9669-f3d4bc812936" containerID="9ea037b9a78e0ef4db9afc082e8a16766035fadf818a03a6fcd095e29cf80e2e" exitCode=0
Oct 07 09:33:58 crc kubenswrapper[5025]: I1007 09:33:58.011880 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jl8cm" event={"ID":"674d9eb6-c3f8-4855-9669-f3d4bc812936","Type":"ContainerDied","Data":"9ea037b9a78e0ef4db9afc082e8a16766035fadf818a03a6fcd095e29cf80e2e"}
Oct 07 09:33:58 crc kubenswrapper[5025]: I1007 09:33:58.011911 5025 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jl8cm"
Oct 07 09:33:58 crc kubenswrapper[5025]: I1007 09:33:58.011946 5025 scope.go:117] "RemoveContainer" containerID="9ea037b9a78e0ef4db9afc082e8a16766035fadf818a03a6fcd095e29cf80e2e"
Oct 07 09:33:58 crc kubenswrapper[5025]: I1007 09:33:58.011916 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jl8cm" event={"ID":"674d9eb6-c3f8-4855-9669-f3d4bc812936","Type":"ContainerDied","Data":"40bd0bcf55fccf3b3ff5ee11634c66370de2f1741c3408065ff5fb52679d1424"}
Oct 07 09:33:58 crc kubenswrapper[5025]: I1007 09:33:58.044007 5025 scope.go:117] "RemoveContainer" containerID="00b0c823cfdfaf6977a76f6393103ae38cc4274ad7c6c439c826d9309b7db00f"
Oct 07 09:33:58 crc kubenswrapper[5025]: I1007 09:33:58.048056 5025 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jl8cm"]
Oct 07 09:33:58 crc kubenswrapper[5025]: I1007 09:33:58.055320 5025 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jl8cm"]
Oct 07 09:33:58 crc kubenswrapper[5025]: I1007 09:33:58.076055 5025 scope.go:117] "RemoveContainer" containerID="233d6df086b1cdfe9bff56b942b855d1af9e15e142f84da614ff82d1dd3e5a3e"
Oct 07 09:33:58 crc kubenswrapper[5025]: I1007 09:33:58.102530 5025 scope.go:117] "RemoveContainer" containerID="9ea037b9a78e0ef4db9afc082e8a16766035fadf818a03a6fcd095e29cf80e2e"
Oct 07 09:33:58 crc kubenswrapper[5025]: E1007 09:33:58.103169 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ea037b9a78e0ef4db9afc082e8a16766035fadf818a03a6fcd095e29cf80e2e\": container with ID starting with 9ea037b9a78e0ef4db9afc082e8a16766035fadf818a03a6fcd095e29cf80e2e not found: ID does not exist" containerID="9ea037b9a78e0ef4db9afc082e8a16766035fadf818a03a6fcd095e29cf80e2e"
Oct 07 09:33:58 crc kubenswrapper[5025]: I1007 09:33:58.103215 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ea037b9a78e0ef4db9afc082e8a16766035fadf818a03a6fcd095e29cf80e2e"} err="failed to get container status \"9ea037b9a78e0ef4db9afc082e8a16766035fadf818a03a6fcd095e29cf80e2e\": rpc error: code = NotFound desc = could not find container \"9ea037b9a78e0ef4db9afc082e8a16766035fadf818a03a6fcd095e29cf80e2e\": container with ID starting with 9ea037b9a78e0ef4db9afc082e8a16766035fadf818a03a6fcd095e29cf80e2e not found: ID does not exist"
Oct 07 09:33:58 crc kubenswrapper[5025]: I1007 09:33:58.103254 5025 scope.go:117] "RemoveContainer" containerID="00b0c823cfdfaf6977a76f6393103ae38cc4274ad7c6c439c826d9309b7db00f"
Oct 07 09:33:58 crc kubenswrapper[5025]: E1007 09:33:58.103640 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00b0c823cfdfaf6977a76f6393103ae38cc4274ad7c6c439c826d9309b7db00f\": container with ID starting with 00b0c823cfdfaf6977a76f6393103ae38cc4274ad7c6c439c826d9309b7db00f not found: ID does not exist" containerID="00b0c823cfdfaf6977a76f6393103ae38cc4274ad7c6c439c826d9309b7db00f"
Oct 07 09:33:58 crc kubenswrapper[5025]: I1007 09:33:58.103695 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00b0c823cfdfaf6977a76f6393103ae38cc4274ad7c6c439c826d9309b7db00f"} err="failed to get container status \"00b0c823cfdfaf6977a76f6393103ae38cc4274ad7c6c439c826d9309b7db00f\": rpc error: code = NotFound desc = could not find container \"00b0c823cfdfaf6977a76f6393103ae38cc4274ad7c6c439c826d9309b7db00f\": container with ID starting with 00b0c823cfdfaf6977a76f6393103ae38cc4274ad7c6c439c826d9309b7db00f not found: ID does not exist"
Oct 07 09:33:58 crc kubenswrapper[5025]: I1007 09:33:58.103729 5025 scope.go:117] "RemoveContainer" containerID="233d6df086b1cdfe9bff56b942b855d1af9e15e142f84da614ff82d1dd3e5a3e"
Oct 07 09:33:58 crc kubenswrapper[5025]: E1007 09:33:58.104206 5025 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"233d6df086b1cdfe9bff56b942b855d1af9e15e142f84da614ff82d1dd3e5a3e\": container with ID starting with 233d6df086b1cdfe9bff56b942b855d1af9e15e142f84da614ff82d1dd3e5a3e not found: ID does not exist" containerID="233d6df086b1cdfe9bff56b942b855d1af9e15e142f84da614ff82d1dd3e5a3e"
Oct 07 09:33:58 crc kubenswrapper[5025]: I1007 09:33:58.104278 5025 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"233d6df086b1cdfe9bff56b942b855d1af9e15e142f84da614ff82d1dd3e5a3e"} err="failed to get container status \"233d6df086b1cdfe9bff56b942b855d1af9e15e142f84da614ff82d1dd3e5a3e\": rpc error: code = NotFound desc = could not find container \"233d6df086b1cdfe9bff56b942b855d1af9e15e142f84da614ff82d1dd3e5a3e\": container with ID starting with 233d6df086b1cdfe9bff56b942b855d1af9e15e142f84da614ff82d1dd3e5a3e not found: ID does not exist"
Oct 07 09:33:58 crc kubenswrapper[5025]: I1007 09:33:58.915608 5025 scope.go:117] "RemoveContainer" containerID="652ea836bd6bc45776b557dc80abd1f3393e92490a2ac86faa4fa940a2468c8e"
Oct 07 09:33:58 crc kubenswrapper[5025]: E1007 09:33:58.916267 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2"
Oct 07 09:33:59 crc kubenswrapper[5025]: I1007 09:33:59.931486 5025 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="674d9eb6-c3f8-4855-9669-f3d4bc812936" path="/var/lib/kubelet/pods/674d9eb6-c3f8-4855-9669-f3d4bc812936/volumes"
Oct 07 09:34:13 crc kubenswrapper[5025]: I1007 09:34:13.915233 5025 scope.go:117] "RemoveContainer" containerID="652ea836bd6bc45776b557dc80abd1f3393e92490a2ac86faa4fa940a2468c8e"
Oct 07 09:34:13 crc kubenswrapper[5025]: E1007 09:34:13.916063 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2"
Oct 07 09:34:26 crc kubenswrapper[5025]: I1007 09:34:26.915230 5025 scope.go:117] "RemoveContainer" containerID="652ea836bd6bc45776b557dc80abd1f3393e92490a2ac86faa4fa940a2468c8e"
Oct 07 09:34:26 crc kubenswrapper[5025]: E1007 09:34:26.916531 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2"
Oct 07 09:34:39 crc kubenswrapper[5025]: I1007 09:34:39.915080 5025 scope.go:117] "RemoveContainer" containerID="652ea836bd6bc45776b557dc80abd1f3393e92490a2ac86faa4fa940a2468c8e"
Oct 07 09:34:39 crc kubenswrapper[5025]: E1007 09:34:39.916249 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2"
Oct 07 09:34:51 crc kubenswrapper[5025]: I1007 09:34:51.914598 5025 scope.go:117] "RemoveContainer" containerID="652ea836bd6bc45776b557dc80abd1f3393e92490a2ac86faa4fa940a2468c8e"
Oct 07 09:34:51 crc kubenswrapper[5025]: E1007 09:34:51.915379 5025 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2dj2t_openshift-machine-config-operator(b4849c41-22e1-400e-8e11-096da49ef1b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-2dj2t" podUID="b4849c41-22e1-400e-8e11-096da49ef1b2"
Oct 07 09:34:56 crc kubenswrapper[5025]: I1007 09:34:56.092739 5025 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zpdvd"]
Oct 07 09:34:56 crc kubenswrapper[5025]: E1007 09:34:56.093794 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="674d9eb6-c3f8-4855-9669-f3d4bc812936" containerName="extract-content"
Oct 07 09:34:56 crc kubenswrapper[5025]: I1007 09:34:56.093839 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="674d9eb6-c3f8-4855-9669-f3d4bc812936" containerName="extract-content"
Oct 07 09:34:56 crc kubenswrapper[5025]: E1007 09:34:56.093885 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="674d9eb6-c3f8-4855-9669-f3d4bc812936" containerName="registry-server"
Oct 07 09:34:56 crc kubenswrapper[5025]: I1007 09:34:56.093895 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="674d9eb6-c3f8-4855-9669-f3d4bc812936" containerName="registry-server"
Oct 07 09:34:56 crc kubenswrapper[5025]: E1007 09:34:56.093921 5025 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="674d9eb6-c3f8-4855-9669-f3d4bc812936" containerName="extract-utilities"
Oct 07 09:34:56 crc kubenswrapper[5025]: I1007 09:34:56.093932 5025 state_mem.go:107] "Deleted CPUSet assignment" podUID="674d9eb6-c3f8-4855-9669-f3d4bc812936" containerName="extract-utilities"
Oct 07 09:34:56 crc kubenswrapper[5025]: I1007 09:34:56.094138 5025 memory_manager.go:354] "RemoveStaleState removing state" podUID="674d9eb6-c3f8-4855-9669-f3d4bc812936" containerName="registry-server"
Oct 07 09:34:56 crc kubenswrapper[5025]: I1007 09:34:56.095293 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zpdvd"
Oct 07 09:34:56 crc kubenswrapper[5025]: I1007 09:34:56.110328 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zpdvd"]
Oct 07 09:34:56 crc kubenswrapper[5025]: I1007 09:34:56.230287 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdffac2f-ac9f-49b1-9b6b-0085ffa3e46b-catalog-content\") pod \"redhat-marketplace-zpdvd\" (UID: \"bdffac2f-ac9f-49b1-9b6b-0085ffa3e46b\") " pod="openshift-marketplace/redhat-marketplace-zpdvd"
Oct 07 09:34:56 crc kubenswrapper[5025]: I1007 09:34:56.230337 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6glp7\" (UniqueName: \"kubernetes.io/projected/bdffac2f-ac9f-49b1-9b6b-0085ffa3e46b-kube-api-access-6glp7\") pod \"redhat-marketplace-zpdvd\" (UID: \"bdffac2f-ac9f-49b1-9b6b-0085ffa3e46b\") " pod="openshift-marketplace/redhat-marketplace-zpdvd"
Oct 07 09:34:56 crc kubenswrapper[5025]: I1007 09:34:56.230635 5025 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdffac2f-ac9f-49b1-9b6b-0085ffa3e46b-utilities\") pod \"redhat-marketplace-zpdvd\" (UID: \"bdffac2f-ac9f-49b1-9b6b-0085ffa3e46b\") " pod="openshift-marketplace/redhat-marketplace-zpdvd"
Oct 07 09:34:56 crc kubenswrapper[5025]: I1007 09:34:56.332344 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdffac2f-ac9f-49b1-9b6b-0085ffa3e46b-utilities\") pod \"redhat-marketplace-zpdvd\" (UID: \"bdffac2f-ac9f-49b1-9b6b-0085ffa3e46b\") " pod="openshift-marketplace/redhat-marketplace-zpdvd"
Oct 07 09:34:56 crc kubenswrapper[5025]: I1007 09:34:56.332873 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdffac2f-ac9f-49b1-9b6b-0085ffa3e46b-catalog-content\") pod \"redhat-marketplace-zpdvd\" (UID: \"bdffac2f-ac9f-49b1-9b6b-0085ffa3e46b\") " pod="openshift-marketplace/redhat-marketplace-zpdvd"
Oct 07 09:34:56 crc kubenswrapper[5025]: I1007 09:34:56.332896 5025 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6glp7\" (UniqueName: \"kubernetes.io/projected/bdffac2f-ac9f-49b1-9b6b-0085ffa3e46b-kube-api-access-6glp7\") pod \"redhat-marketplace-zpdvd\" (UID: \"bdffac2f-ac9f-49b1-9b6b-0085ffa3e46b\") " pod="openshift-marketplace/redhat-marketplace-zpdvd"
Oct 07 09:34:56 crc kubenswrapper[5025]: I1007 09:34:56.332940 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdffac2f-ac9f-49b1-9b6b-0085ffa3e46b-utilities\") pod \"redhat-marketplace-zpdvd\" (UID: \"bdffac2f-ac9f-49b1-9b6b-0085ffa3e46b\") " pod="openshift-marketplace/redhat-marketplace-zpdvd"
Oct 07 09:34:56 crc kubenswrapper[5025]: I1007 09:34:56.333240 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdffac2f-ac9f-49b1-9b6b-0085ffa3e46b-catalog-content\") pod \"redhat-marketplace-zpdvd\" (UID: \"bdffac2f-ac9f-49b1-9b6b-0085ffa3e46b\") " pod="openshift-marketplace/redhat-marketplace-zpdvd"
Oct 07 09:34:56 crc kubenswrapper[5025]: I1007 09:34:56.361327 5025 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6glp7\" (UniqueName: \"kubernetes.io/projected/bdffac2f-ac9f-49b1-9b6b-0085ffa3e46b-kube-api-access-6glp7\") pod \"redhat-marketplace-zpdvd\" (UID: \"bdffac2f-ac9f-49b1-9b6b-0085ffa3e46b\") " pod="openshift-marketplace/redhat-marketplace-zpdvd"
Oct 07 09:34:56 crc kubenswrapper[5025]: I1007 09:34:56.421737 5025 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zpdvd"
Oct 07 09:34:56 crc kubenswrapper[5025]: I1007 09:34:56.947563 5025 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zpdvd"]
Oct 07 09:34:57 crc kubenswrapper[5025]: I1007 09:34:57.616532 5025 generic.go:334] "Generic (PLEG): container finished" podID="bdffac2f-ac9f-49b1-9b6b-0085ffa3e46b" containerID="527bb82642fa818113da504d0f62af2ad276f2c58b2505df5aa73fdea6e0f3ee" exitCode=0
Oct 07 09:34:57 crc kubenswrapper[5025]: I1007 09:34:57.616648 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zpdvd" event={"ID":"bdffac2f-ac9f-49b1-9b6b-0085ffa3e46b","Type":"ContainerDied","Data":"527bb82642fa818113da504d0f62af2ad276f2c58b2505df5aa73fdea6e0f3ee"}
Oct 07 09:34:57 crc kubenswrapper[5025]: I1007 09:34:57.617069 5025 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zpdvd" event={"ID":"bdffac2f-ac9f-49b1-9b6b-0085ffa3e46b","Type":"ContainerStarted","Data":"24f15c24dea87896efe937729e824a00c3b53be299136df717da39dc5572bd5d"}
Oct 07 09:34:57 crc kubenswrapper[5025]: I1007 09:34:57.620281 5025 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider